OpenLLM-France / Lit-Claire

Continual pretraining of foundation LLM using ⚡ Lightning Fabric
GNU Affero General Public License v3.0

Question: is it compatible with quantized model? #4

Open cfrancois7 opened 8 months ago

cfrancois7 commented 8 months ago

Hi,

We're running a project at the CivicLab in Grenoble, France. The project is based on LLMs and is being built as an open-source project. We're trying to build, test, prototype, and evaluate an assistant for public consultation.

We're looking for a CPT (continual pretraining) framework, and also for opportunities to form new partnerships or relationships for advice.

At this stage I only have one GPU with 8 GB of RAM, so I'm using a quantized Mistral-7B-based model. Soon I'll get access to an A5-family GPU (around 24 GB) thanks to a contributor.
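As a rough sanity check of why quantization matters at these GPU sizes, here is a small back-of-the-envelope sketch (my own illustration, not part of the repository) of the memory needed just to hold the weights of a 7B-parameter model at different precisions; activations, optimizer state, and KV cache come on top of this:

```python
def weight_memory_gb(n_params: float, bits_per_param: float) -> float:
    """Memory (in GB) to store n_params weights at the given precision."""
    return n_params * bits_per_param / 8 / 1e9

n = 7e9  # Mistral 7B
for name, bits in [("fp16", 16), ("int8", 8), ("4-bit", 4)]:
    print(f"{name}: {weight_memory_gb(n, bits):.1f} GB")
# fp16: 14.0 GB, int8: 7.0 GB, 4-bit: 3.5 GB
```

So a 7B model only fits on an 8 GB card in 4-bit (or similar) quantized form, while a 24 GB card can hold the fp16 weights, though full fine-tuning would still need more for optimizer state.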

So my question is: is your framework compatible with quantized models and QLoRA? Is it model agnostic?

Also, we're interested in new contributors to help us make this prototype come true. It would be a pleasure to exchange about the project with the OpenLLM-France community.

Jeronymous commented 8 months ago

Hello @cfrancois7, and welcome here.

> Is your framework compatible with quantized models and QLoRA? Is it model agnostic?

All Claire models trained using this repository can be found in several quantized forms.

I don't think QLoRA is supported. But we should check whether it is supported in lit-gpt now.
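For reference, outside this repository a typical QLoRA setup with the Hugging Face `transformers`/`peft`/`bitsandbytes` stack looks roughly like the sketch below. This is only an illustrative configuration fragment (it needs a CUDA GPU and the listed packages installed, and the `target_modules` names are the usual ones for Mistral-style architectures, not something taken from this repo):

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# Load the base model with 4-bit NF4 quantization (the "Q" in QLoRA).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1",
    quantization_config=bnb_config,
    device_map="auto",
)

# Prepare the quantized model, then attach trainable LoRA adapters
# on top of the frozen 4-bit weights.
model = prepare_model_for_kbit_training(model)
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

Only the small LoRA adapter weights are trained, which is what makes fine-tuning a 7B model feasible on a single 24 GB (and sometimes even 8 GB) GPU.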

Thanks for your interest. If you haven't done so already, I encourage you to join OpenLLM France on Discord: https://discord.gg/tZf7BR4dY7