mbzuai-nlp / bactrian-x

A Multilingual Replicable Instruction-Following Model

LLaMA 2 support #2

Open giyaseddin opened 1 year ago

giyaseddin commented 1 year ago

Hey team, thank you for the great work on Bactrian X.

Do the trained LoRA weights shared in this work support the second version of LLaMA (Llama 2)?

If not, how much compute would be required to retrain them on the new model variants?
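For context, attaching a released LoRA adapter to a base checkpoint with `peft` looks roughly like the sketch below. The model IDs are assumptions for illustration (not confirmed release names), and whether an adapter trained on LLaMA-1 behaves sensibly on Llama-2 weights is exactly the open question here:

```python
# Minimal sketch: load a LoRA adapter onto a base model with peft.
# Both IDs below are assumed for illustration.
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")
# Shapes match across LLaMA-1/2 7B, so this loads; quality is untested.
model = PeftModel.from_pretrained(base, "MBZUAI/bactrian-x-llama-7b-lora")
```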

haonan-li commented 1 year ago

Hi, sorry for the late reply. We currently have no plans to train on Llama-2; our next step is to improve the data quality by removing low-quality data.

Training a single LoRA adapter for a 7B model (one language) can be done on a single 40GB A100 within 12 hours.
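For anyone who wants to attempt that retraining, a minimal sketch of the LoRA setup with `transformers` and `peft` is below. The base model ID, rank, and target modules are illustrative assumptions, not the exact Bactrian-X recipe; only the adapter weights are trained, which is why a single 40GB A100 suffices:

```python
# Minimal LoRA fine-tuning setup sketch (hyperparameters are assumptions).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = "meta-llama/Llama-2-7b-hf"  # assumed base; any LLaMA variant works
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(
    base, torch_dtype=torch.float16, device_map="auto"
)

# Wrap the frozen base model with trainable low-rank adapters on the
# attention projections (common alpaca-lora-style settings, assumed here).
config = LoraConfig(
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, config)
model.print_trainable_parameters()  # only adapter params require grad

# From here, train with a standard transformers.Trainer on your
# instruction data, then save just the adapter via model.save_pretrained().
```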