OSU-NLP-Group / LLM4Chem

Official code repo for the paper "LlaSMol: Advancing Large Language Models for Chemistry with a Large-Scale, Comprehensive, High-Quality Instruction Tuning Dataset"
https://osu-nlp-group.github.io/LLM4Chem/
MIT License

Fine tune LlaSMol-Mistral-7B #3

Closed alinzh closed 2 months ago

alinzh commented 3 months ago

I tried to save LlaSMol-Mistral-7B so I can fine-tune it on my own dataset later, but I can't figure out how to do it correctly. I tried:

from generation import LlaSMolGeneration

generator = LlaSMolGeneration('osunlp/LlaSMol-Mistral-7B', device='cuda')
generator.model.save_pretrained("./saved_model")
generator.tokenizer.save_pretrained("./saved_model")

but when I load the model again using model = AutoModelForCausalLM.from_pretrained("./saved_model", device_map='cuda'), an error occurs in torch: it cannot read the bin file correctly.
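For reference, here is the full reload snippet I ran (a minimal sketch; the path is just the directory I saved to above):

from transformers import AutoModelForCausalLM, AutoTokenizer

# Reload the checkpoint saved above; this is where the torch error about the bin file occurs
model = AutoModelForCausalLM.from_pretrained("./saved_model", device_map='cuda')
tokenizer = AutoTokenizer.from_pretrained("./saved_model")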

Okay, so I decided to skip saving and go straight to tuning, but then an error occurs about the 'meta device': Cannot copy out of meta tensor; no data!

Please give advice on the best way to save your model for further tuning.

btyu commented 3 months ago

Hi! Thanks for your interest in our model.

The model was trained with LoRA, and osunlp/LlaSMol-Mistral-7B is the LoRA adapter parameters.

If you would like to fully tune the model, you can call the load_tokenizer_and_model function here, with model_name set to osunlp/LlaSMol-Mistral-7B. This way, you will get the model with the LoRA adapter merged into the base Mistral model. Your attempt is equivalent to this and should have worked, so could you please provide more information about the error you got?
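Roughly, the full-tuning path looks like this (a sketch only; the exact signature and return order of load_tokenizer_and_model may differ, so please check the function definition in the repo):

from generation import load_tokenizer_and_model

# Assumed signature and return order -- verify against the repo's generation code.
tokenizer, model = load_tokenizer_and_model('osunlp/LlaSMol-Mistral-7B')

# The returned model has the LoRA adapter merged into the Mistral base,
# so it can be saved and fine-tuned like an ordinary causal LM.
model.save_pretrained('./saved_model')
tokenizer.save_pretrained('./saved_model')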

Instead, if you would like to tune only the LoRA adapter, you still need to use the load_tokenizer_and_model function, but remove this line before using it. This way, you will get a model with the LoRA adapter loaded onto the base model but not merged into it. Fine-tuning this model will only update the LoRA adapter parameters.
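For illustration, loading the adapter without merging can also be done directly with the peft library. This is just a sketch of the same idea, not the repo's exact code, and the base model name is an assumption:

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Assumed base checkpoint; use whichever Mistral base the repo specifies.
base = AutoModelForCausalLM.from_pretrained('mistralai/Mistral-7B-v0.1', device_map='auto')

# Attach the LoRA adapter without merging; is_trainable=True keeps it tunable.
model = PeftModel.from_pretrained(base, 'osunlp/LlaSMol-Mistral-7B', is_trainable=True)

# Assumes the tokenizer files ship with the adapter repo; otherwise load from the base model.
tokenizer = AutoTokenizer.from_pretrained('osunlp/LlaSMol-Mistral-7B')

# Only the LoRA adapter parameters are trainable; the base weights stay frozen.
model.print_trainable_parameters()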

Please feel free to reach out if you have any further questions :)

btyu commented 2 months ago

Closing this issue due to no further update. Please feel free to reopen it if needed :)