Open Lillianwei-h opened 1 month ago
We appreciate you bringing this issue to our attention. We will conduct a thorough investigation and provide an update as soon as possible. Should we identify a bug, we will implement the necessary code changes. Thank you for your continued support.
Checklist
Describe the bug
Issue
Due to the use of PEFT, the key names of the saved weights after LoRA training are inconsistent with the original ones: `language.model` becomes `language.base_model.model`.
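For context, here is a minimal standalone sketch (a toy module, not this repository's model; names are illustrative) of how PEFT's LoRA wrapper changes the state-dict key names:

```python
# Illustrative toy example only: shows how wrapping a module with PEFT/LoRA
# adds a `base_model.model` prefix to the state-dict keys.
import torch.nn as nn
from peft import LoraConfig, get_peft_model

base = nn.Sequential(nn.Linear(8, 8))
print(list(base.state_dict().keys()))
# ['0.weight', '0.bias']

lora_model = get_peft_model(base, LoraConfig(target_modules=["0"]))
print(list(lora_model.state_dict().keys())[:3])
# Keys are now prefixed with 'base_model.model.', which is the mismatch
# described above (the exact suffixes vary with the PEFT version).
```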
Fix

Before saving the weights at the end of training, I use `model.language_model = model.language_model.merge_and_unload()` and everything looks fine. I hope you can add this in future updates~
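A sketch of how this fix could look in a save step, assuming `model.language_model` is the PEFT-wrapped submodule and the surrounding model exposes a Hugging Face style `save_pretrained` (both are assumptions based on this issue, not verified against the repository's actual API):

```python
# Sketch of the proposed fix; `model`, `language_model`, and `save_pretrained`
# are assumptions taken from this issue, not the repo's confirmed API.
from peft import PeftModel

def save_with_original_keys(model, save_dir: str) -> None:
    # Fold the LoRA adapter weights back into the base language model so the
    # saved state dict uses the original key names (no `base_model.model` prefix).
    if isinstance(model.language_model, PeftModel):
        model.language_model = model.language_model.merge_and_unload()
    model.save_pretrained(save_dir)
```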
Reproduction

Already fixed
Environment
Error traceback
No response