Closed pckennethma closed 1 year ago
Change the part where the model is saved so that only the LoRA part is saved. You can see the peft code: https://github.com/huggingface/peft/blob/deff03f2c2/src/peft/utils/save_and_load.py
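For context, the linked peft code essentially filters the full model state dict down to only the LoRA parameters before saving, which is why the resulting adapter file is small by design. A rough, hedged sketch of that filtering idea (the key names here are made up for illustration; peft's actual `get_peft_model_state_dict` handles more cases):

```python
def lora_only_state_dict(state_dict):
    """Keep only parameters whose names contain 'lora_'.

    Rough illustration of what peft's save path does: the base model
    weights are dropped, and only the small LoRA matrices are kept.
    """
    return {k: v for k, v in state_dict.items() if "lora_" in k}


# Hypothetical state dict: base weights plus the injected LoRA A/B matrices.
full = {
    "transformer.h.0.self_attention.query.weight": "big tensor",
    "transformer.h.0.self_attention.query.lora_A.weight": "small tensor",
    "transformer.h.0.self_attention.query.lora_B.weight": "small tensor",
}

adapter = lora_only_state_dict(full)
# Only the lora_A / lora_B entries remain, so the saved adapter is tiny
# compared with the full model checkpoint.
```

So a small `adapter_model.bin` (tens of KBs for a low-rank adapter) is expected behavior, not necessarily a bug.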
Thanks for your response!
Hello,
Thanks for creating the nice repository! May I ask the purpose of lines 246-251 of `deepspeed_finetune_lora.py`? I encounter an issue with `save_pretrained` producing a surprisingly small `adapter.bin` (tens of KBs) in my own codebase when using PEFT to finetune Bloom with LoRA. Could you kindly explain a bit about the code snippet? Thanks!