Closed Rain-yj closed 2 months ago
Hello! I'm facing a similar issue: the adapter can't be used on the base model after training, and the error always shows size-mismatch information from PeftModelForCausalLM. I am using LoftQ/Llama-2-7b-hf-4bit-64rank. Is there any solution to this? Thank you!
![WeChat screenshot 20240424095327](https://github.com/yxli2123/LoftQ/assets/82367100/dbb55a66-4a7a-4356-ad1b-a3e430969022)