Open · Egber1t opened this issue 8 months ago
The LoRA model merge code in the README is:

```python
from peft import AutoPeftModelForCausalLM

model = AutoPeftModelForCausalLM.from_pretrained(
    path_to_adapter,  # path to the output directory
    device_map="auto",
    trust_remote_code=True
).eval()
merged_model = model.merge_and_unload()
```

It never mentions `pretrained_model_name_or_path` anywhere, does it?
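For context (my reading of the PEFT API, not something the README states explicitly): `path_to_adapter` in that snippet is just a variable holding the adapter directory, and it is passed positionally, so PEFT binds it to the `pretrained_model_name_or_path` parameter of `from_pretrained`. A minimal sketch, reusing the adapter path from the reproduction steps below:

```python
from peft import AutoPeftModelForCausalLM

# Plain variable; passing it positionally is equivalent to
# pretrained_model_name_or_path=path_to_adapter.
path_to_adapter = "/root/autodl-tmp/Qwen-VL/output_qwen"

model = AutoPeftModelForCausalLM.from_pretrained(
    path_to_adapter,        # bound to pretrained_model_name_or_path
    device_map="auto",
    trust_remote_code=True,
).eval()
merged_model = model.merge_and_unload()
```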
是否已有关于该错误的issue或讨论? | Is there an existing issue / discussion for this?
该问题是否在FAQ中有解答? | Is there an existing answer for this in FAQ?
当前行为 | Current Behavior
After fine-tuning succeeds, both eval and merging report that `pretrained_model_name_or_path` is missing.
期望行为 | Expected Behavior
No response
复现方法 | Steps To Reproduce
```python
from peft import AutoPeftModelForCausalLM

model = AutoPeftModelForCausalLM.from_pretrained(
    pretrained_model_name_or_path='/root/autodl-tmp/Qwen-VL-Chat',
    path_to_adapter='/root/autodl-tmp/Qwen-VL/output_qwen',  # path to the output directory
    device_map="auto",
    trust_remote_code=True
).eval()
```
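A hedged sketch of what was presumably intended: PEFT's `from_pretrained` takes the adapter directory as its first positional argument (`pretrained_model_name_or_path`) and has no `path_to_adapter` keyword; the base model is resolved from `base_model_name_or_path` in the adapter's `adapter_config.json`. The merged output directory below is a hypothetical example path:

```python
from peft import AutoPeftModelForCausalLM

# Load the adapter directory directly; PEFT reads adapter_config.json there
# and pulls in the referenced base model (Qwen-VL-Chat) before attaching LoRA.
model = AutoPeftModelForCausalLM.from_pretrained(
    "/root/autodl-tmp/Qwen-VL/output_qwen",  # adapter output dir from this repro
    device_map="auto",
    trust_remote_code=True,
).eval()

# Fold the LoRA weights into the base model, then save a standalone checkpoint.
merged_model = model.merge_and_unload()
merged_model.save_pretrained(
    "/root/autodl-tmp/qwen-vl-merged",  # hypothetical output path
    safe_serialization=True,
)
```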
运行环境 | Environment
备注 | Anything else?
No response