You can use the following code to save the LoRA checkpoint as a standard model checkpoint, then load it normally with the existing code:
from peft import AutoPeftModelForCausalLM
model = AutoPeftModelForCausalLM.from_pretrained(
    'xxxx/checkpoint-1200',  # path to the LoRA checkpoint directory
    device_map="auto",
    trust_remote_code=True
).eval()
print("load success")
merged_model = model.merge_and_unload()
merged_model.save_pretrained('xxxx/SeeClick-xxxx', max_shard_size="2048MB", safe_serialization=True)
print("save success")
It worked, thank you!
For a model saved from LoRA finetuning, how should I set things up to continue training it?
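For reference, one common PEFT pattern for this (a minimal sketch, not an answer confirmed in this thread; the base-model path is hypothetical) is to reload the base model and re-attach the saved adapter with is_trainable=True, so the LoRA weights stay unfrozen and training can resume:

from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load the original base model (hypothetical path).
base = AutoModelForCausalLM.from_pretrained(
    'xxxx/base-model',
    device_map="auto",
    trust_remote_code=True,
)
# Re-attach the saved LoRA adapter; is_trainable=True keeps the
# adapter weights unfrozen (by default they are loaded frozen for inference).
model = PeftModel.from_pretrained(
    base,
    'xxxx/checkpoint-1200',  # the LoRA checkpoint directory from training
    is_trainable=True,
)
# model can now be passed to the existing finetuning script / Trainer.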