InternLM / InternLM-XComposer

InternLM-XComposer-2.5: A Versatile Large Vision Language Model Supporting Long-Contextual Input and Output
Apache License 2.0

How do I load LoRA fine-tuned model weights in finetune.py for a second round of fine-tuning? #279

Open dle666 opened 5 months ago

dle666 commented 5 months ago

The existing model-loading code throws an error saying the config file cannot be found. Can I simply use AutoModel instead, as in the screenshot below? (screenshot attached)

yuhangzang commented 5 months ago

"The existing model-loading code throws an error saying the config file cannot be found" -> Can you provide the corresponding error log? Thanks.

dle666 commented 5 months ago

"The existing model-loading code throws an error saying the config file cannot be found" -> Can you provide the corresponding error log? Thanks.

(screenshot of the error log attached)

yuhangzang commented 5 months ago

Thanks for your feedback! Do you use the AutoPeftModelForCausalLM class here to load the model?
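
For reference, a minimal sketch of loading a LoRA checkpoint saved by finetune.py with AutoPeftModelForCausalLM; the adapter path and base-model name below are placeholders, not values from this thread:

import torch
from transformers import AutoTokenizer
from peft import AutoPeftModelForCausalLM

# Directory that contains adapter_config.json and the adapter weights
adapter_path = './output/lora_checkpoint'

# AutoPeftModelForCausalLM reads adapter_config.json, loads the base model it
# points to, and attaches the LoRA weights on top of it.
model = AutoPeftModelForCausalLM.from_pretrained(
    adapter_path,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)

# The tokenizer still comes from the base model repo.
tokenizer = AutoTokenizer.from_pretrained(
    'internlm/internlm-xcomposer2-4khd-7b', trust_remote_code=True)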

Starfulllll commented 4 months ago

Thanks for your feedback! Do you use the AutoPeftModelForCausalLM class here to load the model?

Hello, and thank you for your work! After loading the model with AutoPeftModelForCausalLM and continuing training with the LoRA setup code from finetune.py, I get the error below. How can I resolve it? I have confirmed that I set model.tokenizer, but it does not seem to take effect:

    to_regress_embeds, attention_mask, targets, im_mask = self.interleav_wrap(
    File "/root/.cache/huggingface/modules/transformers_modules/xcomposer2-4khd/modeling_internlm_xcomposer2.py", line 226, in interleav_wrap
    part_tokens = self.tokenizer(
    TypeError: 'NoneType' object is not callable
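
The traceback shows that self.tokenizer inside interleav_wrap() is still None, which suggests the assignment did not reach the wrapped model. A possible, unverified explanation: with a PEFT wrapper, model.tokenizer = tokenizer sets the attribute on the PeftModel wrapper only (attribute reads fall through to the base model, but writes do not), so attaching the tokenizer to the unwrapped base model may be needed. A sketch of that workaround:

# Hypothetical fix: set the tokenizer on the underlying base model that
# interleav_wrap() actually runs on, not on the PeftModel wrapper.
base_model = model.get_base_model()   # unwrap the PeftModel
base_model.tokenizer = tokenizer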

WeiminLee commented 3 months ago

Thanks for your feedback! Do you use the AutoPeftModelForCausalLM class here to load the model?

Hello, and thank you for your work! After loading the model with AutoPeftModelForCausalLM and continuing training with the LoRA setup code from finetune.py, I get the error below. How can I resolve it? I have confirmed that I set model.tokenizer, but it does not seem to take effect:

    to_regress_embeds, attention_mask, targets, im_mask = self.interleav_wrap(
    File "/root/.cache/huggingface/modules/transformers_modules/xcomposer2-4khd/modeling_internlm_xcomposer2.py", line 226, in interleav_wrap
    part_tokens = self.tokenizer(
    TypeError: 'NoneType' object is not callable

The code used to continue training is as follows:

# Start trainer

trainer = Trainer(
    model=model, tokenizer=tokenizer, args=training_args, **data_module)

trainer.train(resume_from_checkpoint=True)
trainer.save_state()
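
If the goal after this second round of training is a standalone checkpoint (rather than stacking another adapter on top of an adapter), peft's merge_and_unload() can fold the LoRA weights back into the base model before saving. A sketch, assuming model is the PeftModel returned by AutoPeftModelForCausalLM and the output path is a placeholder:

# Bake the LoRA deltas into the base weights and save a plain checkpoint.
merged_model = model.merge_and_unload()
merged_model.save_pretrained('./output/merged')   # placeholder path
tokenizer.save_pretrained('./output/merged')
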
LinaZhangCoding commented 2 months ago

Thanks for your feedback! Do you use the AutoPeftModelForCausalLM class here to load the model?

Hello, and thank you for your work! After loading the model with AutoPeftModelForCausalLM and continuing training with the LoRA setup code from finetune.py, I get the error below. How can I resolve it? I have confirmed that I set model.tokenizer, but it does not seem to take effect:

    to_regress_embeds, attention_mask, targets, im_mask = self.interleav_wrap(
    File "/root/.cache/huggingface/modules/transformers_modules/xcomposer2-4khd/modeling_internlm_xcomposer2.py", line 226, in interleav_wrap
    part_tokens = self.tokenizer(
    TypeError: 'NoneType' object is not callable

I get the same error. I just load the model and want to run inference, not training or fine-tuning.
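
For the inference-only case, a rough sketch that combines the loading and tokenizer workaround above with the chat() helper shown on the InternLM-XComposer2 model card; the adapter path and image path are placeholders, and this is untested with a LoRA-wrapped model:

import torch
from transformers import AutoTokenizer
from peft import AutoPeftModelForCausalLM

model = AutoPeftModelForCausalLM.from_pretrained(
    './output/lora_checkpoint',                  # placeholder adapter path
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
).cuda().eval()
tokenizer = AutoTokenizer.from_pretrained(
    'internlm/internlm-xcomposer2-4khd-7b', trust_remote_code=True)

# Same workaround as above: make sure the unwrapped base model sees the tokenizer.
model.get_base_model().tokenizer = tokenizer

with torch.no_grad():
    # chat() arguments roughly follow the model card; '<ImageHere>' marks where
    # the image is inserted into the prompt, and 'example.jpg' is a placeholder.
    response, _ = model.chat(tokenizer, query='<ImageHere>Describe this image.',
                             image='example.jpg', history=[], do_sample=False)
print(response)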