InternLM / InternLM-XComposer

InternLM-XComposer-2.5: A Versatile Large Vision Language Model Supporting Long-Contextual Input and Output

How can I train a LoRA based on InternLM-XComposer2? #153

Closed bingwork closed 5 months ago

bingwork commented 5 months ago

Can I directly use https://github.com/InternLM/InternLM-XComposer/blob/main/InternLM-XComposer-1.0/finetune/finetune.py? Thanks! @myownskyW7 @LightDXY @eltociear @yhcao6 @vansin

yuhangzang commented 5 months ago

Our latest commit a377bd8 provides the finetuning code for XComposer2. Please refer to https://github.com/InternLM/InternLM-XComposer/tree/main/finetune

bingwork commented 5 months ago

@yuhangzang, thank you very much for the prompt reply and for adding the pull request so quickly!

Upon examining the lora_target_modules, I noticed that some keys, such as 'mlp.up_proj', 'mlp.down_proj', 'mlp.gate_proj', and 'self_attn.o_proj', are no longer present in the model InternLMXComposer2ForCausalLM.

As I intend to fine-tune Plora_A and Plora_B, I would greatly appreciate any alternative suggestions you might have.
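For reference, here is a minimal sketch of what I have in mind: print the parameter names to confirm which keys actually exist, then freeze the whole model and re-enable gradients only for the Plora_A / Plora_B projections. The checkpoint id and the exact "Plora_A" / "Plora_B" substrings are my own assumptions, not taken from the official finetune code:

```python
# Sketch only (not the official finetune.py recipe): inspect parameter names,
# then fine-tune just the Plora_A / Plora_B projections.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "internlm/internlm-xcomposer2-7b",   # assumed checkpoint id; adjust to the actual model path
    trust_remote_code=True,
    torch_dtype=torch.float16,
)

# 1) Print parameter names to see which projections the architecture really has
#    (this is where 'mlp.up_proj', 'self_attn.o_proj', etc. would or would not appear).
for name, _ in model.named_parameters():
    print(name)

# 2) Freeze everything, then re-enable gradients only for parameters whose names
#    contain Plora_A / Plora_B (substring match; verify against the printout above).
for name, param in model.named_parameters():
    param.requires_grad = any(key in name for key in ("Plora_A", "Plora_B"))

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable parameters: {trainable} / {total}")
```

If there is an official way to express this through lora_target_modules instead, that would of course be preferable.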

yuhangzang commented 5 months ago

Please try the recent commit 3c522e7 and see if the issue is fixed.

myownskyW7 commented 5 months ago

@bingwork Please check the updated finetuning code.