InternLM / xtuner

An efficient, flexible and full-featured toolkit for fine-tuning LLM (InternLM2, Llama3, Phi3, Qwen, Mistral, ...)
https://xtuner.readthedocs.io/zh-cn/latest/
Apache License 2.0

Convert .pth of llava-phi3-lora failed #745

Open xsx1001 opened 3 months ago

xsx1001 commented 3 months ago

I fine-tuned the llava-phi3 model with LoRA, but when I tried to convert the resulting weights, an error occurred.

This is my command:

xtuner convert pth_to_hf ./my_configs/llava_phi3_mini_qlora_clip_vit_large_p14_336_lora_e1_gpu8_finetune_copy.py /mnt/xushixiong/outputs/xtuner/work_dirs/llava_phi3_mini_qlora_clip_vit_large_p14_336_lora_e1_gpu8_finetune_copy/iter_17920.pth/mp_rank_00_model_states.pt ./outputs/sf_0604_xtuner

The error looks like this (screenshot attached). It seems that converting llava-phi3-lora is not supported? What can I do?

hhaAndroid commented 3 months ago

@xsx1001

xtuner convert pth_to_hf ./my_configs/llava_phi3_mini_qlora_clip_vit_large_p14_336_lora_e1_gpu8_finetune_copy.py /mnt/xushixiong/outputs/xtuner/work_dirs/llava_phi3_mini_qlora_clip_vit_large_p14_336_lora_e1_gpu8_finetune_copy/iter_17920.pth ./outputs/sf_0604_xtuner
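The only difference from the original command is the checkpoint argument: the LoRA run saved a DeepSpeed-style checkpoint, so iter_17920.pth is a directory, and the tool wants that directory rather than the mp_rank_00_model_states.pt shard inside it. A minimal sketch of that path fix-up (`resolve_checkpoint_arg` is a hypothetical helper for illustration, not part of xtuner):

```python
from pathlib import Path

def resolve_checkpoint_arg(path_str: str) -> Path:
    """Return the checkpoint path `xtuner convert pth_to_hf` expects.

    If the user pointed at a shard inside a DeepSpeed checkpoint
    directory (e.g. .../iter_17920.pth/mp_rank_00_model_states.pt),
    step back up to the enclosing .pth directory.
    """
    p = Path(path_str)
    # A .pt file whose parent ends in .pth is a shard inside a
    # DeepSpeed checkpoint directory; pass the directory instead.
    if p.suffix == ".pt" and p.parent.suffix == ".pth":
        return p.parent
    return p

print(resolve_checkpoint_arg(
    "work_dirs/run/iter_17920.pth/mp_rank_00_model_states.pt"))
```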
nakoeni commented 3 months ago

How'd you even pre-train llava-phi-3? When I run xtuner train llava_phi3_mini_4k_instruct_clip_vit_large_p14_336_e1_gpu8_pretrain --seed 1024 from the xtuner directory I get this error: "raise FileNotFoundError(f'Cannot find {args.config}') FileNotFoundError: Cannot find llava_phi3_mini_4k_instruct_clip_vit_large_p14_336_e1_gpu8_pretrain"
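That FileNotFoundError usually just means the name passed to `xtuner train` matches neither a built-in config nor a file path. A sketch of how to check, assuming the `list-cfg`/`copy-cfg` subcommands from the xtuner README are available in your install:

```shell
# List built-in configs matching a pattern; the name must match exactly:
xtuner list-cfg -p llava_phi3

# Or copy the config out and pass an explicit path instead of a name
# (copy-cfg appends a `_copy` suffix to the copied file):
xtuner copy-cfg llava_phi3_mini_4k_instruct_clip_vit_large_p14_336_e1_gpu8_pretrain .
xtuner train ./llava_phi3_mini_4k_instruct_clip_vit_large_p14_336_e1_gpu8_pretrain_copy.py --seed 1024
```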

xsx1001 commented 3 months ago

Thanks for the reply! I am also wondering how to continue SFT of the llava models with xtuner. I noticed there are no open-sourced .pth checkpoints after SFT. How can I convert the hf models to .pth?
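For context, the shipped LLaVA fine-tune configs do not consume an HF checkpoint converted back to .pth: the LLM and vision encoder are loaded directly from HuggingFace names, and only the pretrain-stage projector checkpoint is passed in via `pretrained_pth`. A sketch of the relevant config fields (field names as in the xtuner llava configs; the values below are placeholders, not real paths):

```python
# Excerpt-style sketch of a llava fine-tune config.
llm_name_or_path = 'microsoft/Phi-3-mini-4k-instruct'
visual_encoder_name_or_path = 'openai/clip-vit-large-patch14-336'
# Projector weights from your own pretrain run (placeholder path):
pretrained_pth = './work_dirs/<pretrain_run>/iter_XXXX.pth'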