xsx1001 opened 3 months ago
xtuner convert pth_to_hf ./my_configs/llava_phi3_mini_qlora_clip_vit_large_p14_336_lora_e1_gpu8_finetune_copy.py /mnt/xushixiong/outputs/xtuner/work_dirs/llava_phi3_mini_qlora_clip_vit_large_p14_336_lora_e1_gpu8_finetune_copy/iter_17920.pth ./outputs/sf_0604_xtuner
How did you pre-train llava-phi-3? When I run xtuner train llava_phi3_mini_4k_instruct_clip_vit_large_p14_336_e1_gpu8_pretrain --seed 1024
from the xtuner directory, I get this error:
"raise FileNotFoundError(f'Cannot find {args.config}')
FileNotFoundError: Cannot find llava_phi3_mini_4k_instruct_clip_vit_large_p14_336_e1_gpu8_pretrain"
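For the "Cannot find <config>" error, one thing worth checking is whether the bare config name matches a shipped config exactly. A sketch assuming the stock xtuner CLI (`list-cfg` and `copy-cfg` are its standard subcommands; the exact config name here is taken from the error above):

```shell
# The bare name passed to `xtuner train` must match a built-in config
# exactly, otherwise it raises "Cannot find <config>".
CFG=llava_phi3_mini_4k_instruct_clip_vit_large_p14_336_e1_gpu8_pretrain
if command -v xtuner >/dev/null 2>&1; then
  # List the shipped configs and filter for the llava_phi3 ones
  xtuner list-cfg | grep llava_phi3
  # Alternatively, copy the built-in config out and train from the
  # resulting file path (copy-cfg writes ${CFG}_copy.py):
  xtuner copy-cfg "$CFG" .
  # xtuner train "./${CFG}_copy.py" --seed 1024
else
  echo "xtuner not installed; commands shown for reference only"
fi
```

Training from a copied config file path also makes it easy to edit hyperparameters before launching.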
Thanks for the reply! I am also wondering how to continue SFT of the llava models with xtuner. I noticed there are no open-sourced .pth checkpoints after SFT. How can I convert the HF models to .pth?
I fine-tuned the llava-phi3 model with LoRA, but when I tried to convert the resulting weights, an error occurred.
This is my command: xtuner convert pth_to_hf ./my_configs/llava_phi3_mini_qlora_clip_vit_large_p14_336_lora_e1_gpu8_finetune_copy.py /mnt/xushixiong/outputs/xtuner/work_dirs/llava_phi3_mini_qlora_clip_vit_large_p14_336_lora_e1_gpu8_finetune_copy/iter_17920.pth/mp_rank_00_model_states.pt ./outputs/sf_0604_xtuner
The error looks like this. It seems that converting llava-phi3 LoRA weights is not supported? What can I do?
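One difference between the two commands above is that the second one points at mp_rank_00_model_states.pt inside the iter_17920.pth directory, while the first passes the directory itself. A minimal sketch of that distinction, assuming (hypothetically) that pth_to_hf expects the DeepSpeed checkpoint directory rather than the shard file inside it; `deepspeed_ckpt_arg` is a made-up helper name:

```python
import os
import tempfile

def deepspeed_ckpt_arg(pth_path):
    """Return the checkpoint argument to pass to `xtuner convert pth_to_hf`.

    Assumption: with DeepSpeed ZeRO, iter_XXXX.pth is a directory that
    holds mp_rank_00_model_states.pt, and the directory itself is what
    the converter expects.
    """
    shard = "mp_rank_00_model_states.pt"
    if pth_path.endswith(shard):
        # Strip the shard filename back off so the directory is passed
        return os.path.dirname(pth_path)
    return pth_path

# Simulate the checkpoint layout from the command above
root = tempfile.mkdtemp()
ckpt_dir = os.path.join(root, "iter_17920.pth")
os.makedirs(ckpt_dir)
open(os.path.join(ckpt_dir, "mp_rank_00_model_states.pt"), "w").close()

shard_path = os.path.join(ckpt_dir, "mp_rank_00_model_states.pt")
print(deepspeed_ckpt_arg(shard_path) == ckpt_dir)
```

If the converter does accept the directory, the remaining question is whether the LoRA adapter keys survive in the saved state dict, which is worth inspecting before assuming llava-phi3 LoRA conversion itself is unsupported.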