mbzuai-oryx / LLaVA-pp

🔥🔥 LLaVA++: Extending LLaVA with Phi-3 and LLaMA-3 (LLaVA LLaMA-3, LLaVA Phi-3)

How can I continue training on the pre-trained llava-llama3 model instead of training llama3 directly with my own data? #35

Open ganliqiang opened 1 month ago

ganliqiang commented 1 month ago

How can I continue training on the pre-trained llava-llama3 model with my own data, instead of training from the base llama3 weights? In my experiments, when I continue training from the llava-llama3 checkpoint, the generated results are garbled. How can I solve this issue? Thank you.
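One plausible way to attempt this, sketched below under assumptions: LLaVA-pp follows the upstream LLaVA training setup, so continuing training would mean pointing `--model_name_or_path` at the pretrained LLaVA checkpoint instead of the base LLaMA-3 weights, while keeping the conversation template (`--version`) consistent with the one used during pretraining. A mismatched template or tokenizer is a common cause of garbled generations. The entry point, template name, paths, and hyperparameters here are all illustrative, not a verified recipe from the maintainers.

```shell
# Hypothetical launch for continued training from the pretrained
# llava-llama3 checkpoint (not the base LLaMA-3 model).
# Assumes the LLaVA-style entry point llava/train/train_mem.py;
# every path, checkpoint name, and flag value is illustrative.
deepspeed llava/train/train_mem.py \
    --deepspeed ./scripts/zero3.json \
    --model_name_or_path /path/to/llava-llama-3-8b \
    --version llama3 \
    --data_path /path/to/custom_data.json \
    --image_folder /path/to/images \
    --vision_tower openai/clip-vit-large-patch14-336 \
    --bf16 True \
    --output_dir ./checkpoints/llava-llama3-custom-ft
```

If the `--version` template used here differs from the one the checkpoint was trained with, the model's prompt formatting will not match what it saw during pretraining, which could explain garbled outputs.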

ganliqiang commented 1 month ago

@mmaaz60