hiyouga / LLaMA-Factory

Unify Efficient Fine-Tuning of 100+ LLMs
Apache License 2.0

How to pre-train Llava1.5 from vicuna1.5? #4440

Closed yuzhms closed 4 days ago

yuzhms commented 5 days ago

Reminder

System Info

NA

Reproduction

NA

Expected behavior

No response

Others

How can I conduct the feature alignment stage (Pretrain: https://github.com/haotian-liu/LLaVA/tree/main?tab=readme-ov-file#pretrain-feature-alignment) of LLaVA 1.5 in LLaMA-Factory? Are there any example configs?

hiyouga commented 4 days ago

This feature is not yet supported.