BAAI-DCAI / Bunny

A family of lightweight multimodal models.
Apache License 2.0
865 stars · 65 forks

LLaMA3 or LLaMA3-Instruct #47

Closed · berry-ding closed this issue 2 months ago

berry-ding commented 4 months ago

Great work! I want to know if your pre-training used LLaMA 3 or LLaMA 3-Instruct.

BAAI-DCAI commented 4 months ago

LLaMA 3 Base Model. Thanks!

GewelsJI commented 4 months ago

Hey, @BAAI-DCAI team,

Any experience to share on the difference between these two versions? Why choose the Llama 3 base model rather than the instruct-tuned model?

Isaachhh commented 4 months ago

Our primary experiments are based on Llama-3-8B. We then found that using the instruct-tuned model works better, so we have now updated the weights to be based on Llama-3-8B-Instruct.

berry-ding commented 4 months ago

> Our primary experiments are based on Llama-3-8B. And we then find that using instruct-tuned model would be better. Now we have updated the weights based on Llama-3-8B-Instruct.

Great news! Looking forward to your release of the fine-tuning strategies.

GewelsJI commented 4 months ago

@Isaachhh

Do you have plans to go further with Phi-3?

Isaachhh commented 4 months ago

@GewelsJI

Please refer to https://huggingface.co/BAAI/Bunny-v1_0-4B.

The GitHub repo will be updated soon, and we are still working on improving the performance of Bunny-Llama-3-8B-V and Bunny-v1.0-4B. Stay tuned!

GewelsJI commented 4 months ago

That's awesome. We'll keep an eye on your updates. Thanks.

GewelsJI commented 3 months ago

@Isaachhh

A further question: do you have any plans to support Gemma models in your codebase?

Isaachhh commented 3 months ago

@GewelsJI Hi, we conducted some experiments with Bunny-Gemma in mid-March, and I uploaded the related code to the gemma_temp branch. Note that both the version and conv_mode should be gemma.

But we can't guarantee that it works well right now, and we may not release the model weights in the near future.

Hope this can help you. Feel free to comment if you have further questions.
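The `conv_mode` switch mentioned above can be sketched as a simple template lookup. This is only an illustration of the mechanism; the template strings below follow the published Gemma and Llama 3 chat formats but are assumptions here, not Bunny's actual prompt templates.

```python
# Illustrative conversation templates keyed by conv_mode. The exact
# strings Bunny uses are an assumption; these mirror the public Gemma
# and Llama 3 chat formats for demonstration only.
CONV_TEMPLATES = {
    "gemma": "<start_of_turn>user\n{prompt}<end_of_turn>\n<start_of_turn>model\n",
    "llama": "<|start_header_id|>user<|end_header_id|>\n\n{prompt}<|eot_id|>",
}

def build_prompt(conv_mode: str, prompt: str) -> str:
    """Format a user prompt with the template selected by conv_mode."""
    if conv_mode not in CONV_TEMPLATES:
        raise ValueError(f"unknown conv_mode: {conv_mode}")
    return CONV_TEMPLATES[conv_mode].format(prompt=prompt)
```

Picking the wrong `conv_mode` would feed the model a prompt format it was never trained on, which is why the comment stresses setting it to `gemma`.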

galleon commented 3 months ago

@Isaachhh I would like to fine-tune Bunny-Llama-3-8B-V on some of my own data. Can I use the existing train.py file, or should I wait for the improved ViT strategy you mentioned in the README.md?

Thanks for your work.

lucasjinreal commented 3 months ago

They actually unfroze the ViT in both pre-training and SFT but didn't open-source the recipe.

Isaachhh commented 3 months ago

> They opened vit in both pretrain and sft actually but didn't opensource the recipe.

The strategy only differs in the visual instruction tuning stage; the vision tower was frozen during the pre-training stage.
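The stage-dependent strategy described here (vision tower frozen for pre-training, unfrozen for visual instruction tuning) amounts to toggling `requires_grad` on the vision tower's parameters. A minimal PyTorch sketch, where the `vision_tower` attribute name and the dummy model are assumptions for illustration:

```python
import torch.nn as nn

def set_vision_tower_trainable(model: nn.Module, trainable: bool) -> None:
    """Freeze or unfreeze every parameter of the vision tower."""
    for p in model.vision_tower.parameters():
        p.requires_grad = trainable

# Dummy model standing in for the real multimodal architecture.
class DummyMultimodal(nn.Module):
    def __init__(self):
        super().__init__()
        self.vision_tower = nn.Linear(4, 4)    # stand-in for the ViT
        self.language_model = nn.Linear(4, 4)  # stand-in for the LLM

model = DummyMultimodal()
set_vision_tower_trainable(model, False)  # pre-training: ViT frozen
assert not any(p.requires_grad for p in model.vision_tower.parameters())
set_vision_tower_trainable(model, True)   # instruction tuning: ViT unfrozen
```

Parameters with `requires_grad=False` accumulate no gradients, so the optimizer leaves the vision tower untouched during the frozen stage.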

galleon commented 3 months ago

I think I was able to fine-tune the adapter starting from the phi-2 pretrain weights. Any plans to release those weights for Phi-3 and Llama 3?

Isaachhh commented 3 months ago

@galleon

We have released them.

galleon commented 3 months ago

@Isaachhh I was not able to find them on 🤗. I am talking about BAAI/bunny-pretrain-phi-2-siglip, but for Phi-3 or Llama 3. Maybe it can be extracted from the full model? How?


Isaachhh commented 3 months ago

@galleon

Training details of model zoo

(Screenshot, 2024-05-08: training details of the model zoo)

galleon commented 3 months ago

Thanks for adding that!


berry-ding commented 3 months ago

@Isaachhh Hi, do you have any plans to release a high-resolution LLaMA 3-based model?

Isaachhh commented 3 months ago

@berry-ding Thanks for your interest. It's coming in the following weeks, stay tuned!

Isaachhh commented 2 months ago

@berry-ding Hi, we have released Bunny-v1.1-Llama-3-8B-V, which supports 1152x1152 resolution.
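One likely reason 1152 was chosen: it is exactly 3x the 384x384 input resolution of the SigLIP vision encoder Bunny uses, so a high-resolution image divides cleanly into a grid of encoder-sized crops. The exact high-resolution scheme Bunny-v1.1 uses is not described in this thread, so the tiling arithmetic below is only an illustration under that assumption:

```python
def tile_grid(image_size: int, crop_size: int = 384):
    """Return (crops per side, total crops) when splitting a square
    image into non-overlapping crop_size x crop_size tiles.

    crop_size defaults to 384, SigLIP's input resolution; the tiling
    itself is an illustrative assumption, not Bunny's documented scheme.
    """
    if image_size % crop_size != 0:
        raise ValueError("image size must be a multiple of the crop size")
    per_side = image_size // crop_size
    return per_side, per_side * per_side

# 1152 // 384 = 3, so a 1152x1152 image splits into a 3x3 grid of 9 crops.
```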

Isaachhh commented 2 months ago

Closing the issue for now, as there is no further discussion. Feel free to reopen it if you have any other questions.