rxtan2 / Koala-video-llm

BSD 3-Clause "New" or "Revised" License

Clarification on model weights for EgoSchema eval #8

Open yukw777 opened 2 months ago

yukw777 commented 2 months ago

Could you specify the exact model used for the EgoSchema eval? The paper states that the LLM backbone used for EgoSchema is LLaMA-2, but the README states that Vicuna weights were used. If LLaMA-2 was indeed used for the EgoSchema eval, I'm assuming llama-2-7b-chat-hf and the corresponding minigpt4 weights were used (the minigpt4 weights linked in the README appear to be for Vicuna 13b v0). Does this also mean that the provided pre-trained checkpoint is for llama-2-7b-chat-hf?

rxtan2 commented 1 month ago

Yes, sorry, I will update the README to reflect this. In the meantime, please use 'llama-2-7b-chat-hf'. Thank you very much!
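
For reference, a minimal sketch of pointing the backbone at llama-2-7b-chat-hf via Hugging Face transformers. The hub ID `meta-llama/Llama-2-7b-chat-hf` is an assumption (gated on the Hub); the repo's own config may instead expect a local path to downloaded weights rather than a hub ID.

```python
# Sketch: load the LLaMA-2 7B chat backbone with transformers.
# Assumes the gated hub ID "meta-llama/Llama-2-7b-chat-hf"; substitute a local
# path if the Koala config expects the weights on disk instead.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"  # assumed hub ID for llama-2-7b-chat-hf
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")
```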