Alpha-VLLM / LLaMA2-Accessory

An Open-source Toolkit for LLM Development
https://llama2-accessory.readthedocs.io/

The performance of SPHINX Mixtral MoE #168

Closed: zhongshsh closed this issue 4 months ago

zhongshsh commented 4 months ago

What is the performance of SPHINX Mixtral MoE on general VQA benchmarks, such as those reported in https://github.com/Alpha-VLLM/LLaMA2-Accessory/blob/main/SPHINX/SPHINX_paper.pdf ?

gaopengpjlab commented 4 months ago

Please refer to the SPHINX-MoE paper:

https://arxiv.org/pdf/2402.05935.pdf

zhongshsh commented 4 months ago

Many thanks!