Alpha-VLLM / LLaMA2-Accessory

An Open-source Toolkit for LLM Development
https://llama2-accessory.readthedocs.io/

Fix quantization for sphinx, sphinx-1k and sphinx-2k #113

Closed linziyi96 closed 8 months ago

linziyi96 commented 8 months ago

Skip the visual backbones during quantization.
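Conceptually, the change amounts to excluding the vision-encoder modules when converting the model's linear layers to their quantized counterparts, so that only the language-model weights are quantized. The sketch below is a minimal illustration of that idea, not the repository's actual code: the module-name prefixes and the `quantize_fn` callable are hypothetical stand-ins for whatever visual backbones and quantized-linear implementation SPHINX uses.

```python
import torch.nn as nn

# Hypothetical name prefixes identifying visual backbone submodules.
VISUAL_PREFIXES = ("visual", "clip", "qformer")


def quantize_linear_layers(model: nn.Module, quantize_fn) -> None:
    """Replace nn.Linear layers with quantized versions produced by
    quantize_fn, skipping any layer that lives under a visual backbone."""
    for name, module in list(model.named_modules()):
        # Skip modules whose dotted path contains a visual-backbone prefix,
        # so the vision encoders keep their original (unquantized) weights.
        if any(part in VISUAL_PREFIXES for part in name.split(".")):
            continue
        for child_name, child in list(module.named_children()):
            if isinstance(child, nn.Linear):
                setattr(module, child_name, quantize_fn(child))


# Usage (names are illustrative only):
# model = load_sphinx_checkpoint(...)            # hypothetical loader
# quantize_linear_layers(model, to_4bit_linear)  # hypothetical quantizer
```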

linziyi96 commented 8 months ago

#97 fixed.