Alpha-VLLM / LLaMA2-Accessory

An Open-source Toolkit for LLM Development
https://llama2-accessory.readthedocs.io/

Fix quantization for sphinx, sphinx-1k and sphinx-2k #113

Closed · linziyi96 closed this 1 year ago

linziyi96 commented 1 year ago

Skip the visual backbones during quantization.
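A minimal sketch of the approach described in this PR: quantize only the language-model linear layers and leave the visual backbones in full precision. The module-name prefixes ("visual", "clip", "openclip") and the use of bitsandbytes NF4 layers here are assumptions for illustration, not the repository's exact implementation.

```python
import torch.nn as nn
import bitsandbytes as bnb


def quantize_skipping_visual(model: nn.Module,
                             skip_prefixes=("visual", "clip", "openclip")):
    """Replace nn.Linear layers with 4-bit layers, skipping visual backbones."""
    for name, module in list(model.named_modules()):
        if not isinstance(module, nn.Linear):
            continue
        # Skip any linear that lives under a visual backbone
        # (prefix names are hypothetical).
        if any(part.startswith(skip_prefixes) for part in name.split(".")):
            continue
        qlinear = bnb.nn.Linear4bit(
            module.in_features, module.out_features,
            bias=module.bias is not None, quant_type="nf4",
        )
        # Wrap the original weights; actual quantization happens when the
        # module is moved to the GPU.
        qlinear.weight = bnb.nn.Params4bit(
            module.weight.data, requires_grad=False, quant_type="nf4",
        )
        if module.bias is not None:
            qlinear.bias = module.bias
        # Re-attach the quantized layer at the same position in the module tree.
        parent = model.get_submodule(name.rsplit(".", 1)[0]) if "." in name else model
        setattr(parent, name.rsplit(".", 1)[-1], qlinear)
    return model
```

Filtering on module names keeps the CLIP-style visual encoders in their original precision, which avoids the accuracy loss that motivated this fix while still shrinking the language-model weights.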

linziyi96 commented 1 year ago

#97 fixed.