Aaronhuang-778 / BiLLM

(ICML 2024) BiLLM: Pushing the Limit of Post-Training Quantization for LLMs
https://arxiv.org/abs/2402.04291
MIT License

Looking forward to supporting Mixtral_8x7b MoE #10

Open Gierry opened 3 months ago

Gierry commented 3 months ago

Looking forward to support for Mixtral_8x7b MoE.

Gierry commented 2 months ago

Also looking forward to support for Qwen1.5.