Aaronhuang-778 / BiLLM

(ICML 2024) BiLLM: Pushing the Limit of Post-Training Quantization for LLMs
https://arxiv.org/abs/2402.04291
MIT License
155 stars · 12 forks

Update README.md #3

Closed · eltociear closed this 4 months ago

eltociear commented 4 months ago

Dependences -> Dependencies

htqin commented 4 months ago

Thanks for the fix!