mit-han-lab / llm-awq

[MLSys 2024 Best Paper Award] AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration
MIT License
2.56k stars · 208 forks

Update Helpful links #208

Closed · ys-2020 closed this 4 months ago

ys-2020 commented 4 months ago

@kentang-mit: We updated the helpful links in this PR. Could you please review it? Thanks.