mit-han-lab/llm-awq
[MLSys 2024 Best Paper Award] AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration
MIT License · 2.55k stars · 207 forks
Issue #205: Update README.md
Closed by kentang-mit 4 months ago