mit-han-lab/llm-awq
[MLSys 2024 Best Paper Award] AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration
MIT License · 2.08k stars · 150 forks
[Minor] Update README. #168
Closed by ys-2020 3 months ago

ys-2020 commented 3 months ago:
Updated industrial impacts and fixed links in the README.