mit-han-lab / llm-awq
[MLSys 2024 Best Paper Award] AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration
MIT License · 2.38k stars · 184 forks
upload model_worker_new.py #182
Closed · ys-2020 closed this 4 months ago

ys-2020 commented 4 months ago
upload model_worker_new.py to fix #180