mit-han-lab / llm-awq

[MLSys 2024 Best Paper Award] AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration
MIT License

GPU requirements #202

Open kplxwb opened 1 week ago

kplxwb commented 1 week ago

Hello, I have one RTX 4090, six RTX 2080 Ti cards, and four RTX 3080 cards. Is this enough VRAM to meet this project's requirements?
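As a rough back-of-the-envelope check (not an official answer from the maintainers), the dominant VRAM cost is the weight storage, which for AWQ's 4-bit quantization is about half a byte per parameter, versus two bytes per parameter in FP16. The sketch below is a hypothetical estimator; the model sizes and the activation/KV-cache overhead it ignores are assumptions, not figures from this repository:

```python
def weight_memory_gib(n_params_billion: float, bits: int = 4) -> float:
    """Estimate GiB of VRAM needed just to hold the model weights.

    n_params_billion: parameter count in billions (e.g. 7 for a 7B model).
    bits: bits per weight (4 for AWQ INT4, 16 for FP16).
    Activations, KV cache, and quantization scales/zeros are NOT included.
    """
    n_bytes = n_params_billion * 1e9 * bits / 8
    return n_bytes / 2**30


if __name__ == "__main__":
    for size in (7, 13, 70):  # illustrative model sizes, not project requirements
        print(f"{size}B model: "
              f"FP16 ~{weight_memory_gib(size, 16):.1f} GiB, "
              f"AWQ 4-bit ~{weight_memory_gib(size, 4):.1f} GiB")
```

By this estimate a 7B model quantized to 4 bits needs only a few GiB of weights, so a single 24 GiB RTX 4090 would comfortably hold it, while a 70B model at 4 bits (roughly 33 GiB of weights alone) would need to be sharded across several cards. Note that AWQ's TinyChat kernels target newer GPU architectures, so check the project's README for per-card compatibility before relying on the 2080 Ti cards.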