mit-han-lab / llm-awq

[MLSys 2024 Best Paper Award] AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration
MIT License

OpenAI-compatible TinyChat API? #188

Open DiTo97 opened 4 months ago

DiTo97 commented 4 months ago

Is the TinyChat inference server API OpenAI-compatible?
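
For context, this is what OpenAI compatibility would mean in practice: the standard `openai` Python client pointed at a locally hosted server. The sketch below assumes a hypothetical base URL, port, and model name; whether TinyChat's server actually exposes these `/v1` routes is exactly what this issue is asking.

```python
# Minimal sketch of an OpenAI-compatible client call against a local server.
# The base_url, port, and model name are assumptions for illustration only;
# whether TinyChat exposes such /v1 endpoints is the question of this issue.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local TinyChat endpoint
    api_key="not-needed-for-local",       # local servers usually ignore the key
)

response = client.chat.completions.create(
    model="llama-2-7b-awq",  # hypothetical model identifier
    messages=[{"role": "user", "content": "Hello, who are you?"}],
)
print(response.choices[0].message.content)
```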