kvcache-ai/ktransformers
A Flexible Framework for Experiencing Cutting-edge LLM Inference Optimizations
Apache License 2.0
[fix] f16 dequantize device ignored #51
Closed
molamooo closed this 3 weeks ago