flashinfer-ai / flashinfer — FlashInfer: Kernel Library for LLM Serving
https://flashinfer.ai · Apache License 2.0 · 760 stars · 64 forks
perf: change minimal `kv_chunk_size` back to 128 · #329
Closed — yzh119 closed this 1 week ago