flashinfer-ai/flashinfer — FlashInfer: Kernel Library for LLM Serving
https://flashinfer.ai
Apache License 2.0 · 760 stars · 64 forks
PR #334: bugfix: fix std::max mismatch in #333
Closed by yzh119, 1 week ago