flashinfer-ai/flashinfer
FlashInfer: Kernel Library for LLM Serving
https://flashinfer.ai
Apache License 2.0
doc: fix logits cap docstring
#300
Closed
yzh119 closed this 3 weeks ago

yzh119 commented 3 weeks ago
Follow-up of #299: change "pre-attention" to "pre-softmax" in the logits cap docstring.
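The wording matters because the soft cap is applied to the attention logits after the query-key dot product but before the softmax, not before the attention computation as a whole. A minimal NumPy sketch of this ordering (the function name, shapes, and default cap value are illustrative, not FlashInfer's API):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_soft_cap(q, k, v, logits_soft_cap=30.0):
    # q: (num_q, d), k/v: (num_kv, d). Scaled dot-product logits.
    scale = 1.0 / np.sqrt(q.shape[-1])
    logits = (q @ k.T) * scale
    # Pre-softmax soft cap: tanh squashes each logit into
    # (-logits_soft_cap, logits_soft_cap) before the softmax is taken.
    logits = logits_soft_cap * np.tanh(logits / logits_soft_cap)
    return softmax(logits) @ v
```

The tanh form bounds the logits smoothly, so very large dot products cannot saturate the softmax, while small logits pass through nearly unchanged.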