flashinfer-ai/flashinfer — FlashInfer: Kernel Library for LLM Serving
https://flashinfer.ai · Apache License 2.0 · 1.48k stars · 147 forks
Issue #606: doc: improve the docstring of `append_paged_kv_cache`
Status: Closed (closed by yzh119 2 weeks ago)

yzh119 commented 2 weeks ago:
Remove unnecessary note.