flashinfer-ai / flashinfer

FlashInfer: Kernel Library for LLM Serving
https://flashinfer.ai
Apache License 2.0

vLLM support #202

Closed · MikeChenfu closed this issue 2 days ago

MikeChenfu commented 4 months ago

Hello, I see there was a PR for vLLM support, but it has been inactive since February. I wonder whether FlashInfer has a roadmap for vLLM support. Many thanks. @yzh119

yzh119 commented 2 days ago

Sorry for the very late reply.

vLLM has had a FlashInfer backend for a few months now.
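For reference, vLLM selects its attention backend through the `VLLM_ATTENTION_BACKEND` environment variable; setting it to `FLASHINFER` opts into the FlashInfer kernels. A minimal sketch (assumes vLLM and flashinfer are already installed; the model name is only a placeholder):

```shell
# Ask vLLM to use the FlashInfer attention backend instead of its default.
export VLLM_ATTENTION_BACKEND=FLASHINFER

# Then launch the server as usual, e.g.:
# vllm serve meta-llama/Llama-3-8B-Instruct
echo "backend=$VLLM_ATTENTION_BACKEND"
```

If FlashInfer is not installed, vLLM will report an error (or fall back, depending on version) when the server starts, so the echo above is just a sanity check that the variable is set.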