Closed: rkooo567 closed this issue 2 months ago
Thank you for bringing this up, we are running the release scripts: https://github.com/flashinfer-ai/flashinfer/actions/runs/8920852220
Thanks for the update! I think we are very close to supporting FlashInfer in vLLM, and we are very excited about it!
Hi! Thanks for the awesome library.
vLLM recently upgraded its torch version to 2.3.0, and we are running into issues trying to integrate FlashInfer because there is no wheel built against torch 2.3 yet. Do you have any plans for a release soon with wheels built against torch 2.3?
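To illustrate why the wheel matters: binary wheels are compiled against a specific torch ABI, so an integration typically needs to gate the optional backend on the installed torch version. The sketch below is a hypothetical helper, assuming a wheel pinned to torch 2.3; the function names are illustrative and not part of vLLM's or FlashInfer's actual API.

```python
# Hypothetical sketch: check the running torch version against the torch
# version a prebuilt wheel was compiled for. Names here are illustrative.

def parse_version(v: str) -> tuple:
    """Turn a version string like '2.3.0' into a comparable tuple of ints."""
    # Drop local version suffixes such as '+cu121' before parsing.
    core = v.split("+")[0]
    return tuple(int(part) for part in core.split(".") if part.isdigit())

def flashinfer_wheel_compatible(torch_version: str, wheel_torch: str = "2.3") -> bool:
    """True if the running torch matches the torch the wheel was built against."""
    tv = parse_version(torch_version)
    wv = parse_version(wheel_torch)
    # Compare only the major.minor components the wheel pins.
    return tv[: len(wv)] == wv

print(flashinfer_wheel_compatible("2.3.0"))        # → True
print(flashinfer_wheel_compatible("2.2.1+cu121"))  # → False
```

In practice one would read the running version from `torch.__version__` and fall back to another attention backend when the check fails, rather than crashing at import time.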