flashinfer-ai / flashinfer

FlashInfer: Kernel Library for LLM Serving
https://flashinfer.ai
Apache License 2.0

Support torch 2.3 #227

Closed · rkooo567 closed 2 months ago

rkooo567 commented 2 months ago

Hi! Thanks for the awesome library.

vLLM recently upgraded its torch version to 2.3.0, and we are running into issues integrating FlashInfer because there is no wheel built against torch 2.3 yet. Do you have plans for a release soon with torch 2.3 wheels?
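For reference, the mismatch shows up at import time when the installed wheel was compiled against a different torch ABI. A minimal smoke test (just a sketch; the exact error text varies by build):

```python
import torch

print(torch.__version__)  # vLLM now pins 2.3.0

try:
    import flashinfer  # noqa: F401
    print("flashinfer imported OK")
except ImportError as err:
    # A wheel compiled against a different torch ABI typically fails here
    # with an undefined-symbol error.
    print("flashinfer import failed:", err)
```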

yzh119 commented 2 months ago

Thank you for bringing this up; we are running the release scripts now: https://github.com/flashinfer-ai/flashinfer/actions/runs/8920852220

rkooo567 commented 2 months ago

Thanks for the update! I think we are very close to supporting FlashInfer in vLLM, and we are very excited about it!

yzh119 commented 2 months ago

Done in https://github.com/flashinfer-ai/flashinfer/releases/tag/v0.0.4
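After installing the wheel that matches your torch/CUDA build, a quick sanity check is a single decode-attention call. A minimal sketch, assuming a CUDA device, fp16 inputs, and the `single_decode_with_kv_cache` entry point:

```python
import torch
import flashinfer

# Single-query decode attention smoke test. Shapes follow the
# q: [num_heads, head_dim], k/v: [kv_len, num_kv_heads, head_dim]
# convention; fp16 tensors on a CUDA device.
num_heads, head_dim, kv_len = 32, 128, 1024
q = torch.randn(num_heads, head_dim, dtype=torch.float16, device="cuda")
k = torch.randn(kv_len, num_heads, head_dim, dtype=torch.float16, device="cuda")
v = torch.randn(kv_len, num_heads, head_dim, dtype=torch.float16, device="cuda")

out = flashinfer.single_decode_with_kv_cache(q, k, v)
print(out.shape)  # expected: torch.Size([32, 128])
```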