flashinfer-ai / flashinfer

FlashInfer: Kernel Library for LLM Serving
https://flashinfer.ai
Apache License 2.0

Lacks prebuilt wheel for PyTorch 2.3 + cu118 #314

Closed: heheda12345 closed this 2 weeks ago

heheda12345 commented 2 weeks ago

This link from the Installation Guide returns a 404: https://flashinfer.ai/whl/cu118/torch2.3/. Could you upload the prebuilt wheel?
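For reference, the matching pip command from the Installation Guide fails with the same 404 (the package name and `-i` index layout are assumed to match the other published wheel indexes):

```bash
# Install a prebuilt FlashInfer wheel for CUDA 11.8 + PyTorch 2.3.
# This currently fails because the index URL above returns 404.
pip install flashinfer -i https://flashinfer.ai/whl/cu118/torch2.3/
```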

yzh119 commented 2 weeks ago

Thanks for reporting; I'll fix it in v0.0.5.

yzh119 commented 2 weeks ago

The website has been fixed, and the v0.0.5 release is building: https://github.com/flashinfer-ai/flashinfer/actions/runs/9594555333. The wheel should be available soon.
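Once the build finishes, the wheel should be installable from the cu118/torch2.3 index. A quick sanity check might look like this (the `==0.0.5` pin and the `__version__` attribute are assumptions, not confirmed by this thread):

```bash
# Install the v0.0.5 wheel from the cu118 / torch2.3 index once it is published.
pip install flashinfer==0.0.5 -i https://flashinfer.ai/whl/cu118/torch2.3/
# Verify the package imports and report its version.
python -c "import flashinfer; print(flashinfer.__version__)"
```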