anhnh2002 closed this issue 3 months ago.
I tried vllm v0.5.2 and v0.5.3 and got the same errors.
It seems flashinfer is not installed. Could you check whether flashinfer is installed correctly by running python -c "import flashinfer"? flashinfer is not installed by default with vllm; you need to install it manually, picking the correct version for your environment (https://github.com/flashinfer-ai/flashinfer/releases/tag/v0.1.3).
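For reference, a minimal shell sketch of the check and a manual install from a release wheel. The exact wheel filename below is an illustrative assumption; pick the asset matching your CUDA, PyTorch, and Python versions from the v0.1.3 release page linked above:

```bash
# Verify flashinfer is importable; a clean exit (no ModuleNotFoundError) means it is installed.
python -c "import flashinfer"

# Install a prebuilt wheel from the v0.1.3 release.
# The cu121 / torch2.3 / cp310 filename here is illustrative only;
# choose the asset that matches your environment from
# https://github.com/flashinfer-ai/flashinfer/releases/tag/v0.1.3
pip install https://github.com/flashinfer-ai/flashinfer/releases/download/v0.1.3/flashinfer-0.1.3+cu121torch2.3-cp310-cp310-linux_x86_64.whl
```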
I solved it after installing flashinfer using: https://github.com/vllm-project/vllm/blob/db35186391a2abfc6c91d703527dac20d2488107/Dockerfile#L195
Feel free to reopen the issue if there are more questions.