bentoml / OpenLLM

Run any open-source LLM, such as Llama or Mistral, as an OpenAI-compatible API endpoint in the cloud.
https://bentoml.com
Apache License 2.0

Fresh install complains about vllm-flash-attn==2.5.9.post1 dependency being unsatisfiable #1055

Closed. andrewjwaggoner closed this issue 3 months ago.

andrewjwaggoner commented 3 months ago

Hello, I'm struggling to get a clean install running. I tried this with both Python 3.10 and 3.11, after not seeing a vllm-flash-attn package available for 3.12. Here's my version information:

openllm -v
openllm, 0.6.7
Python (CPython) 3.11.9
pip 24.1.2

Steps to reproduce on a fresh install:

python3.11 -m venv ai
source ai/bin/activate
pip install openllm
(ai) user@wallflower ~/.venv/ai $ openllm hello
  Detected Platform: linux
  Detected Accelerators: 
   - NVIDIA GeForce RTX 4090 24GB
? Select a model gemma     default  Yes
? Select a version gemma:2b       Yes
? Select an action 0. Run the model in terminal
Installing model dependencies(/home/user/.openllm/venv/712292760116542944)...

$ python -m uv venv /home/user/.openllm/venv/712292760116542944
Using Python 3.12.3 interpreter at: /usr/lib/python-exec/python3.12/python3
Creating virtualenv at: /home/user/.openllm/venv/712292760116542944
Activate with: source /home/user/.openllm/venv/712292760116542944/bin/activate
$ python -m uv pip install -p /home/user/.openllm/venv/712292760116542944/bin/python bentoml
Resolved 76 packages in 22ms
Installed 76 packages in 35ms
 + aiohappyeyeballs==2.3.4
 + aiohttp==3.10.0
 + aiosignal==1.3.1
 + aiosqlite==0.20.0
 + annotated-types==0.7.0
 + anyio==4.4.0
 + appdirs==1.4.4
 + asgiref==3.8.1
 + attrs==23.2.0
 + bentoml==1.3.1
 + cattrs==23.1.2
 + certifi==2024.7.4
 + circus==0.18.0
 + click==8.1.7
 + click-option-group==0.5.6
 + cloudpickle==3.0.0
 + deepmerge==1.1.1
 + deprecated==1.2.14
 + frozenlist==1.4.1
 + fs==2.4.16
 + h11==0.14.0
 + httpcore==1.0.5
 + httpx==0.27.0
 + httpx-ws==0.6.0
 + idna==3.7
 + importlib-metadata==6.11.0
 + inflection==0.5.1
 + inquirerpy==0.3.4
 + jinja2==3.1.4
 + markdown-it-py==3.0.0
 + markupsafe==2.1.5
 + mdurl==0.1.2
 + multidict==6.0.5
 + numpy==2.0.1
 + nvidia-ml-py==11.525.150
 + opentelemetry-api==1.20.0
 + opentelemetry-instrumentation==0.41b0
 + opentelemetry-instrumentation-aiohttp-client==0.41b0
 + opentelemetry-instrumentation-asgi==0.41b0
 + opentelemetry-sdk==1.20.0
 + opentelemetry-semantic-conventions==0.41b0
 + opentelemetry-util-http==0.41b0
 + packaging==24.1
 + pathspec==0.12.1
 + pfzy==0.3.4
 + pip-requirements-parser==32.0.1
 + prometheus-client==0.20.0
 + prompt-toolkit==3.0.47
 + psutil==6.0.0
 + pydantic==2.8.2
 + pydantic-core==2.20.1
 + pygments==2.18.0
 + pyparsing==3.1.2
 + python-dateutil==2.9.0.post0
 + python-json-logger==2.0.7
 + python-multipart==0.0.9
 + pyyaml==6.0.1
 + pyzmq==26.0.3
 + rich==13.7.1
 + schema==0.7.7
 + setuptools==72.1.0
 + simple-di==0.1.5
 + six==1.16.0
 + sniffio==1.3.1
 + starlette==0.38.2
 + tomli-w==1.0.0
 + tornado==6.4.1
 + typing-extensions==4.12.2
 + uv==0.2.33
 + uvicorn==0.30.5
 + watchfiles==0.22.0
 + wcwidth==0.2.13
 + wrapt==1.16.0
 + wsproto==1.2.0
 + yarl==1.9.4
 + zipp==3.19.2

$ python -m uv pip install -p /home/user/.openllm/venv/712292760116542944/bin/python -r /home/user/.openllm/venv/712292760116542944/requirements.txt
  × No solution found when resolving dependencies:
  ╰─▶ Because vllm-flash-attn==2.5.9.post1 has no wheels with a matching Python ABI tag and you require vllm-flash-attn==2.5.9.post1, we can
      conclude that the requirements are unsatisfiable.
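
The error above means the resolver found no vllm-flash-attn wheel whose Python/ABI tag matches the interpreter it is installing for (note the internal venv was created with Python 3.12.3, not the 3.11 of the outer virtualenv). A minimal stdlib sketch of that tag check, using an illustrative wheel filename (the real file list should be verified on PyPI):

```python
import sys

# Illustrative wheel filename (assumed; check the vllm-flash-attn
# files on PyPI for the actual published wheels).
wheel = "vllm_flash_attn-2.5.9.post1-cp311-cp311-manylinux1_x86_64.whl"

# Wheel filename layout: name-version[-build]-pythontag-abitag-platformtag.whl
python_tag, abi_tag, platform_tag = wheel[:-4].split("-")[-3:]

# Tag of the interpreter the resolver installs for, e.g. "cp312" on 3.12,
# which matches none of the published wheels and triggers the error above.
interp_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"

print(f"wheel is tagged {python_tag}-{abi_tag}-{platform_tag}")
print(f"interpreter tag is {interp_tag}; compatible: {python_tag == interp_tag}")
```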

bojiang commented 3 months ago

@andrewjwaggoner It seems your platform is not x86_64 Linux. Would you like to share more about your environment?

andrewjwaggoner commented 3 months ago

Hello, it should be.


user@wallflower ~/.config/sway $ uname -a
Linux wallflower 6.6.38-gentoo #2 SMP PREEMPT_DYNAMIC Sat Jul 13 15:24:11 EDT 2024 x86_64 AMD Ryzen 9 7950X3D 16-Core Processor AuthenticAMD GNU/Linux
user@wallflower ~/.config/sway $ uname -r
6.6.38-gentoo
user@wallflower ~/.config/sway $ lsb_release -a
LSB Version:    n/a
Distributor ID: Gentoo
Description:    Gentoo Linux
Release:    2.15
Codename:   n/a

If we think it's a wonky environment, maybe I should try it on a machine with a different distro.
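
One detail worth noting in the transcript: although the outer virtualenv was created with python3.11, uv built the internal venv with the system interpreter ("Using Python 3.12.3"). As a hedged workaround sketch (the path is illustrative, and this is not OpenLLM's own behavior), uv accepts a -p/--python flag to pin the interpreter it uses:

```shell
# Recreate a venv with an explicitly pinned interpreter version;
# uv's -p/--python flag selects which Python to use.
python -m uv venv -p 3.11 /tmp/openllm-venv-311
/tmp/openllm-venv-311/bin/python --version
```

This does not change which interpreter OpenLLM selects for its internal venvs, but it confirms whether a 3.11 interpreter is visible to uv on the system.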

bojiang commented 3 months ago

@andrewjwaggoner This should be fixed now. Running openllm repo update should do the trick.

kishanios123 commented 3 months ago

@andrewjwaggoner This should be fixed now. Running openllm repo update should do the trick.

I was unable to solve this with openllm repo update. Is there any other solution?

sayakmisra commented 3 months ago

I'm having the same issue. @bojiang, running openllm repo update didn't help.

kishanios123 commented 3 months ago

I'm having the same issue. @bojiang, running openllm repo update didn't help.

I fixed this by creating a new env with Python 3.9. Try that.
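
For anyone landing here, a small pre-flight check along the lines of that workaround can save a failed resolve. The supported-tag set below is an assumption (the error earlier in the thread only proves that 3.12 is unsupported); verify it against the wheels actually published on PyPI:

```python
import sys

# Assumed set of CPython tags with prebuilt vllm-flash-attn 2.5.9.post1
# wheels; verify against PyPI before relying on this.
WHEEL_PY_TAGS = {"cp38", "cp39", "cp310", "cp311"}

# Tag of the current interpreter, e.g. "cp39" for Python 3.9.
this_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"

if this_tag in WHEEL_PY_TAGS:
    print(f"{this_tag}: a prebuilt wheel should be available")
else:
    print(f"{this_tag}: no prebuilt wheel; try an older interpreter (e.g. 3.9)")
```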