bentoml / OpenLLM

Run any open-source LLM, such as Llama or Mistral, as an OpenAI-compatible API endpoint in the cloud.
https://bentoml.com
Apache License 2.0
10.11k stars · 640 forks

Struggling to launch OpenLLM with error "No module named bentoml" #1053

Closed. andrewjwaggoner closed this issue 3 months ago.

andrewjwaggoner commented 3 months ago

I'm trying to do a fresh setup of OpenLLM on my Linux box. Here's what I'm seeing:

python3.11 -m venv ai
source ai/bin/activate
pip install openllm

(ai) user@wallflower ~/.venv $ openllm repo update
(ai) user@wallflower ~/.venv $ openllm hello
  Detected Platform: linux
  Detected Accelerators: 
   - NVIDIA GeForce RTX 4090 24GB
? Select a model gemma     default  Yes
? Select a version gemma:2b       Yes
? Select an action 0. Run the model in terminal

$ export BENTOML_HOME=/home/user/.openllm/repos/github.com/bentoml/openllm-models/main/bentoml
$ source /home/user/.openllm/venv/608482142055440317/bin/activate
$ bentoml serve gemma:2b-instruct-fp16-f020 --port 33009
Model server started 2459797
Model loading...
/home/user/.openllm/venv/608482142055440317/bin/python: No module named bentoml

Here's the version information:

openllm -v
openllm, 0.6.6
Python (CPython) 3.11.9
pip 24.1.2

I'm not sure what to look at next to debug this. Anyone have any advice?
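When a "No module named X" error comes from a venv that a tool manages itself, one quick sanity check is to ask that venv's interpreter directly whether it can see the package. A small diagnostic sketch (the `can_import` helper is mine, not part of openllm):

```shell
# can_import: succeed if the given interpreter can import the module.
#   $1 = path to a python interpreter, $2 = module name
can_import() {
  "$1" -c "import $2" 2>/dev/null
}

# Example, using the venv path from the log above:
# can_import /home/user/.openllm/venv/608482142055440317/bin/python bentoml \
#   && echo "bentoml importable" || echo "bentoml missing"
```

If the import fails there, the per-model venv itself is broken, regardless of what is installed in the outer `ai` venv.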

bojiang commented 3 months ago

Hi. This should be fixed in 0.6.7.

Please update your openllm, then run:

openllm clean venvs
openllm hello
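For reference, the full recovery sequence might look like this, assuming openllm was installed with pip into the active venv as in the original report (the `pip install --upgrade` line is my assumption; the two `openllm` commands are from the comment above):

```shell
# Sketch of the fix; exact version numbers may differ.
pip install --upgrade openllm   # pick up 0.6.7 or newer
openllm -v                      # confirm the upgrade took effect
openllm clean venvs             # remove the stale per-model venvs
openllm hello                   # rebuild venvs and relaunch the wizard
```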

andrewjwaggoner commented 3 months ago

This got me further. Thanks.