meta-llama / llama-models

Utilities intended for use with Llama models.

[LLAMA 502] llama command is not recognized #187

Open Seemachauhan13 opened 1 month ago

Seemachauhan13 commented 1 month ago

I ran `pip install llama-stack` following the README. Although it installed successfully, `llama model list --show-all` fails with `llama: command not found`. I need this resolved to proceed further:

Linux terminal:~/llama-stack$ pip install llama-stack
Requirement already satisfied: llama-stack in /home/seema1/.local/lib/python3.8/site-packages (0.0.1a5)
Requirement already satisfied: httpx<1,>=0.23.0 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (0.27.2)
Requirement already satisfied: pydantic<3,>=1.9.0 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (2.9.2)
Requirement already satisfied: distro<2,>=1.7.0 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (1.9.0)
Requirement already satisfied: sniffio in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (1.3.1)
Requirement already satisfied: anyio<5,>=3.5.0 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (4.5.2)
Requirement already satisfied: typing-extensions<5,>=4.7 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (4.12.2)
Requirement already satisfied: idna in /usr/lib/python3/dist-packages (from httpx<1,>=0.23.0->llama-stack) (2.8)
Requirement already satisfied: httpcore==1. in /home/seema1/.local/lib/python3.8/site-packages (from httpx<1,>=0.23.0->llama-stack) (1.0.6)
Requirement already satisfied: certifi in /usr/lib/python3/dist-packages (from httpx<1,>=0.23.0->llama-stack) (2019.11.28)
Requirement already satisfied: pydantic-core==2.23.4 in /home/seema1/.local/lib/python3.8/site-packages (from pydantic<3,>=1.9.0->llama-stack) (2.23.4)
Requirement already satisfied: annotated-types>=0.6.0 in /home/seema1/.local/lib/python3.8/site-packages (from pydantic<3,>=1.9.0->llama-stack) (0.7.0)
Requirement already satisfied: exceptiongroup>=1.0.2; python_version < "3.11" in /home/seema1/.local/lib/python3.8/site-packages (from anyio<5,>=3.5.0->llama-stack) (1.2.2)
Requirement already satisfied: h11<0.15,>=0.13 in /home/seema1/.local/lib/python3.8/site-packages (from httpcore==1.->httpx<1,>=0.23.0->llama-stack) (0.14.0)
Linux terminal:~/llama-stack$ llama model list --show-all
llama: command not found
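For what it's worth, the pip output above shows the package landing in /home/seema1/.local/lib/python3.8/site-packages, i.e. a user-site install under Python 3.8. A `command not found` in that situation often just means pip's user script directory (typically ~/.local/bin) is not on PATH. A minimal sketch of that check (the helper name is mine, not part of llama-stack):

```python
import os
import site


def scripts_dir_on_path(path_env: str, scripts_dir: str) -> bool:
    """Return True if scripts_dir is one of the entries in the given PATH string."""
    return scripts_dir in path_env.split(os.pathsep)


if __name__ == "__main__":
    # Where pip places console scripts for a --user install.
    user_scripts = os.path.join(site.USER_BASE, "bin")
    on_path = scripts_dir_on_path(os.environ.get("PATH", ""), user_scripts)
    print(f"{user_scripts} on PATH: {on_path}")
```

If the directory is missing from PATH, adding `export PATH="$HOME/.local/bin:$PATH"` to your shell profile makes the `llama` entry point resolvable without reinstalling anything.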

skmda37 commented 4 weeks ago

I have the same problem

cglagovichTT commented 4 weeks ago

I've found that it fails silently if anything is wrong with your environment. Try ensuring that you have Python 3.10 installed, then create a python3.10 venv and build from source with `pip install -e .`

monchewharry commented 4 weeks ago

Create an environment with python3.10 and install into it:

conda create -n stack python=3.10
conda activate stack
pip install llama-stack

Refer to https://github.com/meta-llama/llama-stack/blob/main/docs/getting_started.md. The current installation code doesn't check the Python version.
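Since the installer reportedly doesn't gate on the interpreter version, a tiny pre-flight check like the one below (my own sketch; the 3.10 minimum is inferred from this thread, not from llama-stack's packaging metadata) would have surfaced the Python 3.8 problem immediately:

```python
import sys

MIN_PYTHON = (3, 10)  # assumed minimum, based on the fixes reported in this thread


def python_ok(version_info=None, minimum=MIN_PYTHON) -> bool:
    """Return True if the (major, minor) of version_info meets the minimum."""
    if version_info is None:
        version_info = sys.version_info
    return tuple(version_info[:2]) >= minimum


if __name__ == "__main__":
    if not python_ok():
        sys.exit(f"llama-stack needs Python >= {MIN_PYTHON[0]}.{MIN_PYTHON[1]}, "
                 f"found {sys.version_info.major}.{sys.version_info.minor}")
    print("Python version OK")
```

Note that tuple comparison handles the 3.9-vs-3.10 case correctly, whereas comparing version strings lexically would wrongly rank "3.9" above "3.10".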

L-I-M-I-T commented 3 weeks ago

> Create an environment with python3.10 and install into it:
>
> conda create -n stack python=3.10
> conda activate stack
> pip install llama-stack
>
> Refer to https://github.com/meta-llama/llama-stack/blob/main/docs/getting_started.md. The current installation code doesn't check the Python version.

Thank you, I believe this is the correct solution.

paulorenanmelo commented 2 weeks ago

I had the same issue. The only thing that worked for me was installing Python 3.10.11; I was previously on 3.9.

skmda37 commented 2 weeks ago

Using python=3.10 worked for me as well.