Installed llama following the info at https://www.llama.com/llama-downloads/ using the `pip install llama-stack` command. Running the command to retrieve model types, I see the following warning (the same warning appears with `llama model list`):
```
user@Merovingian2:~$ llama model list --show-all
/home/david/.local/lib/python3.12/site-packages/pydantic/_internal/_fields.py:172: UserWarning: Field name "schema" in "JsonResponseFormat" shadows an attribute in parent "BaseModel"
  warnings.warn(
```
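As an aside, this UserWarning appears to come from pydantic itself rather than from llama-stack's own logic: pydantic warns whenever a model declares a field whose name shadows a `BaseModel` attribute. A minimal sketch that reproduces the same warning (the class name `JsonResponseFormat` here just mimics the one in the traceback):

```python
import warnings

from pydantic import BaseModel

# Capture warnings emitted while the class is being defined.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")

    # "schema" shadows an existing BaseModel attribute, which is
    # what triggers pydantic's UserWarning at class-creation time.
    class JsonResponseFormat(BaseModel):
        schema: dict = {}

messages = [str(w.message) for w in caught]
print(messages)
```

So the warning itself should be harmless noise with respect to the download failure below.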
It does display a listing of the models, but when trying to download any of them, I see:
```
user@Merovingian2:~$ llama model download --source meta --model-id Llama-3.2-3B
/home/david/.local/lib/python3.12/site-packages/pydantic/_internal/_fields.py:172: UserWarning: Field name "schema" in "JsonResponseFormat" shadows an attribute in parent "BaseModel"
  warnings.warn(
usage: llama model download [-h] [--source {meta,huggingface}]
llama model download: error: Model Llama-3.2-3B not found
```
Is the naming scheme in the list wrong, or am I missing part of the name? e.g.
`meta-llama/Llama-3.2-3B`