Prerequisites
Please answer the following questions for yourself before submitting an issue.

Expected Behavior
Running Functionary 2.4 Small in Docker should work when HF_PRETRAINED_MODEL_NAME_OR_PATH is provided:

docker run --rm -it -p 8888:8000 -v /home/xstrama/llm/functionary-small-v2.4-GGUF:/models -e MODEL=/models/functionary-small-v2.4.Q4_0.gguf -e CHAT_FORMAT=functionary-v2 -e HF_PRETRAINED_MODEL_NAME_OR_PATH=/models ghcr.io/abetlen/llama-cpp-python:latest

Current Behavior
The command above yields the following error when HF_PRETRAINED_MODEL_NAME_OR_PATH is set:

File "/app/llama_cpp/llama_tokenizer.py", line 96, in from_pretrained
    raise ImportError(
ImportError: The `transformers` library is required to use the `HFTokenizer`.You can install it with `pip install transformers`.

Environment and Context
Docker version: 23.0.4 - Community
OS: Linux XYZ 5.15.0-70-generic 77-Ubuntu SMP Tue Mar 21 14:02:37 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux

Failure Information (for bugs)
Running Functionary 2.4 Small in Docker does not work when HF_PRETRAINED_MODEL_NAME_OR_PATH is provided.

Steps to Reproduce
Run

docker run --rm -it -p 8888:8000 -v /home/xstrama/llm/functionary-small-v2.4-GGUF:/models -e MODEL=/models/functionary-small-v2.4.Q4_0.gguf -e CHAT_FORMAT=functionary-v2 -e HF_PRETRAINED_MODEL_NAME_OR_PATH=/models ghcr.io/abetlen/llama-cpp-python:latest
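For context, the traceback comes from a lazy optional-dependency check: `transformers` is only imported when an HF tokenizer is actually requested, so the failure surfaces at `from_pretrained` time rather than at container startup. A minimal sketch of that kind of guard (the helper name `require_optional_dep` is hypothetical, not the library's actual code):

```python
import importlib.util

def require_optional_dep(name: str, hint: str) -> None:
    """Raise ImportError at use time if an optional dependency is absent.

    Sketch of the guard pattern seen in llama_cpp/llama_tokenizer.py;
    the function itself is a hypothetical illustration.
    """
    # find_spec returns None when the module cannot be located,
    # without actually importing it.
    if importlib.util.find_spec(name) is None:
        raise ImportError(f"The `{name}` library is required. {hint}")

# A stdlib module passes silently; a missing package would raise.
require_optional_dep("json", "It ships with the standard library.")
```

Inside an image that does not bundle `transformers`, a guard like this raises exactly once the HF tokenizer path is exercised, which matches the behavior reported above.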
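Incidentally, the missing space in the quoted message (`` `HFTokenizer`.You can ``) is characteristic of Python's implicit concatenation of adjacent string literals; a sketch of how such a message ends up fused (illustrative, not the library's exact source):

```python
# Adjacent string literals are joined with no separator, which is
# how "...`HFTokenizer`.You can..." loses the space between sentences.
msg = (
    "The `transformers` library is required to use the `HFTokenizer`."
    "You can install it with `pip install transformers`."
)
print(msg)
```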