rhohndorf / Auto-Llama-cpp

Uses Auto-GPT with Llama.cpp
MIT License

Hard-coded file location of json.gbnf #31

Closed chiu0602 closed 1 year ago

chiu0602 commented 1 year ago

Duplicates

Steps to reproduce 🕹

In scripts/llm_utils.py, the grammar variable is assigned with this line of code:

grammar = LlamaGrammar.from_file("/home/ruben/Code/Auto-Llama-cpp/grammars/json.gbnf")

This line of code should not read the file via an absolute path, but I am not sure what should be used instead.
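
A minimal sketch of one possible fix, assuming the grammars directory sits next to the scripts directory at the repository root (the layout and names here are assumptions, not a confirmed change to the project):

from pathlib import Path
from llama_cpp import LlamaGrammar

# Resolve the grammar file relative to this script rather than an absolute path.
# Assumes this file lives in scripts/ and grammars/ is a sibling directory at the repo root.
GRAMMAR_PATH = Path(__file__).resolve().parent.parent / "grammars" / "json.gbnf"

grammar = LlamaGrammar.from_file(str(GRAMMAR_PATH))

With a relative lookup like this, the file would be found both in a local checkout and inside the Docker image, as long as the grammars directory is copied next to the scripts.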

Current behavior 😯

An error is printed and the program terminates. Since I use Docker, I have to modify the Dockerfile, add the lines below, and rebuild the image:

RUN mkdir -p /home/ruben/Code/Auto-Llama-cpp
COPY grammars /home/ruben/Code/Auto-Llama-cpp/grammars

Expected behavior 🤔

The application should build with

docker build -t auto-llama .

And run with

docker run -it --env-file "./.env" -v "<MODEL_PATH>:/models" auto-llama

Your prompt 📝

# Paste your prompt here