Closed — Travistyse closed this issue 7 months ago
I believe that is due to llama.cpp changing their file format: https://github.com/ggerganov/llama.cpp/pull/1508
Edit requirements.txt; find the line with llama-cpp-python and change it to this:
llama-cpp-python==0.1.53
In requirements.txt?
Yes. Afterwards run python -m pip install -r requirements.txt to update.
Alternatively, you can manually update llama-cpp-python by executing python -m pip install --upgrade llama-cpp-python
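Taken together, the edit-and-reinstall flow might look like the sketch below. The sample requirements.txt contents are hypothetical, for illustration only — in privateGPT you would edit the repo's own file (GNU sed shown; on macOS use `sed -i ''`):

```shell
# Hypothetical requirements file, stands in for privateGPT's real one
printf 'langchain\nllama-cpp-python==0.1.50\n' > requirements.txt

# Pin the last release that still reads the old GGML format
sed -i 's/^llama-cpp-python.*/llama-cpp-python==0.1.53/' requirements.txt
```

After the pin, run python -m pip install -r requirements.txt so the downgraded version is actually installed.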
```
E:\Projects\Ai LLM\privateGPT>python privateGPT.py
Using embedded DuckDB with persistence: data will be stored in: db
llama.cpp: loading model from models/Manticore-13B.ggmlv3.q4_1.bin
error loading model: unknown (magic, version) combination: 67676a74, 00000003; is this really a GGML file?
llama_init_from_file: failed to load model
Traceback (most recent call last):
  File "E:\Projects\Ai LLM\privateGPT\privateGPT.py", line 75, in <module>
    main()
  File "E:\Projects\Ai LLM\privateGPT\privateGPT.py", line 33, in main
    llm = LlamaCpp(model_path=model_path, n_ctx=model_n_ctx, callbacks=callbacks, verbose=False)
  File "pydantic\main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for LlamaCpp
__root__
  Could not load Llama model from path: models/Manticore-13B.ggmlv3.q4_1.bin. Received error (type=value_error)
```
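The `67676a74, 00000003` in that error is the file's magic and version: 0x67676a74 is the "ggjt" magic, and version 3 is the format introduced by the llama.cpp PR linked above, which older llama-cpp-python builds can't read. If you want to check what a model file actually is before loading it, a minimal sketch (the `sniff_model` helper and magic table are my own, not part of llama.cpp's API):

```python
import struct

# Magics llama.cpp has used, as the little-endian uint32 it reads from the file
MAGICS = {
    0x67676D6C: "ggml (unversioned)",
    0x67676D66: "ggmf (versioned)",
    0x67676A74: "ggjt (versioned, mmap-able)",
}

def sniff_model(path):
    """Return (format_name, version) from a model file's first 8 bytes.

    Note: unversioned ggml files have no version field, so the second
    value is only meaningful for ggmf/ggjt files."""
    with open(path, "rb") as f:
        magic, version = struct.unpack("<II", f.read(8))
    return MAGICS.get(magic, f"unknown (0x{magic:08x})"), version
```

A ggjt file reporting version 3, as in the traceback, needs llama-cpp-python >= 0.1.53; older versions only understand version 2 and below.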
Side note, this may be relevant:
- I created a symlink to my Obsidian vault
- I installed some C++ libraries from MS (search keywords: help visual c downloads and 2977003)
- I pip-installed pillow to reinstall it
- In VS Code I ran `import nltk`, then `nltk.download('punkt')` and `nltk.download('averaged_perceptron_tagger')`