noahsnowc opened this issue 1 year ago
Running python run_localGPT.py on Google Colab raises a syntax error.
I, too, ran python run_localGPT.py --device_type cpu in an Anaconda shell on a Windows 10 machine. The error I got is:
2023-09-04 20:44:58,753 - INFO - run_localGPT.py:50 - Using Llamacpp for GGML quantized models
llama.cpp: loading model from C:\Users\Owner.cache\huggingface\hub\models--TheBloke--Llama-2-7B-Chat-GGML\snapshots\00109c56c85ca9015795ca06c272cbc65c7f1dbf\llama-2-7b-chat.ggmlv3.q4_0.bin
error loading model: unknown (magic, version) combination: 67676a74, 00000003; is this really a GGML file?
llama_init_from_file: failed to load model
Traceback (most recent call last):
File "D:\Projects\libraries\GPTs\localGPT\localGPT-main\run_localGPT.py", line 246, in
The file exists: From my Windows 10 File Manager: C:\Users\Owner.cache\huggingface\hub\models--TheBloke--Llama-2-7B-Chat-GGML\snapshots\00109c56c85ca9015795ca06c272cbc65c7f1dbf\llama-2-7b-chat.ggmlv3.q4_0.bin
Please advise. Thank you.
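For reference, the "(magic, version)" pair in that log line is just the first 8 bytes of the model file, read as two little-endian uint32 values. A minimal sketch of decoding them (the byte string below is hypothetical, chosen to reproduce the values the log reports, not read from my machine):

```python
import struct

def read_ggml_header(data: bytes):
    """Parse the first 8 bytes of a GGML-family model file:
    a little-endian uint32 magic followed by a uint32 version."""
    magic, version = struct.unpack("<II", data[:8])
    return magic, version

# Hypothetical first 8 bytes matching the error log:
# b"tjgg" decodes to magic 0x67676a74, followed by version 3.
magic, version = read_ggml_header(b"tjgg\x03\x00\x00\x00")
print(hex(magic), version)  # 0x67676a74 3
```

0x67676a74 spells "ggjt" in ASCII, so the header itself looks intact; the loader just doesn't recognize that (magic, version) combination, which points at a version mismatch between the file format and the installed llama.cpp bindings rather than a corrupt download.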
Actions taken:
Ran the command python run_localGPT.py --device_type cpu
Ingest.py --device_type cpu was run before this with no issues.
Expected result:
For the "> Enter a query:" prompt to appear in the terminal
Actual Result:
OSError: Unable to load weights from pytorch checkpoint file for 'C:\Users\/.cache\huggingface\hub\models--TheBloke--vicuna-7B-1.1-HF\snapshots\c3efe0b1dd78716c6bfc288a997026354bce441a\pytorch_model-00001-of-00002.bin' at 'C:\Users\/.cache\huggingface\hub\models--TheBloke--vicuna-7B-1.1-HF\snapshots\c3efe0b1dd78716c6bfc288a997026354bce441a\pytorch_model-00001-of-00002.bin'. If you tried to load a PyTorch model from a TF 2.0 checkpoint, please set from_tf=True.
Additional info:
Adding from_tf=True to the call, i.e. model = LlamaForCausalLM.from_pretrained(model_path, from_tf=True),
gives me the error: OSError: TheBloke/vicuna-7B-1.1-HF does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.
I have re-downloaded the model multiple times now.
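Since the OSError points at an unreadable checkpoint, one cheap sanity check before re-downloading yet again: recent PyTorch .bin checkpoints are zip archives, so a truncated or corrupt download usually fails a plain zip test. A sketch (the helper name is mine, not part of transformers):

```python
import io
import zipfile

def looks_like_torch_zip(data: bytes) -> bool:
    # Recent PyTorch checkpoints (.bin) are zip archives; a truncated
    # or corrupt download typically fails this cheap structural check.
    return zipfile.is_zipfile(io.BytesIO(data))

# Random bytes are rejected...
print(looks_like_torch_zip(b"not a checkpoint"))  # False

# ...while anything with a valid zip structure passes.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("dummy", b"x")
print(looks_like_torch_zip(buf.getvalue()))  # True
```

Pointing this at the actual pytorch_model-00001-of-00002.bin on disk would tell you whether the file is structurally broken (in which case re-downloading or clearing the huggingface cache is worth another try) or intact (in which case the problem is more likely a library version mismatch, and from_tf=True is a red herring since there is no TF checkpoint in that repo).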