Hi, I'm getting a 403 when running the simplest of samples, e.g.:
---- 8< ----
from gpt4all import GPT4All
model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf") # downloads / loads a 4.66GB LLM
with model.chat_session():
    print(model.generate("How can I run LLMs efficiently on my laptop?", max_tokens=1024))
---- 8< ----
$ python3 p.py
Traceback (most recent call last):
File "/foo/p.py", line 2, in <module>
model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf") # downloads / loads a 4.66GB LLM
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/foo/lib/python3.12/site-packages/gpt4all/gpt4all.py", line 235, in __init__
self.config: ConfigType = self.retrieve_model(model_name, model_path=model_path, allow_download=allow_download, verbose=verbose)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/foo/lib/python3.12/site-packages/gpt4all/gpt4all.py", line 310, in retrieve_model
available_models = cls.list_models()
^^^^^^^^^^^^^^^^^
File "/foo/lib/python3.12/site-packages/gpt4all/gpt4all.py", line 280, in list_models
raise ValueError(f'Request failed: HTTP {resp.status_code} {resp.reason}')
ValueError: Request failed: HTTP 403 Forbidden
$
Meanwhile, `wget -d https://gpt4all.io/models/models3.json` works fine.
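Since wget reaches the endpoint but the library gets a 403, the server may be filtering on request headers (User-Agent is a common culprit). A minimal sketch to check that theory, fetching the same URL with different User-Agent strings; the exact headers gpt4all sends are an assumption here, not confirmed:

```python
# Diagnostic sketch: request models3.json with two User-Agent strings
# to see whether the 403 is header-dependent (wget succeeds, so a
# plain Python client being blocked would point at UA filtering).
import urllib.request
import urllib.error

url = "https://gpt4all.io/models/models3.json"

for ua in ("Python-urllib/3.12", "Wget/1.21"):
    req = urllib.request.Request(url, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            print(f"{ua}: HTTP {resp.status}")
    except urllib.error.HTTPError as e:
        print(f"{ua}: HTTP {e.code} {e.reason}")
    except OSError as e:
        print(f"{ua}: request failed ({e})")  # e.g. no network
```

If the Wget UA succeeds where the Python one is refused, that would explain the discrepancy and suggest a workaround (download the model manually and construct `GPT4All(..., allow_download=False)`, a parameter visible in the traceback above).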