Closed b421 closed 2 months ago
I get this error on `llm models list`, but also after running the example from the README after a fresh install:

```
llm -m orca-mini-3b-gguf2-q4_0 '3 names for a pet cow'
```
Confirming the same for the original issue and comment.
@b421 Not working in my case. Is there any update on this? I am also facing the same issue.
Thanks for the tip! Adding

```python
except (FileNotFoundError, ValueError):
```

worked for me.
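For context, the pattern being described can be sketched as a self-contained example. Note the names here are illustrative: `retrieve_model` below is a stub standing in for gpt4all's `GPT4All.retrieve_model`, and `is_installed` mirrors the plugin helper discussed in this thread:

```python
def retrieve_model(model_filename: str) -> str:
    # Stub: newer gpt4all raises FileNotFoundError when the model file is
    # missing and downloads are disallowed; older versions raised ValueError.
    raise FileNotFoundError(f"Model file does not exist: {model_filename!r}")

def is_installed(model_filename: str) -> bool:
    # Treat either exception as "model not installed" so the check keeps
    # working across gpt4all versions.
    try:
        retrieve_model(model_filename)
    except (FileNotFoundError, ValueError):
        return False
    return True

print(is_installed("orca-mini-3b-gguf2-q4_0.gguf"))  # → False
```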
In the newest version of gpt4all the error still exists. I worked around it by patching `retrieve_model` (gpt4all.py, line 399):

```python
model_dest = model_path / model_filename
if model_dest.exists():
    config["path"] = str(model_dest)
    if verbose:
        print(f"Found model file at {str(model_dest)!r}", file=sys.stderr)
elif allow_download:
    # If model file does not exist, download
    filesize = config.get("filesize")
    config["path"] = str(
        cls.download_model(
            model_filename,
            model_path,
            verbose=verbose,
            url=config.get("url"),
            expected_size=None if filesize is None else int(filesize),
            expected_md5=config.get("md5sum"),
        )
    )
else:
    # raise FileNotFoundError(f"Model file does not exist: {model_dest!r}")
    # Removed the raise above; fall back to downloading, same as the branch above
    filesize = config.get("filesize")
    config["path"] = str(
        cls.download_model(
            model_filename,
            model_path,
            verbose=verbose,
            url=config.get("url"),
            expected_size=None if filesize is None else int(filesize),
            expected_md5=config.get("md5sum"),
        )
    )
```
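If editing the installed gpt4all.py is undesirable, the same behavior can be approximated with a runtime monkey-patch. This is only a sketch under stated assumptions: the `GPT4All` class below is a minimal stand-in so the example is runnable without the library (in real code you would `from gpt4all import GPT4All` and patch that class), and its behavior is simplified:

```python
# Minimal stand-in for gpt4all.GPT4All, so this sketch runs without the
# library installed; all names and return values are illustrative.
class GPT4All:
    @classmethod
    def retrieve_model(cls, model_filename, allow_download=True):
        if not allow_download:
            # Mimics the new behavior that breaks llm-gpt4all.
            raise FileNotFoundError(f"Model file does not exist: {model_filename!r}")
        return {"path": f"/downloads/{model_filename}"}  # pretend download

# Keep a reference to the original function, then wrap it so the
# missing-file case downloads instead of raising, like the edit above.
_original_retrieve = GPT4All.retrieve_model.__func__

def _patched_retrieve(cls, model_filename, allow_download=True):
    # Ignore allow_download=False and always take the download branch.
    return _original_retrieve(cls, model_filename, allow_download=True)

GPT4All.retrieve_model = classmethod(_patched_retrieve)

print(GPT4All.retrieve_model("orca-mini.gguf", allow_download=False))
# → {'path': '/downloads/orca-mini.gguf'}
```

The patch has to be applied before the plugin calls `retrieve_model`, e.g. at import time in a small wrapper module.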
Running `llm models list` after installing the llm-gpt4all plugin in a clean pipx environment returns this:

It looks like this is due to this commit in gpt4all from last week: https://github.com/nomic-ai/gpt4all/commit/255568fb9a5201c0e9d9d2679772392784448f89

where `retrieve_model()` now raises `FileNotFoundError` if the `$HOME/.cache/gpt4all` folder is empty.

Hacky fix in `is_installed()`: