simonw / llm-gpt4all

Plugin for LLM adding support for the GPT4All collection of models
Apache License 2.0

llm models command failing #24

Closed: jdorri closed this issue 2 months ago

jdorri commented 3 months ago

After installing the llm-gpt4all plugin and running llm models, I get the following error on my Mac. Any ideas?

Traceback (most recent call last):
  File "/opt/homebrew/bin/llm", line 8, in <module>
    sys.exit(cli())
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/llm/cli.py", line 799, in models_list
    for model_with_aliases in get_models_with_aliases():
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/llm/__init__.py", line 80, in get_models_with_aliases
    pm.hook.register_models(register=register)
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/pluggy/_hooks.py", line 501, in __call__
    return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult)
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/pluggy/_manager.py", line 119, in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/pluggy/_callers.py", line 138, in _multicall
    raise exception.with_traceback(exception.__traceback__)
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/pluggy/_callers.py", line 102, in _multicall
    res = hook_impl.function(*args)
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/llm_gpt4all.py", line 57, in register_models
    models.sort(
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/llm_gpt4all.py", line 59, in <lambda>
    not model.is_installed(),
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/llm_gpt4all.py", line 179, in is_installed
    GPT4All.retrieve_model(
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/llm_gpt4all.py", line 38, in retrieve_model
    return _GPT4All.retrieve_model(model_name, model_path, allow_download, verbose)
  File "/opt/homebrew/Cellar/llm/0.13.1/libexec/lib/python3.12/site-packages/gpt4all/gpt4all.py", line 273, in retrieve_model
    raise FileNotFoundError(f"Model file does not exist: {model_dest!r}")
FileNotFoundError: Model file does not exist: PosixPath('/Users/JDMS/.cache/gpt4all/mistral-7b-openorca.gguf2.Q4_0.gguf')
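
The bottom frames show the shape of the bug: register_models sorts the model list by not model.is_installed(), so merely listing models probes every model, and a single FileNotFoundError escaping that probe aborts the whole llm models command. A minimal sketch (toy classes, not the plugin's real code) of that failure mode and the obvious fix:

```python
class Model:
    def __init__(self, name, installed):
        self.name = name
        self.installed = installed

    def is_installed_raising(self):
        # Mirrors the failing behaviour: probing a missing file raises.
        if not self.installed:
            raise FileNotFoundError(f"Model file does not exist: {self.name}")
        return True

    def is_installed_fixed(self):
        # Mirrors the fix: convert the probe failure into False.
        try:
            return self.is_installed_raising()
        except FileNotFoundError:
            return False

models = [Model("mistral-7b-openorca", False), Model("orca-mini-3b", True)]

# One model without a local file is enough to sink the entire sort.
try:
    models.sort(key=lambda m: not m.is_installed_raising())
except FileNotFoundError as e:
    print("listing aborted:", e)

# With the exception swallowed, the sort succeeds and installed models come first.
models.sort(key=lambda m: not m.is_installed_fixed())
print([m.name for m in models])
```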

gjnave commented 3 months ago

The guy earlier in the thread solved this: go to your llm_gpt4all.py file (mentioned in the error) and change this:

 def is_installed(self):
     try:
         GPT4All.retrieve_model(
             self._details["filename"], allow_download=False, verbose=False
         )
         return True
-    except ValueError:
+    except (ValueError, FileNotFoundError):
         return False

(remove the - and add the + )
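
Filled out as a self-contained sketch, the fix looks like this. Here retrieve_model is a stand-in for the real GPT4All API: with allow_download=False, newer gpt4all releases raise FileNotFoundError for a missing model file, while older ones raised ValueError, which is presumably why the plugin only caught the latter.

```python
from pathlib import Path

def retrieve_model(filename: str, model_dir: Path) -> Path:
    # Stand-in for GPT4All.retrieve_model(..., allow_download=False):
    # raises when the model file is absent instead of downloading it.
    dest = Path(model_dir) / filename
    if not dest.exists():
        raise FileNotFoundError(f"Model file does not exist: {dest!r}")
    return dest

def is_installed(filename: str, model_dir: Path) -> bool:
    # The shape of the fix: catch both the old and the new exception type
    # and report "not installed" rather than crashing the model listing.
    try:
        retrieve_model(filename, model_dir)
        return True
    except (ValueError, FileNotFoundError):
        return False
```

Probing a directory before and after creating the file flips the result from False to True, with no exception escaping in either case.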

offbyone commented 3 months ago

Interestingly, I actually see the same error when I run the tests at the tip of main:

$ pip install .[test]
$ pytest
TEST OUTPUT HERE...
...
        if not model_path.exists():
            raise FileNotFoundError(f"Model directory does not exist: {model_path!r}")

        model_dest = model_path / model_filename
        if model_dest.exists():
            config["path"] = str(model_dest)
            if verbose:
                print(f"Found model file at {str(model_dest)!r}", file=sys.stderr)
        elif allow_download:
            # If model file does not exist, download
            filesize = config.get("filesize")
            config["path"] = str(cls.download_model(
                model_filename, model_path, verbose=verbose, url=config.get("url"),
                expected_size=None if filesize is None else int(filesize), expected_md5=config.get("md5sum"),
            ))
        else:
>           raise FileNotFoundError(f"Model file does not exist: {model_dest!r}")
E           FileNotFoundError: Model file does not exist: PosixPath('/Users/offby1/.cache/gpt4all/ggml-model-gpt4all-falcon-q4_0.bin')

.venv/lib/python3.10/site-packages/gpt4all/gpt4all.py:273: FileNotFoundError
============================================ warnings summary ============================================
.venv/lib/python3.10/site-packages/pydantic/_internal/_config.py:272
.venv/lib/python3.10/site-packages/pydantic/_internal/_config.py:272
  /Users/offby1/projects/llm-gpt4all/.venv/lib/python3.10/site-packages/pydantic/_internal/_config.py:272: PydanticDeprecatedSince20: Support for class-based `config` is deprecated, use ConfigDict instead. Deprecated in Pydantic V2.0 to be removed in V3.0. See Pydantic V2 Migration Guide at https://errors.pydantic.dev/2.6/migration/
    warnings.warn(DEPRECATION_MESSAGE, DeprecationWarning)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
======================================== short test summary info =========================================
FAILED tests/test_llm_gpt4all.py::test_llm_models - AssertionError:
FAILED tests/test_llm_gpt4all.py::test_conversation_prompt_blocks[ggml-gpt4all-j-v1-expected_blocks0] - FileNotFoundError: Model file does not exist: PosixPath('/Users/offby1/.cache/gpt4all/ggml-model-gpt4...
FAILED tests/test_llm_gpt4all.py::test_conversation_prompt_blocks[orca-mini-7b-expected_blocks1] - FileNotFoundError: Model file does not exist: PosixPath('/Users/offby1/.cache/gpt4all/ggml-model-gpt4...
FAILED tests/test_llm_gpt4all.py::test_conversation_prompt_blocks[ggml-mpt-7b-chat-expected_blocks2] - FileNotFoundError: Model file does not exist: PosixPath('/Users/offby1/.cache/gpt4all/ggml-model-gpt4...
FAILED tests/test_llm_gpt4all.py::test_conversation_prompt_blocks[ggml-model-gpt4all-falcon-q4_0-expected_blocks3] - FileNotFoundError: Model file does not exist: PosixPath('/Users/offby1/.cache/gpt4all/ggml-model-gpt4...
================================ 5 failed, 1 passed, 2 warnings in 0.16s =================================
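
These failures happen because is_installed() probes the real ~/.cache/gpt4all directory, so test results depend on which models a developer happens to have downloaded. A hedged sketch (not the repo's actual tests) of making that deterministic by injecting a fake backend; monkeypatching GPT4All.retrieve_model with pytest's monkeypatch fixture would achieve the same thing:

```python
class FakeGPT4All:
    """Stands in for gpt4all's GPT4All class in tests; never touches disk."""
    @staticmethod
    def retrieve_model(model_name, model_path=None, allow_download=True, verbose=False):
        # Simulate an empty model cache, matching the CI/laptop-without-models case.
        raise FileNotFoundError(f"Model file does not exist: {model_name}")

def is_installed(backend, filename):
    # Same shape as the plugin's method, with the backend made injectable.
    try:
        backend.retrieve_model(filename, allow_download=False, verbose=False)
        return True
    except (ValueError, FileNotFoundError):
        return False

# The probe now reports "not installed" instead of blowing up the test run.
print(is_installed(FakeGPT4All, "ggml-model-gpt4all-falcon-q4_0.bin"))
```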

jdorri commented 3 months ago

Thanks for the comments! I can change llm_gpt4all.py when llm is installed as a pure Python package via pip or from source, but what if llm was installed with brew?
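
For a brew install the plugin is still an ordinary .py file on disk, just inside the formula's private virtualenv (the traceback shows it under /opt/homebrew/Cellar/llm/0.13.1/libexec/...), though any hand edit there gets wiped by the next brew upgrade. A small sketch for locating the copy a given interpreter actually loads; run it with the libexec Python that brew's llm uses:

```python
# Print the path of the llm_gpt4all module this interpreter would import,
# or a notice if it is not importable from this environment.
import importlib.util

spec = importlib.util.find_spec("llm_gpt4all")
print(spec.origin if spec else "llm_gpt4all is not importable from this Python")
```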

jwhowa commented 3 months ago

> Thanks for the comments! I can change llm_gpt4all.py when llm is installed as a pure Python package via pip or from source, but what if llm was installed with brew?

Same question... installed via brew, not a pip install from source, and seeing the same exact error.

boldandbusted commented 3 months ago

A better-formatted spot-fix is in #23.