simonw / llm-gpt4all

Plugin for LLM adding support for the GPT4All collection of models
Apache License 2.0

libllama.so not found #19

Open dlq opened 7 months ago

dlq commented 7 months ago
> uname -a
Linux [...] 3.10.0-1160.95.1.el7.x86_64 #1 SMP Mon Jul 24 13:59:37 UTC 2023 x86_64 GNU/Linux
> llm install llm-gpt4all
[...]
Successfully installed anyio-4.1.0 gpt4all-0.1.7 h11-0.14.0+computecanada httpcore-1.0.2 httpx-0.25.2 iniconfig-2.0.0+computecanada llm-gpt4all-0.1.1 packaging-23.2+computecanada pytest-7.3.1 sniffio-1.3.0+computecanada

The +computecanada wheels are locally compiled wheels.

> llm -m orca-mini-3b-gguf2-q4_0 'What is the capital of France?'
[...]
OSError: [...]/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.so: cannot open shared object file: No such file or directory

llm now always produces this error, whether or not I use a GPT4All model.

All this works fine on my Mac but not on this Linux machine :-(
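
In case it helps with debugging, here is the quick check I'd run on the Linux box: list which native libraries the gpt4all wheel actually installed, and which platform Python thinks it is on. This is only a diagnostic sketch; the llmodel_DO_NOT_MODIFY/build path is just taken from the error above.

import importlib.util
import pathlib
import platform

# Locate the installed gpt4all package without importing it
# (importing it is what triggers the failing dlopen in the first place).
spec = importlib.util.find_spec("gpt4all")
build_dir = pathlib.Path(spec.origin).parent / "llmodel_DO_NOT_MODIFY" / "build"

print("platform :", platform.system(), platform.machine())
print("build dir:", build_dir)
for lib in sorted(build_dir.iterdir()):
    print("   ", lib.name)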

dlq commented 7 months ago

This is probably related to #14.

djswagerman commented 7 months ago

I have the same issue running llm-gpt4all in a Linux Docker container. In /usr/local/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/ there are only .dylib versions of the required libs:

OSError: /usr/local/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.so: cannot open shared object file: No such file or directory

ls /usr/local/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/

libllama.dylib libllmodel.dylib

.dylib is for macOS, .so is for Linux.

So could the issue be that llm-gpt4all installs the macOS-formatted libs in a Linux environment?
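
A quick way to confirm that (just a sketch, with the same path caveats as above): compare the shared-library suffix this platform needs against the suffixes actually shipped in the wheel.

import importlib.util
import pathlib
import sys

# Rough mapping of platform -> shared-library suffix:
# .so on Linux, .dylib on macOS, .dll on Windows.
EXPECTED_SUFFIX = {"linux": ".so", "darwin": ".dylib", "win32": ".dll"}

spec = importlib.util.find_spec("gpt4all")  # does not import/execute the package
build_dir = pathlib.Path(spec.origin).parent / "llmodel_DO_NOT_MODIFY" / "build"

expected = EXPECTED_SUFFIX.get(sys.platform, ".so")
shipped = sorted({p.suffix for p in build_dir.iterdir() if p.is_file()})
print(f"expected suffix for {sys.platform!r}: {expected}")
print(f"suffixes shipped in the wheel: {shipped}")
if expected not in shipped:
    print("mismatch: no native library for this platform in the installed wheel")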

dlq commented 7 months ago

I just tried this in a fresh Ubuntu Docker container too, and I think you're right.

Manamama commented 5 months ago

Not sure if it is related, but I get this in Termux:

OSError: dlopen failed: library "/data/data/com.termux/files/usr/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllmodel.so" not found
~/downloads $ 

See more pastes and details here.
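
For what it's worth, on Termux it may simply be that no native library exists for that platform at all, since Android/aarch64 builds are a separate story. A small sketch to print the details that matter for native wheels:

import platform
import sysconfig

# What the native-wheel machinery sees on this device (diagnostic sketch only).
print("system      :", platform.system())         # Termux reports as Linux
print("machine     :", platform.machine())        # e.g. aarch64
print("libc        :", platform.libc_ver())       # Termux/Android uses Bionic, not glibc
print("platform tag:", sysconfig.get_platform())  # tag used when selecting wheels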

varenc commented 4 months ago

Just ran into a similar issue.

After running llm install llm-gpt4all, my llm installation is broken. Every llm command now fails with this error:

Traceback (most recent call last):
  File "/Users/chris/.pyenv/versions/3.12.2/bin/llm", line 5, in <module>
    from llm.cli import cli
  File "/Users/chris/.pyenv/versions/3.12.2/lib/python3.12/site-packages/llm/__init__.py", line 18, in <module>
    from .plugins import pm
  File "/Users/chris/.pyenv/versions/3.12.2/lib/python3.12/site-packages/llm/plugins.py", line 17, in <module>
    pm.load_setuptools_entrypoints("llm")
  File "/Users/chris/.pyenv/versions/3.12.2/lib/python3.12/site-packages/pluggy/_manager.py", line 414, in load_setuptools_entrypoints
    plugin = ep.load()
             ^^^^^^^^^
  File "/Users/chris/.pyenv/versions/3.12.2/lib/python3.12/importlib/metadata/__init__.py", line 205, in load
    module = import_module(match.group('module'))
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/chris/.pyenv/versions/3.12.2/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/chris/.pyenv/versions/3.12.2/lib/python3.12/site-packages/llm_gpt4all.py", line 1, in <module>
    from gpt4all import GPT4All as _GPT4All
  File "/Users/chris/.pyenv/versions/3.12.2/lib/python3.12/site-packages/gpt4all/__init__.py", line 1, in <module>
    from .gpt4all import Embed4All, GPT4All  # noqa
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/chris/.pyenv/versions/3.12.2/lib/python3.12/site-packages/gpt4all/gpt4all.py", line 13, in <module>
    from . import pyllmodel
  File "/Users/chris/.pyenv/versions/3.12.2/lib/python3.12/site-packages/gpt4all/pyllmodel.py", line 48, in <module>
    llmodel = load_llmodel_library()
              ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/chris/.pyenv/versions/3.12.2/lib/python3.12/site-packages/gpt4all/pyllmodel.py", line 43, in load_llmodel_library
    llmodel_lib = ctypes.CDLL(llmodel_dir)
                  ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/chris/.pyenv/versions/3.12.2/lib/python3.12/ctypes/__init__.py", line 379, in __init__
    self._handle = _dlopen(self._name, mode)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
OSError: dlopen(/Users/chris/.pyenv/versions/3.12.2/lib/python3.12/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllmodel.dylib, 6): no suitable image found.  Did find:
    /Users/chris/.pyenv/versions/3.12.2/lib/python3.12/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllmodel.dylib: cannot load 'libllmodel.dylib' (load command 0x80000034 is unknown)
    /Users/chris/.pyenv/versions/3.12.2/lib/python3.12/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllmodel.dylib: cannot load 'libllmodel.dylib' (load command 0x80000034 is unknown)

Running on an older Intel Mac mini still on macOS 10.14 Mojave. I suspect the problem is that Mojave can't handle binaries that combine arm64 and x86_64 architectures, which libllmodel.dylib does. I 'fixed' things for now by just doing pip uninstall llm-gpt4all.
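
If anyone else hits this on older macOS, one way to check whether the bundled dylib really is a universal (fat) binary is to look at its Mach-O magic bytes; on a Mac, lipo -info <path> or file <path> tells you the same thing. A minimal sketch, using the path from the traceback above:

import struct

# Path from the traceback above; adjust for your own environment.
path = "/Users/chris/.pyenv/versions/3.12.2/lib/python3.12/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllmodel.dylib"

# The fat (universal) header is always stored big-endian.
with open(path, "rb") as f:
    magic = struct.unpack(">I", f.read(4))[0]

if magic in (0xCAFEBABE, 0xCAFEBABF):      # FAT_MAGIC / FAT_MAGIC_64
    print("universal (fat) binary: multiple architectures in one file")
elif magic in (0xFEEDFACF, 0xCFFAEDFE):    # MH_MAGIC_64, either byte order
    print("thin 64-bit Mach-O binary (single architecture)")
else:
    print(f"unrecognized magic: {magic:#010x}")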