simonw / llm-gpt4all

Plugin for LLM adding support for the GPT4All collection of models

`GLIBC_2.32` Error After Installing `llm-gpt4all` Plugin in GitHub Codespaces - Seeking Help to Get It Working #41

Open hugobowne opened 2 months ago

hugobowne commented 2 months ago

hi!

I've run into an issue with the llm-gpt4all plugin in GitHub Codespaces. After installing the plugin, I'm unable to run any llm commands, even unrelated ones. Here's a summary of the problem; my goal is to get llm-gpt4all working in a Codespace.

Steps to Reproduce:

  1. Set up a GitHub Codespace with Python 3.10.
  2. Install the llm utility and confirm it works correctly (llm models list runs as expected).
  3. Install the llm-gpt4all plugin using llm install llm-gpt4all.
  4. After the installation, run any llm command (e.g., llm models list); the full command sequence is shown below.
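
For convenience, here is roughly the full command sequence (assuming llm was installed with pip into the Codespace's active Python 3.10 environment; adjust if you use pipx or similar):

```bash
# Install the llm CLI and confirm it works
pip install llm
llm models list          # works at this point

# Install the plugin; after this, every llm command fails
llm install llm-gpt4all
llm models list          # raises the GLIBC_2.32 OSError shown below
```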

Expected Behavior:

The llm commands should continue working, with the llm-gpt4all plugin providing additional functionality.

Observed Behavior:

After the installation, all llm commands fail. Here is the error I receive:

```
llm models list
Traceback (most recent call last):
  File "/home/codespace/.python/current/bin/llm", line 5, in <module>
    from llm.cli import cli
  File "/usr/local/python/3.10.13/lib/python3.10/site-packages/llm/__init__.py", line 18, in <module>
    from .plugins import pm
  File "/usr/local/python/3.10.13/lib/python3.10/site-packages/llm/plugins.py", line 17, in <module>
    pm.load_setuptools_entrypoints("llm")
  File "/usr/local/python/3.10.13/lib/python3.10/site-packages/pluggy/_manager.py", line 421, in load_setuptools_entrypoints
    plugin = ep.load()
  File "/usr/local/python/3.10.13/lib/python3.10/importlib/metadata/__init__.py", line 171, in load
    module = import_module(match.group('module'))
  File "/usr/local/python/3.10.13/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "/usr/local/python/3.10.13/lib/python3.10/site-packages/llm_gpt4all.py", line 1, in <module>
    from gpt4all import GPT4All as _GPT4All
  File "/usr/local/python/3.10.13/lib/python3.10/site-packages/gpt4all/__init__.py", line 1, in <module>
    from .gpt4all import CancellationError as CancellationError, Embed4All as Embed4All, GPT4All as GPT4All
  File "/usr/local/python/3.10.13/lib/python3.10/site-packages/gpt4all/gpt4all.py", line 23, in <module>
    from ._pyllmodel import (CancellationError as CancellationError, EmbCancelCallbackType, EmbedResult as EmbedResult,
  File "/usr/local/python/3.10.13/lib/python3.10/site-packages/gpt4all/_pyllmodel.py", line 91, in <module>
    llmodel = load_llmodel_library()
  File "/usr/local/python/3.10.13/lib/python3.10/site-packages/gpt4all/_pyllmodel.py", line 81, in load_llmodel_library
    lib = ctypes.CDLL(str(MODEL_LIB_PATH / f"libllmodel.{ext}"))
  File "/usr/local/python/3.10.13/lib/python3.10/ctypes/__init__.py", line 374, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /usr/local/python/3.10.13/lib/python3.10/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllmodel.so)
```
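
Note that llm itself isn't really involved: per the traceback, the failure happens as soon as the gpt4all package loads its bundled native library, so it should be reproducible directly (assuming gpt4all is installed in the same environment):

```bash
# Should fail with the same GLIBC_2.32 OSError, without going through llm at all
python3 -c "import gpt4all"
```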

Environment Details:

- GitHub Codespaces, set up with Python 3.10 (Python 3.10.13 per the paths in the traceback)
- System glibc older than 2.32 (per the OSError above)

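In case it's useful, commands along these lines should surface the relevant details inside the Codespace (the pip show output format may vary slightly):

```bash
grep PRETTY_NAME /etc/os-release                                # base image distribution
python3 --version                                               # interpreter used by llm
pip show llm llm-gpt4all gpt4all | grep -E '^(Name|Version)'    # installed package versions
```
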
Goal:

I would like to get the llm-gpt4all plugin working inside a GitHub Codespace. I understand that the GLIBC_2.32 error is most likely a mismatch between the glibc version in the Codespace image and the version the gpt4all native library (libllmodel.so) was built against, but I'd appreciate advice on how to make this work within a Codespace.
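
As a sanity check, something like the following should show the gap between what the Codespace image provides and what the bundled library expects (the library path is copied from the traceback above; objdump is part of binutils and may need to be installed first):

```bash
# glibc version available in the Codespace image
ldd --version | head -n 1

# GLIBC symbol versions required by the gpt4all native library
objdump -T /usr/local/python/3.10.13/lib/python3.10/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllmodel.so \
  | grep -o 'GLIBC_[0-9.]*' | sort -Vu
```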

Request for Help:

Any advice on how to get llm-gpt4all working inside a GitHub Codespace, or on how to work around the glibc mismatch, would be much appreciated.

Thank you!