nextcloud / llm

A Nextcloud app that packages a large language model (Llama2 / GPT4All Falcon)

Running LLM locally fails with Validation Error 1 GPT4All #39

Closed john-2000 closed 11 months ago

john-2000 commented 12 months ago

Hi,

I am using the latest app versions on Nextcloud 27.1.3: Assistant 1.0.1, OpenAI and LocalAI integration 1.1.1, and Local large language model 1.2.0.

Irrespective of which model I select (Llama, Falcon, or Leo-Hessian), my Nextcloud Assistant requests always fail with:

[llm] Warning: /var/www/nextcloud/apps/llm/python/lib/python3.10/site-packages/langchain/__init__.py:40: UserWarning: Importing BasePromptTemplate from langchain root module is no longer supported.
  warnings.warn(
/var/www/nextcloud/apps/llm/python/lib/python3.10/site-packages/langchain/__init__.py:40: UserWarning: Importing PromptTemplate from langchain root module is no longer supported.
  warnings.warn(
Traceback (most recent call last):
  File "/var/www/nextcloud/apps/llm/src-py/index.py", line 23, in <module>
    llm = GPT4All(model=dir_path + "/models/"+args.model, n_threads=int(threads))
  File "/var/www/nextcloud/apps/llm/python/lib/python3.10/site-packages/langchain/load/serializable.py", line 97, in __init__
    super().__init__(**kwargs)
  File "/var/www/nextcloud/apps/llm/python/lib/python3.10/site-packages/pydantic/v1/main.py", line 341, in __init__
    raise validation_error
pydantic.v1.error_wrappers.ValidationError: 1 validation error for GPT4All
__root__
  Failed to retrieve model (type=value_error)

I have tried reinstalling the dependencies with the repair command and restarting the server, but I cannot seem to fix the issue.
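For reference, the "Failed to retrieve model" ValidationError from langchain's GPT4All wrapper is typically raised when the model file at the given path is missing or unreadable by the web server user. A minimal sketch of a pre-flight check before the failing call in index.py (the helper name `resolve_model_path` is hypothetical, not part of the app):

```python
import os

def resolve_model_path(models_dir, model_name):
    """Return the model file path, failing early with a clearer message
    than the pydantic ValidationError when the file is missing."""
    path = os.path.join(models_dir, model_name)
    if not os.path.isfile(path):
        raise FileNotFoundError(
            f"Model file not found: {path} -- re-download the model or "
            "check read permissions for the web server user"
        )
    return path
```

Checking `ls -l /var/www/nextcloud/apps/llm/models/` for the expected model files (and their ownership) may help narrow this down.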

reizeix commented 12 months ago

I had the same issue. Reinstalling the older version, LLM 1.1.0, fixed it for me.

marcelklehr commented 11 months ago

Should be fixed with v1.2.1.