Hi,

I am using the latest app versions on Nextcloud 27.1.3:
Assistant 1.0.1
OpenAI and LocalAI integration 1.1.1
Local large language model 1.2.0
Irrespective of which model I select (Llama, Falcon, or Leo-Hessian), my Nextcloud Assistant requests always fail with:
[llm] Warning: /var/www/nextcloud/apps/llm/python/lib/python3.10/site-packages/langchain/__init__.py:40: UserWarning: Importing BasePromptTemplate from langchain root module is no longer supported.
warnings.warn(
/var/www/nextcloud/apps/llm/python/lib/python3.10/site-packages/langchain/__init__.py:40: UserWarning: Importing PromptTemplate from langchain root module is no longer supported.
warnings.warn(
Traceback (most recent call last):
  File "/var/www/nextcloud/apps/llm/src-py/index.py", line 23, in <module>
    llm = GPT4All(model=dir_path + "/models/"+args.model, n_threads=int(threads))
  File "/var/www/nextcloud/apps/llm/python/lib/python3.10/site-packages/langchain/load/serializable.py", line 97, in __init__
    super().__init__(**kwargs)
  File "/var/www/nextcloud/apps/llm/python/lib/python3.10/site-packages/pydantic/v1/main.py", line 341, in __init__
    raise validation_error
pydantic.v1.error_wrappers.ValidationError: 1 validation error for GPT4All
__root__
  Failed to retrieve model (type=value_error)
I have tried reinstalling the dependencies via the repair step and restarting the server, but I cannot seem to fix the issue.
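For reference, here is a minimal standalone sketch of a check one could run with the app's bundled Python (/var/www/nextcloud/apps/llm/python) to verify the model file exists and loads the same way the app's index.py does. The models directory and the model file name below are assumptions and need to be adjusted to the actual setup:

```python
import os
from langchain.llms import GPT4All

# Assumed paths: adjust to where the app actually stores its downloaded models
# and to the exact file name of the model that keeps failing.
models_dir = "/var/www/nextcloud/apps/llm/models"
model_file = "ggml-model-gpt4all-falcon-q4_0.bin"  # hypothetical file name

path = os.path.join(models_dir, model_file)
print("exists:  ", os.path.isfile(path))
print("readable:", os.access(path, os.R_OK))
print("size:    ", os.path.getsize(path) if os.path.isfile(path) else "n/a")

# Instantiating the langchain GPT4All wrapper the same way the app does;
# this should reproduce the "Failed to retrieve model" validation error
# if the file is missing, truncated, or in an unsupported format.
llm = GPT4All(model=path, n_threads=4)
print(llm("Hello"))
```

If the file turns out to be missing, unreadable by the web server user, or much smaller than expected, that would explain the ValidationError above.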