nextcloud / llm

A Nextcloud app that packages a large language model (Llama2 / GPT4All Falcon)

LLM tasks are failing and there are no logs or any information #55

Open xyhhx opened 7 months ago

xyhhx commented 7 months ago

I'm trying to use the AI assistant for various tasks.

I've tried with both the local LLM and the OpenAI integration, and it fails in all cases. In the UI, I instantly get a toast-like notification in the top right saying "Assistant error" (nothing appears in the actual notifications):

(screenshot of the "Assistant error" toast)

From the terminal, I get the following:

LLM process failed: process exited with code 1

In all cases, there is nothing in Administration Settings > Logging, and running docker logs nextcloud-aio-nextcloud just gives me

Task failed 5
Task failed 7

How am I supposed to troubleshoot this?
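For anyone hitting the same wall: a minimal sketch of how to surface more detail before the app logs anything useful. This assumes the AIO container name used above; the data path and the need for the www-data user are assumptions about the default AIO layout, so adjust for your setup.

```shell
#!/usr/bin/env sh
# Raise the Nextcloud log level to debug (0 = debug, 4 = fatal) so that
# failing background tasks have a chance to write something to the log.
docker exec --user www-data nextcloud-aio-nextcloud \
  php occ log:manage --level debug

# Follow the Nextcloud log file directly (path is an assumption about the
# default AIO data directory) while re-triggering the failing assistant task.
docker exec nextcloud-aio-nextcloud \
  tail -f /var/www/html/data/nextcloud.log
```

Remember to set the level back (e.g. `occ log:manage --level warning`) afterwards, since debug logging is noisy.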

marcelklehr commented 7 months ago

Hi! What versions of Nextcloud, Assistant, LLM, etc. are you running?

xyhhx commented 7 months ago

I'm using:

Nextcloud 28.0.3
Assistant 1.0.6
LLM 1.2.1

wolfallein commented 7 months ago

Hi, could you let me know if you managed to troubleshoot this? I'm getting the same problem.

Using Docker container.

Nextcloud Hub 7 (28.0.3)
Assistant 1.0.7
Local Large Language Model 1.2.1

(screenshot of the same "Assistant error" toast)

xyhhx commented 7 months ago

@wolfallein no, I haven't. My setup is the same as yours, and there are still no errors or relevant logs. Without any of that, it's a nightmare to debug...

wolfallein commented 7 months ago

Hi, I found some helpful information here: https://help.nextcloud.com/t/ai-assistant-failing-resolved/172705

I ran: occ maintenance:repair

and got:

 - Install dependencies for llm app
     - ERROR: Failed to install python dependencies: Failed to create python venv: python3 -m venv ./python returned The virtual environment was not created successfully because ensurepip is not available. On Debian/Ubuntu systems, you need to install the python3-venv package using the following command.

           apt install python3.11-venv

       You may need to use sudo with that command. After installing the python3-venv package, recreate your virtual environment.
       Failing command: /var/www/html/custom_apps/llm/python/bin/python3

So, I installed the missing package inside the Docker container for testing. After that, I ran occ maintenance:repair again, which took some time.

Unfortunately, it still didn't fix the problem on my machine, but it's a step forward and may work for you.
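The steps above can be sketched as a single sequence. This assumes the AIO container name from earlier in the thread and the venv path reported in the error output; removing the broken venv directory so the repair step recreates it is my assumption, not something the error message asks for.

```shell
#!/usr/bin/env sh
# 1. Install the missing venv package inside the container (package name
#    matches the python3.11 named in the error; adjust for other versions).
docker exec nextcloud-aio-nextcloud \
  sh -c 'apt-get update && apt-get install -y python3.11-venv'

# 2. Remove the half-created venv so it gets rebuilt from scratch
#    (assumption: this is the path from the "Failing command" line above).
docker exec nextcloud-aio-nextcloud \
  rm -rf /var/www/html/custom_apps/llm/python

# 3. Re-run the repair step as the web server user; this re-installs the
#    llm app's Python dependencies and can take a while.
docker exec --user www-data nextcloud-aio-nextcloud \
  php occ maintenance:repair
```

Note that changes made with `apt-get` inside the container do not survive a container rebuild, so this is only suitable for testing, as described above.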

xyhhx commented 7 months ago

Thanks for that @wolfallein - I found that thread earlier and have tried it already to no avail :face_exhaling:

In any case, I think the point of this ticket is more specifically that no logs or errors are produced at all, rather than getting help with why my LLM isn't working.