Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with full RAG and AI Agent capabilities.
https://useanything.com
MIT License

[BUG]: ArchLinux: Fail to move files into workspace: Error: Failed to fetch #1085

Closed: ChaRoSaMa closed this issue 2 months ago

ChaRoSaMa commented 2 months ago

How are you running AnythingLLM?

AnythingLLM desktop app

What happened?

[Screenshot: Screenshot_20240412_061642]

- OS: Arch Linux amd64, kernel 6.3.6-arch1-1
- LLM: Local LLM via an online API provider
- Embedding Provider: AnythingLLM Embedder (a local LLM with an online API was also tried, and also failed)
- Database: LanceDB

It doesn't seem to be an IP error; I have tried many IPs, and none of them work. I have also tried both English and Chinese .txt files, and both failed. (It's probably not an error in the embedding provider either, since I also tried a local LLM with an online API. By the way, does the AnythingLLM Embedder support Chinese?)

When I moved a file into the workspace, the console didn't print anything. But on startup it threw an error; the complete console output follows:

```
Updating location Prisma binaries for linux builds.
Prisma schema loaded from prisma/schema.prisma
Datasource "db": SQLite database "anythingllm.db" at "file:/home/CharSaMa/.config/anythingllm-desktop/storage/anythingllm.db"

Already in sync, no schema change or pending migration was found.

EROFS: read-only file system, unlink '/tmp/.mount_AnythibuCKka/resources/backend/node_modules/.prisma/client/index.js'

Prisma schema loaded from prisma/schema.prisma
Error: EROFS: read-only file system, unlink '/tmp/.mount_AnythibuCKka/resources/backend/node_modules/.prisma/client/index.js'
```

Are there known steps to reproduce?

No response

timothycarambat commented 2 months ago

In ~/.config/anythingllm-desktop (or wherever Arch writes AppImage data) there should be a storage/models/Xenova folder. Does that exist for you? This could be related to https://github.com/Mintplex-Labs/anything-llm/issues/821, but we added a fallback URL to prevent this. However, it is possible you cannot reach either URL for downloading the embedder model to run locally.
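A quick way to check, assuming the default Linux storage location:

```bash
# List the local embedder model folder; "No such file or directory"
# means the model was never downloaded
ls -la ~/.config/anythingllm-desktop/storage/models/Xenova
```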

If there is no error in the terminal when moving the document over, that would indicate the frontend is failing to even reach the backend. The boot messages shown do not seem to indicate that it crashed or was otherwise unable to boot.

You can check whether the server is running with `curl http://localhost:3001/api/ping` while the app is open.
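For example (the healthy response is confirmed in the reply below):

```bash
# Ping the local AnythingLLM backend while the desktop app is running
curl http://localhost:3001/api/ping
# A healthy server answers with: {"online":true}
```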

ChaRoSaMa commented 2 months ago

Indeed, the folder doesn't exist.

I am using Clash as a proxy. When I turn the proxy off, it responds normally with `{"online":true}`, but when I turn it on, there is no response at all (though maybe that's not the cause of my problem). I will try a different IP later.
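(If Clash is intercepting localhost traffic, one way to rule that out is to bypass the proxy for local requests; `--noproxy` is a standard curl option:)

```bash
# Tell curl to skip the proxy for localhost
curl --noproxy localhost http://localhost:3001/api/ping

# Or use the conventional environment variable, honored by many tools
NO_PROXY=localhost,127.0.0.1 curl http://localhost:3001/api/ping
```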

I have tried all the IPs I can get; still not working. There's still a possibility that it's just the IPs I have that don't work.

ChaRoSaMa commented 2 months ago

I followed #821 and tried creating the folder `.config/anythingllm-desktop/storage/models/Xenova/all-MiniLM-L6-v2/`, but that didn't work either. What's worse, I then tried to start a chat, which also failed with the same problem as #946. I previously used Docker to set up anything-llm and the chat feature worked normally, but when I apply the same settings to the desktop application, it doesn't work. So I am puzzled now.
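(For reference, a sketch of that step, assuming the default config path:)

```bash
# Manually create the folder the embedder expects (path from #821)
mkdir -p ~/.config/anythingllm-desktop/storage/models/Xenova/all-MiniLM-L6-v2
```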

timothycarambat commented 2 months ago

At this point, with the influx of issues across the multitude of Linux distros, we may just drop the desktop AppImage and point people to Docker, since that consistently works. Making sure the correct binaries exist for not only the OS but also its version is causing so many issues that are frankly impossible to debug remotely.

The only solution for the AppImage that does work is unpacking it, running `npm install`, and running the image unpacked: https://github.com/Mintplex-Labs/anything-llm/issues/1077#issuecomment-2048481246
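A rough sketch of that workaround, assuming the standard AppImage runtime flags and the `resources/backend` layout seen in the error log above (the actual AppImage filename will differ):

```bash
# Extract the AppImage contents into ./squashfs-root (standard runtime flag)
./AnythingLLM.AppImage --appimage-extract

# Reinstall the native Node modules for this OS and architecture
cd squashfs-root/resources/backend
npm install

# Launch from the unpacked, writable directory
cd ../../.. && ./squashfs-root/AppRun
```

Running unpacked also sidesteps the `EROFS: read-only file system` error above, since the mounted AppImage at `/tmp/.mount_*` is read-only.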

ChaRoSaMa commented 2 months ago

Wow... in that case, I'd appreciate it if you could add the embed feature to the Docker version.

timothycarambat commented 2 months ago

@ChaRoSaMa you can already embed with the built-in embedder on Docker. The only thing the Docker app does not have that the desktop does is the built-in LLM.

ChaRoSaMa commented 2 months ago

Soooorry.. I just misspoke. I meant the document uploading feature...

But by the way, thanks for the awesome application you built!