Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
https://anythingllm.com
MIT License
27.96k stars · 2.83k forks

[BUG]: Unusable for me. Something about prisma client. #931

Closed · nmcbride closed 8 months ago

nmcbride commented 8 months ago

How are you running AnythingLLM?

Docker (local)

What happened?

Downloaded the AppImage and tried to run it.

Selected lmstudio and the included embedder.

It would not create a workspace; it just kept saying "undefined". I could not close it because it opens full screen with no window controls, so I had to force close it.

Re-opening it showed what I can only assume was the main window. I tried to create a workspace again, but it errored.

I see this error:

Prisma Client could not locate the Query Engine for runtime "rhel-openssl-3.0.x".

This happened because Prisma Client was generated for "debian-openssl-3.0.x", but the actual deployment required "rhel-openssl-3.0.x".
Add "rhel-openssl-3.0.x" to `binaryTargets` in the "schema.prisma" file and run `prisma generate` after saving it:

generator client {
  provider      = "prisma-client-js"
  binaryTargets = ["native", "rhel-openssl-3.0.x"]
}

The following locations have been searched:
  /tmp/.mount_gearle8zoSqp/resources/backend/node_modules/.prisma/client
  /tmp/.mount_gearle8zoSqp/resources/backend/node_modules/@prisma/client
  /home/tim/Documents/anything-llm-desktop/anything-llm/server/node_modules/@prisma/client
  /tmp/prisma-engines
  /tmp/.mount_gearle8zoSqp/resources/backend/prisma

Are there known steps to reproduce?

No response

timothycarambat commented 8 months ago

Duplicate of https://github.com/Mintplex-Labs/anything-llm/issues/898

The solution is in there. Basically, you need to unbundle the AppImage, run a command, and it will work. It has to do with a build issue in the bundling. It is being worked on in that issue.
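For anyone landing here before reading that issue: the Prisma error message above already suggests the fix, so a rough sketch of the unbundle-and-regenerate workaround might look like the following. The file name and paths here are assumptions inferred from the search paths in the log; the authoritative steps are in #898.

```shell
# Hedged sketch of the workaround, not the official steps (see issue #898).
# The AppImage file name and internal paths below are assumptions.

# 1. Unpack the AppImage. --appimage-extract is a standard AppImage runtime
#    flag; it creates a squashfs-root/ directory next to the file.
./AnythingLLM.AppImage --appimage-extract

# 2. Regenerate the Prisma client inside the unpacked backend so its query
#    engine matches the host platform, as the error message itself suggests.
cd squashfs-root/resources/backend
npx prisma generate

# 3. Launch the app from the unpacked tree instead of the AppImage.
cd ../..
./squashfs-root/AppRun
```

Regenerating inside the unpacked tree works because Prisma detects the current platform at `prisma generate` time and downloads the matching query engine, replacing the `debian-openssl-3.0.x` binary that was baked in at build time.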

nmcbride commented 8 months ago

Gotcha thanks. I'll just use a different solution for now until there is a non-broken build.