Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
https://anythingllm.com
MIT License

[BUG]: Could not respond to message. fetch failed #1608

Closed: mingLvft closed this issue 4 months ago

mingLvft commented 4 months ago

How are you running AnythingLLM?

Docker (local)

What happened?

Could not respond to message. fetch failed

Are there known steps to reproduce?

No response

mingLvft commented 4 months ago

How do I enable debug mode? How do we find out why this is happening?

mingLvft commented 4 months ago

(Screenshots attached: 微信截图_20240605135312, 微信截图_20240605135302)

shatfield4 commented 4 months ago

Since you are using the Docker version, open the container's logs in Docker Desktop and then try to send a message to the agent again. It should print errors to the console so we can trace why this is happening.

mingLvft commented 4 months ago

[NativeEmbedder] Initialized
[NativeEmbedder] Embedded Chunk 1 of 1
[Event Logged] - sent_chat
[NativeEmbedder] Initialized
[NativeEmbedder] Embedded Chunk 1 of 1
[Event Logged] - sent_chat
[Event Logged] - sent_chat
[AgentHandler] Start 18005a1d-0715-4e5c-9354-2a9a2326ca91::ollama:llama3-gradient:70b-instruct-1048k-q2_K
[AgentHandler] Attached websocket plugin to Agent cluster
[AgentHandler] Attached chat-history plugin to Agent cluster
[AgentHandler] Attaching user and default agent to Agent cluster.
[AgentHandler] Attached rag-memory plugin to Agent cluster
[AgentHandler] Attached document-summarizer plugin to Agent cluster
[AgentHandler] Attached web-scraping plugin to Agent cluster
[AgentLLM - llama3-gradient:70b-instruct-1048k-q2_K] Invalid function tool call: Missing name or arguments in function call..
[AgentLLM - llama3-gradient:70b-instruct-1048k-q2_K] Will assume chat completion without tool call inputs.
fetch failed TypeError: fetch failed
    at node:internal/deps/undici/undici:12618:11
    at async post (/app/server/node_modules/ollama/dist/shared/ollama.a247cdd6.cjs:81:20)
    at async Ollama.processStreamableRequest (/app/server/node_modules/ollama/dist/shared/ollama.a247cdd6.cjs:183:22)
    at async OllamaProvider.complete (/app/server/utils/agents/aibitat/providers/ollama.js:75:26)
    at async AIbitat.handleExecution (/app/server/utils/agents/aibitat/index.js:580:24)
    at async AIbitat.reply (/app/server/utils/agents/aibitat/index.js:562:21)
    at async AIbitat.chat (/app/server/utils/agents/aibitat/index.js:373:15)
    at async AIbitat.start (/app/server/utils/agents/aibitat/index.js:307:5)
    at async /app/server/endpoints/agentWebsocket.js:52:7 {
  cause: SocketError: other side closed
      at Socket.onSocketEnd (node:internal/deps/undici/undici:9169:26)
      at Socket.emit (node:events:529:35)
      at endReadableNT (node:internal/streams/readable:1400:12)
      at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
    code: 'UND_ERR_SOCKET',
    socket: { localAddress: '172.20.0.2', localPort: 58466, remoteAddress: '172.17.0.1', remotePort: 11434, remoteFamily: 'IPv4', timeout: undefined, bytesWritten: 971, bytesRead: 0 }
  }
}
[AgentHandler] End 18005a1d-0715-4e5c-9354-2a9a2326ca91::ollama:llama3-gradient:70b-instruct-1048k-q2_K
[Event Logged] - sent_chat
[AgentHandler] Start b7defae2-12e3-4107-9181-f206a1f15075::ollama:llama3-gradient:70b-instruct-1048k-q2_K
[AgentHandler] Attached websocket plugin to Agent cluster
[AgentHandler] Attached chat-history plugin to Agent cluster
[AgentHandler] Attaching user and default agent to Agent cluster.
[AgentHandler] Attached rag-memory plugin to Agent cluster
[AgentHandler] Attached document-summarizer plugin to Agent cluster
[AgentHandler] Attached web-scraping plugin to Agent cluster
[NativeEmbedder] Initialized
[NativeEmbedder] Initialized
[AgentLLM - llama3-gradient:70b-instruct-1048k-q2_K] Invalid function tool call: Missing name or arguments in function call..
[AgentLLM - llama3-gradient:70b-instruct-1048k-q2_K] Will assume chat completion without tool call inputs.
[Event Logged] - workspace_thread_created
[Event Logged] - sent_chat
[AgentHandler] Start 2aa2794b-ea0c-4fee-8aa9-7b94cba94aa9::ollama:llama3-gradient:70b-instruct-1048k-q2_K
[AgentHandler] Attached websocket plugin to Agent cluster
[AgentHandler] Attached chat-history plugin to Agent cluster
[AgentHandler] Attaching user and default agent to Agent cluster.
[AgentHandler] Attached rag-memory plugin to Agent cluster
[AgentHandler] Attached document-summarizer plugin to Agent cluster
[AgentHandler] Attached web-scraping plugin to Agent cluster
[AgentLLM - llama3-gradient:70b-instruct-1048k-q2_K] Invalid function tool call: Missing name or arguments in function call..
[AgentLLM - llama3-gradient:70b-instruct-1048k-q2_K] Will assume chat completion without tool call inputs.
fetch failed TypeError: fetch failed
    at node:internal/deps/undici/undici:12618:11
    at async post (/app/server/node_modules/ollama/dist/shared/ollama.a247cdd6.cjs:81:20)
    at async Ollama.processStreamableRequest (/app/server/node_modules/ollama/dist/shared/ollama.a247cdd6.cjs:183:22)
    at async OllamaProvider.complete (/app/server/utils/agents/aibitat/providers/ollama.js:75:26)
    at async AIbitat.handleExecution (/app/server/utils/agents/aibitat/index.js:580:24)
    at async AIbitat.reply (/app/server/utils/agents/aibitat/index.js:562:21)
    at async AIbitat.chat (/app/server/utils/agents/aibitat/index.js:373:15)
    at async AIbitat.start (/app/server/utils/agents/aibitat/index.js:307:5)
    at async /app/server/endpoints/agentWebsocket.js:52:7 {
  cause: SocketError: other side closed
      at Socket.onSocketEnd (node:internal/deps/undici/undici:9169:26)
      at Socket.emit (node:events:529:35)
      at endReadableNT (node:internal/streams/readable:1400:12)
      at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
    code: 'UND_ERR_SOCKET',
    socket: { localAddress: '172.20.0.2', localPort: 33752, remoteAddress: '172.17.0.1', remotePort: 11434, remoteFamily: 'IPv4', timeout: undefined, bytesWritten: 615, bytesRead: 0 }
  }
}
[AgentHandler] End 2aa2794b-ea0c-4fee-8aa9-7b94cba94aa9::ollama:llama3-gradient:70b-instruct-1048k-q2_K
[Event Logged] - sent_chat
[AgentHandler] Start d61dd88d-c3d6-44e7-a23d-9c5fc3367db7::ollama:llama3-gradient:70b-instruct-1048k-q2_K
[AgentHandler] Attached websocket plugin to Agent cluster
[AgentHandler] Attached chat-history plugin to Agent cluster
[AgentHandler] Attaching user and default agent to Agent cluster.
[AgentHandler] Attached rag-memory plugin to Agent cluster
[AgentHandler] Attached document-summarizer plugin to Agent cluster
[AgentHandler] Attached web-scraping plugin to Agent cluster
[AgentLLM - llama3-gradient:70b-instruct-1048k-q2_K] Invalid function tool call: Missing name or arguments in function call..
[AgentLLM - llama3-gradient:70b-instruct-1048k-q2_K] Will assume chat completion without tool call inputs.
Client took too long to respond, chat thread is dead after 300000ms
[AgentHandler] End b7defae2-12e3-4107-9181-f206a1f15075::ollama:llama3-gradient:70b-instruct-1048k-q2_K
[NativeEmbedder] Initialized
[NativeEmbedder] Initialized
[Event Logged] - update_embedding_engine
[Event Logged] - workspace_documents_removed
-- Working 文章内容 - 副本.txt --
[SUCCESS]: 文章内容 - 副本.txt converted & ready for embedding.

[CollectorApi] Document 文章内容 - 副本.txt uploaded processed and successfully. It is now available in documents.
[Event Logged] - document_uploaded
Adding new vectorized document into namespace zhang_test
[RecursiveSplitter] Will split with { chunkSize: 100000, chunkOverlap: 20 }
Chunks created from document: 1
[OllamaEmbedder] Embedding 1 chunks of text with llama3-gradient:70b-instruct-1048k-q2_K.
Inserting vectorized chunks into LanceDB collection.
Caching vectorized results of custom-documents/.txt-f6a1ffc9-3498-4b6d-9fe4-f55b609f4031.json to prevent duplicated embedding.
[Event Logged] - workspace_documents_added
Client took too long to respond, chat thread is dead after 300000ms
[AgentHandler] End d61dd88d-c3d6-44e7-a23d-9c5fc3367db7::ollama:llama3-gradient:70b-instruct-1048k-q2_K
[Event Logged] - workspace_thread_created
[Event Logged] - sent_chat
[AgentHandler] Start 85283e0f-3d13-4f5c-b4fa-8a890fc7b530::ollama:llama3-gradient:70b-instruct-1048k-q2_K
[AgentHandler] Attached websocket plugin to Agent cluster
[AgentHandler] Attached chat-history plugin to Agent cluster
[AgentHandler] Attaching user and default agent to Agent cluster.
[AgentHandler] Attached rag-memory plugin to Agent cluster
[AgentHandler] Attached document-summarizer plugin to Agent cluster
[AgentHandler] Attached web-scraping plugin to Agent cluster
[AgentLLM - llama3-gradient:70b-instruct-1048k-q2_K] Invalid function tool call: Missing name or arguments in function call..
[AgentLLM - llama3-gradient:70b-instruct-1048k-q2_K] Will assume chat completion without tool call inputs.
node:internal/deps/undici/undici:12618
  Error.captureStackTrace(err, this);
  ^

TypeError: fetch failed
    at node:internal/deps/undici/undici:12618:11
    at async post (/app/server/node_modules/ollama/dist/shared/ollama.a247cdd6.cjs:81:20)
    at async Ollama.processStreamableRequest (/app/server/node_modules/ollama/dist/shared/ollama.a247cdd6.cjs:183:22)
    at async #handleFunctionCallChat (/app/server/utils/agents/aibitat/providers/ollama.js:29:22)
    at async OllamaProvider.functionCall (/app/server/utils/agents/aibitat/providers/helpers/untooled.js:110:22)
    at async OllamaProvider.complete (/app/server/utils/agents/aibitat/providers/ollama.js:50:36)
    at async AIbitat.handleExecution (/app/server/utils/agents/aibitat/index.js:580:24)
    at async AIbitat.reply (/app/server/utils/agents/aibitat/index.js:562:21)
    at async AIbitat.chat (/app/server/utils/agents/aibitat/index.js:373:15)
    at async AIbitat.continue (/app/server/utils/agents/aibitat/index.js:672:7) {
  cause: HeadersTimeoutError: Headers Timeout Error
      at Timeout.onParserTimeout [as callback] (node:internal/deps/undici/undici:9117:32)
      at Timeout.onTimeout [as _onTimeout] (node:internal/deps/undici/undici:7148:17)
      at listOnTimeout (node:internal/timers:569:17)
      at process.processTimers (node:internal/timers:512:7) {
    code: 'UND_ERR_HEADERS_TIMEOUT'
  }
}

Node.js v18.20.2
Collector hot directory and tmp storage wiped!
Document processor app listening on port 8888
Environment variables loaded from .env
Prisma schema loaded from prisma/schema.prisma

✔ Generated Prisma Client (v5.3.1) to ./node_modules/@prisma/client in 177ms

Start using Prisma Client in Node.js (See: https://pris.ly/d/client)

import { PrismaClient } from '@prisma/client'
const prisma = new PrismaClient()

or start using Prisma Client at the edge (See: https://pris.ly/d/accelerate)

import { PrismaClient } from '@prisma/client/edge'
const prisma = new PrismaClient()

See other ways of importing Prisma Client: http://pris.ly/d/importing-client

npm notice
npm notice New minor version of npm available! 10.5.0 -> 10.8.1
npm notice Changelog: https://github.com/npm/cli/releases/tag/v10.8.1
npm notice Run npm install -g npm@10.8.1 to update!
npm notice
Environment variables loaded from .env
Prisma schema loaded from prisma/schema.prisma
Datasource "db": SQLite database "anythingllm.db" at "file:../storage/anythingllm.db"

20 migrations found in prisma/migrations

No pending migrations to apply.
┌─────────────────────────────────────────────────────────┐
│  Update available 5.3.1 -> 5.15.0                       │
│  Run the following to update                            │
│    npm i --save-dev prisma@latest                       │
│    npm i @prisma/client@latest                          │
└─────────────────────────────────────────────────────────┘
[TELEMETRY DISABLED] Telemetry is marked as disabled - no events will send. Telemetry helps Mintplex Labs Inc improve AnythingLLM.
[CommunicationKey] RSA key pair generated for signed payloads within AnythingLLM services.
Primary server in HTTP mode listening on port 3001
prisma:info Starting a sqlite pool with 13 connections.
[OllamaEmbedder] Embedding 1 chunks of text with llama3-gradient:70b-instruct-1048k-q2_K.
[Error: LanceDBError: No vector column found to create index]
[OllamaEmbedder] Embedding 1 chunks of text with llama3-gradient:70b-instruct-1048k-q2_K.
[Error: LanceDBError: No vector column found to create index]
[DocumentManager] Found 1 pinned sources - prepending to content with ~43449 tokens of content.
[OllamaEmbedder] Embedding 1 chunks of text with llama3-gradient:70b-instruct-1048k-q2_K.
Error: Ollama Failed to embed: [undefined]: undefined
    at OllamaEmbedder.embedChunks (/app/server/utils/EmbeddingEngines/ollama/index.js:101:24)
    at async OllamaEmbedder.embedTextInput (/app/server/utils/EmbeddingEngines/ollama/index.js:33:20)
    at async OllamaAILLM.embedTextInput (/app/server/utils/AiProviders/ollama/index.js:197:12)
    at async Object.performSimilaritySearch (/app/server/utils/vectorDbProviders/lance/index.js:276:25)
    at async streamChatWithWorkspace (/app/server/utils/chats/stream.js:133:9)
    at async /app/server/endpoints/chat.js:84:9

UUSR commented 4 months ago

I have the same thing under Windows 11

(Screenshot attached: 2024-06-06_094059.png)

Any help is welcome.

UUSR commented 4 months ago

In this screenshot, the error occurs when the model tries to access the Internet.

UUSR commented 4 months ago

(Screenshot attached: 2024-06-06_104850.png)

And this is the error when trying to access the local database in the workspace.

timothycarambat commented 4 months ago

This is coming from your connection to Ollama:

fetch failed TypeError: fetch failed
at node:internal/deps/undici/undici:12618:11
at async post (/app/server/node_modules/ollama/dist/shared/ollama.a247cdd6.cjs:81:20)

So something is wrong with however you are connecting to Ollama. Are you able to send a chat to an empty workspace using @agent?

mingLvft commented 4 months ago

(Quoting the reply above, auto-translated:)

> This is coming from your connection to Ollama:
>
> fetch failed TypeError: fetch failed
>     at node:internal/deps/undici/undici:12618:11
>     at async post (/app/server/node_modules/ollama/dist/shared/ollama.a247cdd6.cjs:81:20)
>
> So no matter how you connect to Ollama, there is a problem. Can you chat in an empty workspace using @agent?

@agent can be sent normally

chukaonline commented 4 months ago

I am also getting the same error when I try to query the embedded document. I am running AnythingLLM on Windows 10, and my LLM provider is also AnythingLLM, NOT Ollama.

vlmakarov commented 3 months ago

I am getting the same exact error for any query. Windows 10, AnythingLLM is the provider, everything local.

LeviMarvin commented 3 months ago

I have the same issue on ArchLinux. AnythingLLM Desktop v1.5.10.

timothycarambat commented 3 months ago

This is still because your LLM provider cannot be reached. If you are using the AnythingLLM internal LLM and you get this issue, it is because your computer is preventing the internal LLM from booting.

LeviMarvin commented 3 months ago

@timothycarambat yes, you are right. I'm trying to use the local Ollama+llama3, and it works. Thank you.

phicha20224 commented 2 months ago

Please help: I am trying to browse using the AnythingLLM Docker image configured with Ollama, but the Docker logs show the following. Thank you.

[backend] info: [TELEMETRY SENT] {"event":"agent_chat_sent","distinctId":"3924f037-f36c-432e-9655-7b9d3459b5b6","properties":{"runtime":"docker"}}
[backend] info: [AgentHandler] End db0d43e9-c683-4167-b307-dfc662cc0d3c::ollama:llama3:8b-instruct-q8_0
[backend] info: [TELEMETRY SENT] {"event":"sent_chat","distinctId":"3924f037-f36c-432e-9655-7b9d3459b5b6","properties":{"multiUserMode":false,"LLMSelection":"ollama","Embedder":"ollama","VectorDbSelection":"lancedb","multiModal":false,"runtime":"docker"}}
[backend] info: [Event Logged] - sent_chat
[backend] info: [TELEMETRY SENT] {"event":"sent_chat","distinctId":"3924f037-f36c-432e-9655-7b9d3459b5b6","properties":{"multiUserMode":false,"LLMSelection":"ollama","Embedder":"ollama","VectorDbSelection":"lancedb","multiModal":false,"runtime":"docker"}}
[backend] info: [AgentHandler] Start 30cc19f9-ac24-4cc0-96dc-b4e04bc48271::ollama:llama3.1:8b-instruct-q8_0
[backend] info: [Event Logged] - sent_chat
[backend] info: [TELEMETRY SENT] {"event":"agent_chat_started","distinctId":"3924f037-f36c-432e-9655-7b9d3459b5b6","properties":{"runtime":"docker"}}
[backend] info: [AgentHandler] Attached websocket plugin to Agent cluster
[backend] info: [AgentHandler] Attached chat-history plugin to Agent cluster
[backend] info: [AgentHandler] Attaching user and default agent to Agent cluster.
[backend] info: [AgentHandler] Attached rag-memory plugin to Agent cluster
[backend] info: [AgentHandler] Attached document-summarizer plugin to Agent cluster
[backend] info: [AgentHandler] Attached web-scraping plugin to Agent cluster
[backend] info: [AgentHandler] Attached save-file-to-browser plugin to Agent cluster
[backend] info: [AgentHandler] Attached create-chart plugin to Agent cluster
[backend] info: [AgentHandler] Attached web-browsing plugin to Agent cluster
[backend] info: [AgentHandler] Attached sql-agent:sql-list-databases plugin to Agent cluster
[backend] info: [AgentHandler] Attached sql-agent:sql-list-tables plugin to Agent cluster
[backend] info: [AgentHandler] Attached sql-agent:sql-get-table-schema plugin to Agent cluster
[backend] info: [AgentHandler] Attached sql-agent:sql-query plugin to Agent cluster
[backend] info: Client took too long to respond, chat thread is dead after 300000ms
[backend] info: [AgentLLM - llama3.1:8b-instruct-q8_0] Invalid function tool call: Missing name or arguments in function call..
[backend] info: [AgentLLM - llama3.1:8b-instruct-q8_0] Will assume chat completion without tool call inputs.

timothycarambat commented 2 months ago

Client took too long to respond, chat thread is dead after 300000ms

It appears Ollama is timing out. It seems you are running too large a model for your machine, have set the context window too large for your specs, or the model is hanging in general.

timothycarambat commented 2 months ago

Locking this issue: if you review the above thread, you will find this is a client-side issue and not a system bug. There are plenty of examples above that will arm you with enough information to self-diagnose this issue.