Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
https://anythingllm.com
MIT License

[BUG]: Error occurred while streaming response. Streaming error #2302

Open · saisandeepbalbari opened 1 week ago

saisandeepbalbari commented 1 week ago

How are you running AnythingLLM?

Docker (remote machine)

What happened?

I'm using Ollama's nomic-embed-text as the embedding model and llama3.1:8b as the LLM.

After I upload the documents and ask a question, the response is generated partially and then the streaming error appears.

[Screenshot 2024-09-17 at 12:32:16 PM: partial response followed by the streaming error]

Checking the Docker container logs shows the following:

```
[Event Logged] - workspace_documents_added
[OllamaEmbedder] Embedding 1 chunks of text with nomic-embed-text:latest.
[STREAM ABORTED] Client requested to abort stream. Exiting LLM stream handler early.
[TELEMETRY SENT] {
  event: 'sent_chat',
  distinctId: 'xxxxxxxx-xxxx-xxxxx-xxxxxx-xxxxxxxxxxxxxxxx',
  properties: {
    multiUserMode: false,
    LLMSelection: 'ollama',
    Embedder: 'ollama',
    VectorDbSelection: 'lancedb',
    runtime: 'docker'
  }
}
[Event Logged] - sent_chat
```
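To isolate whether the abort originates from the model server, it can help to stream from Ollama directly and bypass AnythingLLM. Below is a minimal TypeScript sketch, assuming Ollama's default port 11434, the llama3.1:8b model mentioned above, and Node 18+ for the built-in fetch; Ollama's streaming responses are newline-delimited JSON.

```ts
// Stream a completion straight from Ollama to check whether the model
// server finishes the stream on its own (i.e. without a client abort).
async function streamFromOllama(prompt: string): Promise<void> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3.1:8b", prompt, stream: true }),
  });
  if (!res.ok || !res.body) throw new Error(`HTTP ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffered = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });
    // Ollama streams newline-delimited JSON objects.
    const lines = buffered.split("\n");
    buffered = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      process.stdout.write(chunk.response ?? "");
      if (chunk.done) console.log("\n[stream completed normally]");
    }
  }
}

streamFromOllama("Why is the sky blue?").catch(console.error);
```

If this completes without interruption, the model server is healthy and the abort is coming from the client, which matches the maintainer's diagnosis below.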

Are there known steps to reproduce?

No response

timothycarambat commented 1 week ago

> Client requested to abort stream. Exiting LLM stream handler early.

This is certainly something on the client side, then.

What does the frontend inspector > Sources tab show? There should be a relevant error there, since this is the client dropping the connection and not the server aborting it.
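For context, here is a hedged sketch of where such an abort typically surfaces on the client. AbortController and the "AbortError" DOMException are standard Web APIs; the endpoint path below is purely illustrative, not AnythingLLM's actual route.

```ts
// How a browser client typically consumes a streamed chat response,
// and what an abort looks like from its side.
const controller = new AbortController();

async function consumeChatStream(url: string): Promise<void> {
  try {
    const res = await fetch(url, { signal: controller.signal });
    const reader = res.body!.getReader();
    while (true) {
      const { done, value } = await reader.read();
      if (done) break; // server closed the stream normally
      console.log(new TextDecoder().decode(value));
    }
  } catch (err) {
    // An aborted fetch rejects with a DOMException named "AbortError";
    // this is the kind of error the browser inspector would surface.
    if (err instanceof DOMException && err.name === "AbortError") {
      console.warn("Stream aborted on the client side");
    } else {
      throw err; // e.g. a network drop or a proxy closing the socket
    }
  }
}

// Anything that calls controller.abort() mid-stream (a timeout, a proxy,
// a component unmounting) produces the server log seen above:
// "[STREAM ABORTED] Client requested to abort stream."
consumeChatStream("/api/hypothetical-chat-stream");
```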