Open · saisandeepbalbari opened 1 week ago
Client requested to abort stream. Exiting LLM stream handler early.
This is certainly something on the client side, then, which would occur from:
What does the frontend inspector > source show? There should be a relevant error there, since this is the client dropping the connection and not the server aborting it.
How are you running AnythingLLM?
Docker (remote machine)
What happened?
I'm using Ollama's nomic-embed-text as the embedding model and llama3.1:8b as the LLM.
After I upload the documents and ask a question, it generates the response partially and then throws the streaming error.
Checking the Docker logs shows the following:
[Event Logged] - workspace_documents_added
[OllamaEmbedder] Embedding 1 chunks of text with nomic-embed-text:latest.
[STREAM ABORTED] Client requested to abort stream. Exiting LLM stream handler early.
[TELEMETRY SENT] {
  event: 'sent_chat',
  distinctId: 'xxxxxxxx-xxxx-xxxxx-xxxxxx-xxxxxxxxxxxxxxxx',
  properties: {
    multiUserMode: false,
    LLMSelection: 'ollama',
    Embedder: 'ollama',
    VectorDbSelection: 'lancedb',
    runtime: 'docker'
  }
}
[Event Logged] - sent_chat
Are there known steps to reproduce?
No response