Open christiancadieux opened 1 year ago
Can prevent LocalAI from crashing by increasing the chunk size
Hi! I'm a bot running with LocalAI ( a crazy experiment of @mudler ) - please beware that I might hallucinate sometimes!
But... I can also be funny or helpful :smile_cat: and I can generally provide good tips, or point to places to look in the documentation or in the code, based on what you wrote in the issue.
Don't engage in conversation with me, I don't support (yet) replying!
The issue seems to be related to a crash occurring when running the LocalAI code. The error message indicates that the code is creating chunks larger than the specified size of 300, which may be causing the crash. To reproduce the behavior, run the store.py script from the examples/langchain-chroma directory with the --debug flag to enable debug mode. The logs generated by the code may provide more information about the cause of the crash. Additionally, checking the network connectivity between the LocalAI container and the OpenAI API may help in diagnosing the issue.
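As background on the chunk-size message: the langchain-chroma example splits documents on a separator before embedding them, and a single piece of text longer than the configured chunk size cannot be split further, which triggers the "chunk of size larger than 300" complaint. The sketch below is a simplified, hypothetical model of that separator-based splitting (it is not the actual store.py code); it shows why raising the chunk size, as the issue title suggests, makes the oversized-chunk condition disappear.

```python
def split_text(text: str, chunk_size: int, separator: str = "\n\n"):
    """Greedy separator-based splitter, loosely modeled on the behavior of
    langchain-style character splitters: parts are merged until adding the
    next part would exceed chunk_size. A single part longer than chunk_size
    still becomes one oversized chunk, since it cannot be split further."""
    chunks, current = [], ""
    for part in text.split(separator):
        candidate = part if not current else current + separator + part
        if len(candidate) <= chunk_size:
            current = candidate
        else:
            if current:
                chunks.append(current)
            current = part  # may itself exceed chunk_size
    if current:
        chunks.append(current)
    return chunks

# One short paragraph plus one 500-character paragraph.
text = "short paragraph\n\n" + "x" * 500

# With chunk_size=300 the 500-char paragraph ends up as one oversized chunk.
print(len([c for c in split_text(text, 300) if len(c) > 300]))    # → 1

# With chunk_size=1000 no chunk exceeds the limit.
print(len([c for c in split_text(text, 1000) if len(c) > 1000]))  # → 0
```

Under this model, increasing the chunk size (or shrinking the input documents so no single separator-delimited piece exceeds the limit) avoids the oversized-chunk condition the log complains about.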
LocalAI version: latest version, quay.io/go-skynet/local-ai:latest as of Jun 19, 6 PM MT
Environment, CPU architecture, OS, and Version: Linux rdei-local-ai-5f8fc75c56-z5lfb 5.4.77-flatcar #1 SMP Wed Nov 18 17:29:43 -00 2020 x86_64 GNU/Linux
Describe the bug
LocalAI log at the crash: CRASH

To Reproduce
Called store.py from examples/langchain-chroma
Expected behavior
Logs
Additional context