Closed thinker007 closed 2 months ago
Just fixed the offline chat stream not being closed properly issue yesterday in https://github.com/khoj-ai/khoj/commit/5927ca803277901590466ac3deb621f5a0e67210.
Let me know if you're still seeing it in release >=1.21.3
The same bug exists in https://github.com/khoj-ai/khoj/blob/master/src/khoj/processor/conversation/openai/utils.py: when an exception is thrown, the whole application hangs.
Ah, didn't catch that. Thanks for pointing it out. Let me fix that up in a bit
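Not Khoj's actual implementation, but the usual fix for this producer/consumer streaming pattern is to enqueue an end-of-stream sentinel in a `finally` block, so the consuming generator always unblocks even when `create(...)` raises (all names here are hypothetical):

```python
import queue
import threading

SENTINEL = object()  # hypothetical end-of-stream marker

def stream_chunks(q: queue.Queue):
    """Producer thread: always signals completion, even on error."""
    try:
        # stand-in for client.chat.completions.create(stream=True, ...)
        raise ConnectionError("connection pool exhausted")
    except ConnectionError as e:
        q.put(e)          # optionally surface the error to the consumer
    finally:
        q.put(SENTINEL)   # guarantees the generator below can terminate

def read_stream(q: queue.Queue):
    """Consumer generator: drains the queue until the sentinel arrives."""
    while True:
        item = q.get()
        if item is SENTINEL:
            break
        yield item

q = queue.Queue()
threading.Thread(target=stream_chunks, args=(q,), daemon=True).start()
items = list(read_stream(q))
print(len(items))  # 1: just the ConnectionError; the generator exits cleanly
```

With the `finally` in place, the consumer terminates on both the success and the failure path instead of blocking forever on `q.get()`.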
Describe the bug
When httpx raises a connection error during a streamed chat completion, the LLM thread never exits and the response generator is never closed, so the whole Khoj server hangs.
To Reproduce
First, httpx raises an exception (the connection pool to the OpenAI API server fails), so `client.chat.completions.create` throws:

```python
chat = client.chat.completions.create(
    stream=True,
    messages=formatted_messages,
    model=model_name,  # type: ignore
    temperature=temperature,
    timeout=20,
    **(model_kwargs or dict()),
)
```

Because of this, `llm_thread` never exits and the generator is never closed, so the whole Khoj server hangs.
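A minimal sketch of the failure mode (names hypothetical, not Khoj's actual code): a producer thread feeds a queue that a generator drains, and if the producer raises before enqueueing the end-of-stream sentinel, any consumer blocks forever on `queue.get()`:

```python
import queue
import threading

SENTINEL = object()  # end-of-stream marker the consumer waits for

def producer(q: queue.Queue):
    """Simulates the LLM thread: raises before signalling completion."""
    try:
        raise ConnectionError("pool timeout talking to the API server")
        q.put(SENTINEL)  # never reached
    except ConnectionError:
        pass  # swallowed; nothing tells the consumer we are done

q = queue.Queue()
t = threading.Thread(target=producer, args=(q,), daemon=True)
t.start()
t.join()

# A consumer would now block forever:
# q.get()  # <- hangs, because no sentinel was ever enqueued
print(q.empty())  # True: the queue holds neither data nor a sentinel
```

This is why the server appears to hang: the request handler is stuck waiting on a queue that will never receive another item.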