khoj-ai / khoj

Your AI second brain. Get answers to your questions, whether they be online or in your own notes. Use online AI models (e.g. GPT-4) or private, local LLMs (e.g. Llama 3). Self-host locally or use our cloud instance. Access from Obsidian, Emacs, the Desktop app, the Web, or WhatsApp.
https://khoj.dev

[FIX] llm_thread sometimes throws an exception when the pool fails to connect to the OpenAI API server #894

Closed thinker007 closed 1 week ago

thinker007 commented 3 weeks ago

Describe the bug

When the connection pool fails to reach the OpenAI API server, `httpx` raises an exception inside `llm_thread`. The thread does not exit cleanly and the generator it feeds is never closed, so the whole Khoj server hangs.

To Reproduce

1. `httpx` throws an exception because the connection pool cannot reach the OpenAI API server.
2. The exception propagates out of the `client.chat.completions.create` call inside `llm_thread` (shown below).
3. `llm_thread` does not exit cleanly, so the generator is never closed.

```python
chat = client.chat.completions.create(
    stream=True,
    messages=formatted_messages,
    model=model_name,  # type: ignore
    temperature=temperature,
    timeout=20,
    **(model_kwargs or dict()),
)
```

As a result, the whole Khoj server hangs.
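For illustration, here is a minimal, self-contained sketch of the failure mode and a defensive fix. The queue-based streaming, the `stream_response` helper, and the default model name are assumptions for the example, not Khoj's actual implementation; it assumes `openai>=1.0` and an `OPENAI_API_KEY` in the environment.

```python
import queue
import threading

from openai import OpenAI  # pip install openai>=1.0; needs OPENAI_API_KEY set

client = OpenAI()


def llm_thread(q: queue.Queue, formatted_messages, model_name: str):
    """Stream completion chunks onto a queue; a None sentinel marks the end."""
    try:
        chat = client.chat.completions.create(
            stream=True,
            messages=formatted_messages,
            model=model_name,
            timeout=20,
        )
        for chunk in chat:
            delta = chunk.choices[0].delta.content
            if delta:
                q.put(delta)
    except Exception as e:
        # Without this handler, an httpx pool-connection error would kill the
        # thread silently and the consumer below would block on q.get() forever.
        q.put(e)
    finally:
        q.put(None)  # always unblock the consumer, even on failure


def stream_response(formatted_messages, model_name="gpt-4o-mini"):
    """Consume the queue as a generator; re-raise any worker exception."""
    q: queue.Queue = queue.Queue()
    threading.Thread(target=llm_thread, args=(q, formatted_messages, model_name)).start()
    while True:
        item = q.get()
        if item is None:
            return
        if isinstance(item, Exception):
            raise item
        yield item
```

Putting the `None` sentinel in a `finally` block guarantees the consumer is unblocked whether the stream finishes normally, the connection pool fails, or any other exception escapes the worker thread.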


Platform

Self-hosted


debanjum commented 2 weeks ago

Just fixed the issue with the offline chat stream not being closed properly yesterday in https://github.com/khoj-ai/khoj/commit/5927ca803277901590466ac3deb621f5a0e67210.

Let me know if you're still seeing it in release >=1.21.3

thinker007 commented 2 weeks ago

> Just fixed the issue with the offline chat stream not being closed properly yesterday in 5927ca8.
>
> Let me know if you're still seeing it in release >=1.21.3

The same bug exists in https://github.com/khoj-ai/khoj/blob/master/src/khoj/processor/conversation/openai/utils.py: when an exception is thrown there, the whole application hangs.
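A sketch of the kind of fix needed there, assuming the threaded-generator pattern described above (the generator `g` with `send()`/`close()` methods and the `client` object are assumptions for the example, not the exact code in that file): moving the close into a `finally` block guarantees it runs even when the API call raises.

```python
def llm_thread(g, messages, model_name):
    try:
        chat = client.chat.completions.create(
            stream=True, messages=messages, model=model_name, timeout=20
        )
        for chunk in chat:
            delta = chunk.choices[0].delta.content
            if delta:
                g.send(delta)
    finally:
        # Runs on success *and* on exception, so the consumer of the
        # generator is always released instead of blocking forever.
        g.close()
```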

debanjum commented 2 weeks ago

Ah, didn't catch that. Thanks for pointing it out. Let me fix that up in a bit.