khoj-ai / khoj

Your AI second brain. Self-hostable. Get answers from the web or your docs. Build custom agents, schedule automations, do deep research. Turn any online or local LLM into your personal, autonomous AI (e.g. GPT, Claude, Gemini, Llama, Qwen, Mistral).
https://khoj.dev
GNU Affero General Public License v3.0

[FIX] llm_thread sometimes throws an exception when the connection pool fails to connect to the OpenAI API server #894

Closed thinker007 closed 2 months ago

thinker007 commented 3 months ago

Describe the bug

When the httpx connection pool fails to connect to the OpenAI API server, `llm_thread` raises an exception but never exits, so its generator is never closed and the whole khoj server hangs.

To Reproduce

First, httpx throws an exception because the connection pool cannot connect to the OpenAI API server. The exception then propagates out of the call below, but `llm_thread` does not exit, so the generator is never closed:

```python
chat = client.chat.completions.create(
    stream=True,
    messages=formatted_messages,
    model=model_name,  # type: ignore
    temperature=temperature,
    timeout=20,
    **(model_kwargs or dict()),
)
```

As a result, the whole khoj server hangs.
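The hang can be reproduced with a minimal producer/consumer sketch. Note this is an illustration of the failure mode, not Khoj's actual code: the function names, the queue, and the `None` sentinel are all assumptions. If the producer thread raises before it enqueues the end-of-stream sentinel, a consumer with no timeout blocks forever:

```python
import queue
import threading


def start_stream(fail):
    """Spawn a producer thread that pushes chunks, then a None sentinel.

    (Hypothetical stand-in for llm_thread; not Khoj's real implementation.)
    """
    q = queue.Queue()

    def produce():
        if fail:
            # Simulates httpx raising mid-stream (e.g. a pool/connect error):
            # the sentinel below is never enqueued, so the consumer blocks.
            raise ConnectionError("pool connect to openai api server failed")
        for chunk in ("Hello", " world"):
            q.put(chunk)
        q.put(None)  # sentinel: stream finished

    threading.Thread(target=produce, daemon=True).start()
    return q


def read_stream(q, timeout=0.5):
    """Consumer: drain the queue until the sentinel arrives.

    Without the timeout this get() would hang forever in the failure case,
    which is exactly what the issue describes for the khoj server.
    """
    chunks = []
    while True:
        chunk = q.get(timeout=timeout)
        if chunk is None:
            return "".join(chunks)
        chunks.append(chunk)
```

In the healthy case the consumer drains the stream normally; in the failure case `q.get()` never returns, which here surfaces as `queue.Empty` only because a timeout was added for demonstration.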


Platform

Self-hosted


debanjum commented 3 months ago

Just fixed the offline chat stream not being closed properly issue yesterday in https://github.com/khoj-ai/khoj/commit/5927ca803277901590466ac3deb621f5a0e67210.

Let me know if you're still seeing it in release >=1.21.3

thinker007 commented 3 months ago

> Just fixed the offline chat stream not being closed properly issue yesterday in 5927ca8.
>
> Let me know if you're still seeing it in release >=1.21.3

The same bug exists in https://github.com/khoj-ai/khoj/blob/master/src/khoj/processor/conversation/openai/utils.py

When an exception is thrown there, the whole application hangs.
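A common way to fix this class of hang (a sketch of the general pattern, not the actual patch that landed) is to wrap the streaming work in `try`/`finally` so the end-of-stream sentinel is enqueued even when the API call raises. The names `llm_thread_safe`, `read_response`, and `STOP` are hypothetical:

```python
import queue
import threading

STOP = object()  # end-of-stream sentinel


def llm_thread_safe(q, produce_chunks):
    """Producer: forward chunks into the queue.

    The finally block guarantees the sentinel is enqueued even if
    produce_chunks() raises (e.g. a connection-pool error while talking
    to the OpenAI API), so the consumer can never block forever.
    """
    try:
        for chunk in produce_chunks():
            q.put(chunk)
    except Exception as e:
        q.put(f"[error: {e}]")  # optionally surface the failure to the client
    finally:
        q.put(STOP)  # always signal end-of-stream


def read_response(q):
    """Consumer: drain until the sentinel; terminates on success and on error."""
    parts = []
    while True:
        item = q.get(timeout=5)
        if item is STOP:
            return "".join(parts)
        parts.append(item)
```

With this pattern a connect failure yields a short error message and a closed stream rather than a hung server.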

debanjum commented 3 months ago

Ah, didn't catch that. Thanks for pointing it out. Let me fix that up in a bit