Your AI second brain. Get answers to your questions, whether they are online or in your own notes. Use online AI models (e.g. GPT-4) or private, local LLMs (e.g. Llama 3). Self-host locally or use our cloud instance. Access from Obsidian, Emacs, the Desktop app, the Web, or WhatsApp.
Stream status messages via a streaming response from the server to the web client
## Overview
Use a single API controller for advanced chat in both streaming and non-streaming modes. This improves code maintainability and generalizes to other Khoj client apps.
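A minimal sketch of how one controller could serve both modes. The function name `chat_api`, the newline-delimited JSON wire format, and the `"status"`/`"message"` event types are assumptions for illustration, not Khoj's actual API:

```python
import json
from typing import Iterator, Union


def chat_api(query: str, stream: bool) -> Union[str, Iterator[str]]:
    """Hypothetical single controller serving streaming and non-streaming chat."""

    def event_stream() -> Iterator[str]:
        # Hypothetical stages: status updates first, then the answer chunks.
        yield json.dumps({"type": "status", "data": "Searching notes"}) + "\n"
        for chunk in ("Hello", ", ", "world"):
            yield json.dumps({"type": "message", "data": chunk}) + "\n"

    if stream:
        return event_stream()

    # Non-streaming mode reuses the same generator internally and
    # joins only the "message" chunks into one complete response body.
    parts = [
        json.loads(line)["data"]
        for line in event_stream()
        if json.loads(line)["type"] == "message"
    ]
    return "".join(parts)
```

Keeping one code path means any fix to the chat pipeline applies to both modes, and clients that cannot consume a stream still get the full response.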
Update the web client to use the streamed response to show the train of thought, stream the response text, and render references.
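The client-side handling described above can be sketched as a dispatcher over streamed events. This assumes newline-delimited JSON events with a `type` field of `"status"` (train of thought), `"message"` (response chunk), or `"references"`; the actual wire format in Khoj may differ:

```python
import json


def render_stream(lines):
    """Hypothetical client-side handler for streamed chat events."""
    thoughts, response_parts, references = [], [], []
    for line in lines:
        event = json.loads(line)
        if event["type"] == "status":
            thoughts.append(event["data"])        # show train of thought
        elif event["type"] == "message":
            response_parts.append(event["data"])  # stream response text
        elif event["type"] == "references":
            references = event["data"]            # render references
    return thoughts, "".join(response_parts), references
```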
## Motivation
The previous protocol returned the chat response and its compiled references in a single payload, separated by a custom `### compiled references:` delimiter that each client had to parse.
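A plausible sketch of the string surgery the custom delimiter forced on clients. The function name `split_legacy_response` and the JSON shape of the references are assumptions for illustration:

```python
import json

DELIMITER = "### compiled references:"


def split_legacy_response(raw: str):
    """Split a legacy single-payload response into (text, references).

    Hypothetical: the old protocol appended the references after the
    custom delimiter, so clients had to cut the payload apart themselves.
    """
    if DELIMITER not in raw:
        return raw, []
    text, _, refs = raw.partition(DELIMITER)
    return text.strip(), json.loads(refs)
```

Structured streamed events remove the need for this parsing and avoid breaking when the response itself happens to contain the delimiter string.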
### Major

### Minor