Description

Fixes #121 Fixes #112

This PR adds the `EventSource` protocol for streaming LLM output. We still respect the text and JSON streaming modes, but all events will now stream as follows:

```
data: <llm_token_1>
data: <llm_token_2>
```

This also allows the possibility of implementing named events for special use cases:
Finally, the `StreamingResponse` also improves error handling by sending a 500 error event when chain execution fails (Fixes #146).
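A rough sketch of that error-handling behavior, assuming the response body is produced by a generator (the function names and error payload shape are illustrative, not the actual `StreamingResponse` internals):

```python
from typing import Iterable, Iterator


def stream_with_error_event(tokens: Iterable[str]) -> Iterator[str]:
    """Yield SSE data events; if the source fails, emit one 500 error event."""
    try:
        for token in tokens:
            yield f"data: {token}\n\n"
    except Exception as exc:  # chain execution failed mid-stream
        yield (
            "event: error\n"
            f'data: {{"status_code": 500, "detail": "{exc}"}}\n\n'
        )


def failing_chain() -> Iterator[str]:
    """Stand-in for a chain that emits one token, then raises."""
    yield "partial"
    raise RuntimeError("chain execution failed")


events = list(stream_with_error_event(failing_chain()))
```

Sending the failure as a named `error` event lets clients distinguish it from ordinary token events even though the HTTP status of the stream itself was already 200.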
Changelog:
- `sse-starlette` dependency
- `StreamingResponse` class and base callback handlers
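To show how the base callback handlers can feed the stream, here is a queue-based sketch (the class name and sentinel convention are hypothetical; only the `on_llm_new_token` / `on_llm_end` callback names follow the usual LangChain-style interface):

```python
import queue
from typing import Iterator, Optional


class QueueCallbackHandler:
    """Collects tokens from LLM callbacks into a queue the response drains."""

    def __init__(self) -> None:
        self.tokens: "queue.Queue[Optional[str]]" = queue.Queue()

    def on_llm_new_token(self, token: str) -> None:
        self.tokens.put(token)

    def on_llm_end(self) -> None:
        self.tokens.put(None)  # sentinel: stream finished


def drain(handler: QueueCallbackHandler) -> Iterator[str]:
    """Turn queued tokens into SSE data events until the sentinel arrives."""
    while (token := handler.tokens.get()) is not None:
        yield f"data: {token}\n\n"


handler = QueueCallbackHandler()
for t in ("Hello", " world"):
    handler.on_llm_new_token(t)
handler.on_llm_end()
events = list(drain(handler))
```

In the real implementation the callbacks run inside chain execution while the response drains the queue concurrently; the sketch runs them sequentially only to keep the example self-contained.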