Closed: denniszielke closed this issue 6 months ago
This issue is for a: (mark with an `x`)

- [ ] bug report -> please search issues before submitting
- [x] feature request
- [ ] documentation issue or request
- [ ] regression (a behavior that used to work and stopped in a new release)
Minimal steps to reproduce

I want response streaming using Server-Sent Events so that long-running LLM prompts start returning output quickly: https://cookbook.openai.com/examples/how_to_stream_completions
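For reference, the streaming pattern the linked cookbook describes looks roughly like the sketch below. This assumes the `openai` Python package (v1.x) with an `OPENAI_API_KEY` set in the environment; the model name is illustrative, not something this repo prescribes.

```python
# Minimal sketch of streaming chat completions, assuming openai v1.x.
from openai import OpenAI

client = OpenAI()

# stream=True returns chunks as they are generated instead of one
# blocking response, so the first tokens arrive quickly.
stream = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Write a long story."}],
    stream=True,
)

for chunk in stream:
    # Each chunk carries only the newly generated delta text.
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
```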
Any log messages given by the failure
Expected/desired behavior
OS and Version?

Windows 7, 8 or 10. Linux (which distribution). macOS (Yosemite? El Capitan? Sierra?)
Versions
Mention any other details that might be useful

Thanks! We'll be in touch soon.

This solution does support streaming!
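For anyone landing on this issue later: below is a generic sketch of how streamed completion tokens can be relayed to a browser over Server-Sent Events. Flask, the `/chat` route, and the model name are assumptions made for illustration; this is not necessarily how this repository implements its streaming.

```python
# Generic sketch of an SSE relay endpoint, assuming Flask and openai v1.x.
from flask import Flask, Response
from openai import OpenAI

app = Flask(__name__)
client = OpenAI()

@app.route("/chat")
def chat():
    def event_stream():
        stream = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model name
            messages=[{"role": "user", "content": "Write a long story."}],
            stream=True,
        )
        for chunk in stream:
            delta = chunk.choices[0].delta.content
            if delta:
                # SSE frames are "data: ..." lines ended by a blank line.
                yield f"data: {delta}\n\n"

    # text/event-stream lets the browser consume this with EventSource.
    return Response(event_stream(), mimetype="text/event-stream")
```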