Open yangml103 opened 6 months ago
@yangml103 Yes, that's the case right now. We wait for the response to "end". However, we have plans (no ETA just yet) to stream any response, just like it's done for SSE today.
Also, can you share more details about your use case? Any APIs in particular where you require this behavior to exist?
Sweet, looking forward to it! Yeah I'm just testing my own API locally, and the API uses streaming from OpenAI. It works fine when doing curl, but would be nice if I could just do all my testing on Postman instead of having to swap between it and the terminal.
The Large Language Model APIs from OpenAI and Ollama use JSON streaming. With the current trend of AI chatbots, we can probably expect wide adoption of JSON streaming soon.
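For context, a minimal sketch of what consuming such a stream looks like, assuming the server sends newline-delimited JSON (NDJSON) chunks in the shape Ollama's generate endpoint uses (the `response`/`done` field names and the sample chunks here are illustrative, not taken from this thread):

```python
import json

def iter_json_stream(lines):
    """Parse an iterable of newline-delimited JSON (NDJSON) lines,
    yielding one decoded object per non-empty line."""
    for line in lines:
        line = line.strip()
        if not line:
            continue
        yield json.loads(line)

# Simulated chunks, roughly as a streaming LLM API might send them:
chunks = [
    '{"response": "Hello", "done": false}',
    '{"response": " world", "done": true}',
]

parsed = list(iter_json_stream(chunks))
text = "".join(p["response"] for p in parsed)
print(text)  # the partial tokens joined together: "Hello world"
```

The point of the feature request is exactly this incremental view: a curl user sees each chunk as it arrives, while Postman currently only renders the body once the whole response has ended.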
Is there an existing request for this feature?
Is your feature request related to a problem?
Postman doesn't support JSON streaming
Describe the solution you'd like
Be able to see streamed data from a JSON API as it arrives, instead of only after the response has finished processing
Describe alternatives you've considered
No response
Additional context
Follow-up on issue #5040