Closed · ajayvignesh01 closed this issue 11 months ago
Instead of the server action streaming the response as it receives chunks from OpenAI, it sends the whole response only after completion. Video attached to demonstrate.

The code is an almost exact copy of the example in the docs: https://sdk.vercel.ai/docs/api-reference/streaming-react-response

The only difference is this part in action.tsx:

```tsx
// Respond with the stream
return new experimental_StreamingReactResponse(stream, {
  ui({ content }) {
    return <div>{content}</div>
  },
})
```
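For context, the rest of action.tsx follows the linked docs page. This is a rough reconstruction rather than my exact file (the `handler` name and `gpt-3.5-turbo` model are from that example):

```tsx
// app/action.tsx (sketch, assuming the AI SDK v2-era API from the linked docs page)
'use server'

import OpenAI from 'openai'
import { OpenAIStream, experimental_StreamingReactResponse, type Message } from 'ai'

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY })

export async function handler({ messages }: { messages: Message[] }) {
  // Ask OpenAI for a streaming chat completion
  const response = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    stream: true,
    messages: messages.map(({ role, content }) => ({ role, content })),
  })

  // Convert the OpenAI response into a readable stream
  const stream = OpenAIStream(response)

  // Respond with the stream, rendering React elements on the server
  return new experimental_StreamingReactResponse(stream, {
    ui({ content }) {
      return <div>{content}</div>
    },
  })
}
```

On the client this is called through `useChat({ api: handler })`, again as in the docs as far as I can tell.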
Also, I can't seem to get experimental_StreamData working in server actions either. Will docs be added to show an example?
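In case it helps, this is roughly the wiring I attempted, adapted from the experimental_StreamData + StreamingTextResponse pattern in the docs. The `handlerWithData` name is just for illustration, and I'm not certain experimental_StreamingReactResponse accepts a `data` option like this at all:

```tsx
// Sketch only: adapting the documented StreamData pattern to
// experimental_StreamingReactResponse; whether the `data` option and the
// `data` argument in ui() are supported here is exactly what I can't confirm.
'use server'

import OpenAI from 'openai'
import {
  OpenAIStream,
  experimental_StreamData,
  experimental_StreamingReactResponse,
  type Message,
} from 'ai'

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY })

export async function handlerWithData({ messages }: { messages: Message[] }) {
  const response = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    stream: true,
    messages: messages.map(({ role, content }) => ({ role, content })),
  })

  // Extra JSON values to stream alongside the model tokens
  const data = new experimental_StreamData()
  data.append({ status: 'started' })

  const stream = OpenAIStream(response, {
    experimental_streamData: true, // opt the stream into the StreamData protocol
    onFinal() {
      data.close() // StreamData must be closed, otherwise the response never finishes
    },
  })

  return new experimental_StreamingReactResponse(stream, {
    data, // <- the part I can't get working from a server action
    ui({ content, data }) {
      return (
        <div>
          {content}
          {data ? <pre>{JSON.stringify(data)}</pre> : null}
        </div>
      )
    },
  })
}
```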
Will server actions support SSE in general? Is that on the roadmap? Adding SSE support through useFormState would be pretty cool.
nvm, looks like that's how it's intended to work?