Closed rmp135 closed 1 year ago
Hey! This is awesome, thanks.
Few things I will change with this PR and then I will definitely merge it 🙏
Other than that it's awesome! Maybe we should add a checkbox to make it optional? 🤔
You could make it optional but honestly I don't think anyone would want to turn it off. It's much more responsive and allows for cancelling if the prompt goes off the rails, saving on tokens.
Had a go at implementing the streaming API. I'm not too familiar with React so you'll have to excuse anything that's out of place.
`sendMessage` is now an async generator function to allow looping over the async message stream. When a message fragment is yielded, it is appended to the current assistant message. Stopping the generation cancels the event stream. The three-dot loading indicator has been changed to show only during the initial load (between sending the request and receiving the first fragment).
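In case it helps, the core of the generator approach looks roughly like this. This is a sketch, not the PR's actual code: `streamFragments`, the `/api/chat` endpoint, and the request shape are all illustrative.

```typescript
// Sketch only: turn a streaming response body into an async iterable of
// text fragments. Names here are illustrative, not the PR's actual code.
async function* streamFragments(
  body: ReadableStream<Uint8Array>
): AsyncGenerator<string> {
  const reader = body.getReader();
  const decoder = new TextDecoder();
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      yield decoder.decode(value, { stream: true });
    }
  } finally {
    reader.releaseLock(); // also runs if the consumer stops iterating early
  }
}

// Hypothetical sendMessage built on top of it. Aborting the signal cancels
// the fetch, which ends the consumer's loop.
async function* sendMessage(
  prompt: string,
  signal: AbortSignal
): AsyncGenerator<string> {
  const response = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
    signal,
  });
  if (!response.body) return;
  yield* streamFragments(response.body);
}

// Consumer sketch: append each yielded fragment to the assistant message.
// const controller = new AbortController();
// for await (const fragment of sendMessage(prompt, controller.signal)) {
//   setMessage((current) => current + fragment);
// }
```

The nice part of the async-generator shape is that cancellation falls out for free: breaking out of the `for await` loop (or aborting the signal) tears down the reader via the `finally` block.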
The Submit button has been modified so that pressing it during generation stops the message stream, and the button text is now a prop. The "loading" state has been removed since the button no longer has a distinct loading phase (although completely removing it was probably overkill).
The `setTimeout` in the TextBox has been reduced to 0ms. I believe it was originally added to wait for the `setMessage` dispatch to complete before continuing; a single tick should do it, which greatly improves response time.
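The one-tick wait can be expressed as a tiny helper (a sketch; `nextTick` is my name for it, not something in the PR):

```typescript
// Sketch: defer until the next macrotask, letting already-queued work (such
// as a pending state dispatch) run first. 0ms adds no perceptible delay.
function nextTick(): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, 0));
}

// Usage sketch (hypothetical):
// setMessage(updated);
// await nextTick(); // the dispatch has had a chance to run before we continue
```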
Also added some slight padding to the top and bottom of paragraphs for better spacing, with exceptions for the first and last. This does make lists quite spaced out, so it might need some more work to better support different elements.
I've heard the streaming API can be unreliable in what it returns, but I've had no issues while testing it out.