mikebpech / turbogpt.ai


Implement streaming api. #8

Closed rmp135 closed 1 year ago

rmp135 commented 1 year ago

Had a go at implementing the streaming API. I'm not too familiar with React so you'll have to excuse anything that's out of place.

sendMessage is now an async generator function to allow for looping over the async message stream. When a message fragment is yielded it will be appended to the current assistant message. Stopping the generation will cancel this event stream.
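The shape described above can be sketched roughly like this (hypothetical names, not the PR's exact code): an async generator yields fragments, the consumer appends each one to the assistant message, and a cancellation flag lets "Stop" end the stream early.

```typescript
// Minimal sketch of the streaming loop. `signal` stands in for an
// AbortSignal-like object; the real component would wire this to the
// Submit/Stop button.
async function* sendMessage(
  fragments: string[],
  signal: { aborted: boolean }
): AsyncGenerator<string> {
  for (const fragment of fragments) {
    if (signal.aborted) return; // stopping generation cancels the stream
    yield fragment;
  }
}

async function collect(
  fragments: string[],
  signal: { aborted: boolean }
): Promise<string> {
  let assistantMessage = "";
  for await (const fragment of sendMessage(fragments, signal)) {
    assistantMessage += fragment; // in the app this would dispatch a state update
  }
  return assistantMessage;
}
```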

The three dot loading indicator has been changed to only show during the initial load (between sending the request and the first message return).

The Submit button has been modified to allow stopping the message generation when pressed, and the button text is now a prop. The "loading" state has been removed since the button no longer has a loading state (although completely removing it was probably overkill).

The setTimeout in the TextBox has been reduced to 0ms. I believe this was added to wait for the setMessage dispatch event to complete before continuing. A single tick should do it, which vastly improves response time.
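The "single tick" reasoning rests on event-loop ordering: a `setTimeout(fn, 0)` callback runs on the next macrotask, after any work already queued (such as an earlier-queued dispatch) has been processed. A toy illustration, with `queuedDispatch` standing in for the already-queued state update:

```typescript
// setTimeout callbacks with the same delay run in FIFO order, so deferring
// by 0ms is enough to let an earlier-queued callback finish first.
const order: string[] = [];

function queuedDispatch(): void {
  // stands in for the setMessage dispatch already sitting in the queue
  order.push("dispatch");
}

setTimeout(queuedDispatch, 0);
setTimeout(() => {
  order.push("continue"); // runs after the earlier-queued callback
  console.log(order.join(",")); // logs "dispatch,continue"
}, 0);
```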

Also added some slight padding to the top and bottom of paragraphs for better spacing, with exceptions for the first and last. This does make lists quite spaced out so it might need some more work to better support different elements.

I've heard the streaming API can be unreliable with what it returns but I've had no issues testing it out.

mikebpech commented 1 year ago

Hey! This is awesome, thanks.

Few things I will change with this PR and then I will definitely merge it 🙏

  1. Right now it's re-rendering most of the components every time a new word appears due to the state changing.
  2. It should auto-scroll smoothly to the bottom when text is being added. Right now it freezes at the top 👀
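One common approach to the scrolling point (a sketch, not necessarily what should be merged) is to auto-scroll only when the user is already near the bottom, so following the stream and scrolling back up can coexist. The decision can be a pure helper:

```typescript
// Hypothetical helper: decide whether to keep following the stream as new
// fragments arrive. Only stick to the bottom if the user is already near it.
function shouldAutoScroll(
  scrollTop: number,    // current scroll offset of the message list
  clientHeight: number, // visible height of the message list
  scrollHeight: number, // total content height of the message list
  threshold = 40        // px of slack before we stop following
): boolean {
  return scrollHeight - (scrollTop + clientHeight) <= threshold;
}

// In the component, after appending a fragment (assumed element ref `el`):
// if (shouldAutoScroll(el.scrollTop, el.clientHeight, el.scrollHeight)) {
//   el.scrollTo({ top: el.scrollHeight, behavior: "smooth" });
// }
```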

Other than that it's awesome! Maybe we should add a checkbox to make it optional? 🤔

rmp135 commented 1 year ago
  1. It also saves to localStorage for each word, which might have performance issues. Perhaps it should only save after completion or after the user has stopped generation.
  2. I'm not getting freezing but I have noticed that it sticks the user to the bottom of the page (as in, you can't scroll back up to see previous messages). I think the page should move down when you click send but not prevent the user from scrolling back up as the message is being typed out.
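Saving only on completion (point 1 above) could look something like this hypothetical sketch, where the streamed message is buffered and written to storage exactly once, even if the user stops generation mid-stream:

```typescript
// Hypothetical sketch: buffer the fragments and persist once, in a finally
// block, rather than writing to storage for every word.
interface StorageLike {
  setItem(key: string, value: string): void;
}

async function streamAndPersist(
  stream: AsyncIterable<string>,
  storage: StorageLike,
  key = "assistantMessage" // assumed key name, not the repo's actual key
): Promise<string> {
  let message = "";
  try {
    for await (const fragment of stream) {
      message += fragment; // update UI state here, but don't persist yet
    }
  } finally {
    // single write, whether the stream completed or was stopped
    storage.setItem(key, message);
  }
  return message;
}
```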

You could make it optional but honestly I don't think anyone would want to turn it off. It's much more responsive and allows for cancelling if the prompt goes off the rails, saving on tokens.