mitmedialab / chat-tutor-embeddable

https://mitmedialab.github.io/chat-tutor-embeddable/
GNU General Public License v3.0

Visual indicator while waiting for response #5

Open pmalacho-mit opened 9 months ago

pmalacho-mit commented 9 months ago

It'd be nice to include some visual indicators for the user while they are waiting on a response.

Perhaps as soon as a message is sent, a message from "Assistant" should pop up, but with the animating dots in its message (perhaps these are displayed when content is null or the empty string): https://nzbin.github.io/three-dots/
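To make that concrete, here's a rough, framework-agnostic TypeScript sketch (the helper name is hypothetical; dot-flashing is one of the classes provided by the three-dots library) of swapping in the dots whenever the assistant message has no content yet:

```ts
// Hypothetical helper: render the three-dots "typing" animation instead of
// text while the assistant's content is still null or empty.
export function renderMessageBody(content: string | null): string {
  const isPending = content === null || content === "";
  return isPending
    ? '<div class="dot-flashing"></div>' // animated placeholder from the three-dots CSS
    : content;
}
```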

shengh318 commented 8 months ago

So currently Storybook doesn't have the AI responding back to us yet. How should we test this?

pmalacho-mit commented 8 months ago

Great thinking @shengh318! I think the best way to do that is to use some kind of library/add-on for Storybook to mock fetch, so that in the Storybook "stories" we can effectively intercept the requests to the addToDB and ask endpoints.

This add-on looks promising: https://storybook.js.org/addons/storybook-addon-mock. Could you investigate it?

Looking through the documentation, it looks pretty straightforward -- the examples are written for React, but it looks like you'll just be adding a property to the parameters field of the meta object, like here:

https://github.com/mitmedialab/chat-tutor-embeddable/blob/main/src/stories/AddMessage.stories.ts#L8
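For reference, a hypothetical sketch of what that could look like in the stories file (the /ask and /addToDB paths and the response shapes are assumptions; they'd need to match whatever askChatTutor actually calls):

```ts
// Rough sketch of attaching storybook-addon-mock data to a story's meta.
// Endpoint paths and response bodies below are placeholders.
const meta = {
  title: "AddMessage",
  parameters: {
    mockData: [
      { url: "/ask", method: "POST", status: 200, response: { answer: "Mocked tutor reply" } },
      { url: "/addToDB", method: "POST", status: 200, response: { ok: true } },
    ],
  },
};

export default meta;
```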

shengh318 commented 8 months ago

Yep. I will see what this addon is about!

pmalacho-mit commented 8 months ago

Amazing, thanks @shengh318 !!

shengh318 commented 8 months ago

Questions.

  1. What does the JSON returned by the API look like, i.e. what comes back from the fetch in askChatTutor?
  2. Is there a reason why we are sending the entire conversation to the chatbot in the askChatTutor function?

Oh, also, I experimented with storybook-addon-mock and came across a bunch of problems and concerns:

  1. The mock package does not support CORS
  2. The mock package does not fully support Storybook v7
  3. There is no way to delay the response from the mock package, so we can't see the three animated dots while the Tutor is "thinking".
  4. There are barely any examples that actually show how to use this package. @pmalacho-mit, maybe you'll have more luck than me, haha.

Current Solution

I just created my own local Flask (Python) server and am routing all the traffic there instead. This way we can decide what content to send and how long to wait (using the Python time package) before we send the actual JSON response.
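Roughly, the server is just a couple of routes that sleep before responding (a minimal sketch; the route names and JSON shape are assumptions and should mirror the real API):

```python
# Minimal mock-server sketch: delay, then return canned JSON.
import time
from flask import Flask, jsonify, request

app = Flask(__name__)
# Note: browser requests from another origin may also need CORS headers
# (e.g. via the flask-cors package).

@app.route("/ask", methods=["POST"])
def ask():
    _ = request.get_json(silent=True)  # incoming conversation, ignored here
    time.sleep(3)                      # simulate the tutor "thinking"
    return jsonify({"answer": "Mocked tutor reply"})

@app.route("/addToDB", methods=["POST"])
def add_to_db():
    return jsonify({"ok": True})

if __name__ == "__main__":
    app.run(port=5000)
```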

Let me know what you think!

pmalacho-mit commented 8 months ago

Gotcha! Let me play around with the addon today!