gradio-app / gradio

Build and share delightful machine learning apps, all in Python.
http://www.gradio.app
Apache License 2.0

Add ability to add a single message from the bot/user side #3165

Closed · dawoodkhan82 closed this 1 year ago

dawoodkhan82 commented 1 year ago

Description

Add the ability to add a single message from the bot/user side, plus the ability to specify whether the chat starts with the bot or the user: gr.Chatbot([("Hi, I'm DialoGPT. Try asking me a question.", None)], starts_with="bot")
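
For illustration, a minimal sketch of how the proposed API might be used (the starts_with kwarg and the (message, None) convention are what this PR proposes, not a released API):

import gradio as gr

with gr.Blocks() as demo:
    # Seed the conversation with a single bot message; None marks the empty user slot.
    chatbot = gr.Chatbot(
        value=[("Hi, I'm DialoGPT. Try asking me a question.", None)],
        starts_with="bot",  # proposed kwarg; dropped later in this discussion
    )

demo.launch()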

Closes: #3092

A note about the CHANGELOG

Hello 👋 and thank you for contributing to Gradio!

All pull requests must update the change log located in CHANGELOG.md, unless the pull request is labeled with the "no-changelog-update" label.

Please add a brief summary of the change to the Upcoming Release > Full Changelog section of the CHANGELOG.md file and include a link to the PR (formatted in markdown) and a link to your github profile (if you like). For example, "* Added a cool new feature by [@myusername](link-to-your-github-profile) in [PR 11111](https://github.com/gradio-app/gradio/pull/11111)".

If you would like to elaborate on your change further, feel free to include a longer explanation in the other sections. If you would like an image/gif/video showcasing your feature, it may be best to edit the CHANGELOG file using the GitHub web UI since that lets you upload files directly via drag-and-drop.

gradio-pr-bot commented 1 year ago

🎉 The demo notebooks match the run.py files! 🎉

gradio-pr-bot commented 1 year ago

All the demos for this PR have been deployed at https://huggingface.co/spaces/gradio-pr-deploys/pr-3165-all-demos

abidlabs commented 1 year ago

Nice @dawoodkhan82! I noticed a few issues when I was testing with demo/chatbot_multimodal:

1) If you set starts_with="bot", the top of the first message is cut off by the label:

[screenshot]

1b) Also, interestingly, the colors of the messages have switched: user messages are now gray and bot messages are now orange. I don't think the colors should switch?

2) If you have an initial message (using None as one of the entries of the tuple), then it disappears after the next messages show up

[screen recording: 2023-02-09 at 17:07:08]

3) If you return a tuple like ("message", None) from your function, then interestingly you see null:

[screenshot]

3b) For some reason, the markdown / image postprocessing isn't working here^ but it's also broken on main -- do you experience this as well with demo/chatbot_multimodal?

abidlabs commented 1 year ago

(3) makes me think it might not be possible to use None as the symbol to indicate that a chatbot message doesn't show up. Perhaps "" or any falsey string should be what users use to indicate that a chatbot message shouldn't appear. I can't think of any use case where you actually want to show an empty message like that.
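
For illustration, a rough sketch of that convention (hypothetical, since the PR ultimately kept None): an empty string on one side of the tuple would simply not be rendered as a bubble:

def respond(message, history):
    # Hypothetical convention: an empty string (rather than None) on one side
    # means "don't render a bubble on that side".
    history = history + [
        (message, "Here's the answer."),
        ("", "And here's a second bot message in a row."),  # no user bubble
    ]
    return history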

aliabid94 commented 1 year ago

Hmm, I've never really liked the data structure of Chatbot because it's so restrictive - it simply assumes one chat between a user and a bot going back and forth, starting with the user. I think we should consider making the underlying data structure much more flexible rather than adding these args. I'd love a data structure such as:

[
  ["user", "what color is the sky"],
  ["bot", "the sky is blue"],
]

This would make many things possible - setting which message comes first, allowing multiple messages from one party in a row, or even allowing multiple parties in a chat.

Could be set via gr.Chatbot(type="groupchat") vs gr.Chatbot(type="response_pairs") (can't think of better names for now).
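
A hedged sketch contrasting the two shapes (the type values here are the names floated above, not an existing kwarg):

# Today's format: a list of (user_message, bot_message) pairs.
pairs_value = [("what color is the sky", "the sky is blue")]

# Floated alternative: a flat list of (speaker, message) entries, which would
# allow any party to speak first and repeated messages from the same party.
groupchat_value = [
    ["user", "what color is the sky"],
    ["bot", "the sky is blue"],
    ["bot", "well, most of the time"],
]

# Hypothetical constructors under the proposal (type is not an existing kwarg):
# gr.Chatbot(value=pairs_value, type="response_pairs")
# gr.Chatbot(value=groupchat_value, type="groupchat")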

If we don't want to change the data structure for now, I'm not sure we need a "starts_with". Why can't we just have the first message be [None, "I am a bot"]? None seems like the natural way to skip a message, disagree with @abidlabs that we should use an empty string instead.

I think we can just remove the BlockLabel for Chatbot, or make it non-floating, like we do with DataFrame so it doesn't block the chat view (@pngwn might have thoughts on that).

dawoodkhan82 commented 1 year ago

@aliabid94 I considered changing the underlying data structure completely. It would be a breaking change for all chatbot demos though (and there are a lot of them now). I do like the structure of a list of tuples, where the first element is "user", "bot", etc. and the second element is the message.

When passing [None, "I am a bot"], the frontend received ["I am a bot"], a single-element tuple, since I believe we remove None values.

aliabid94 commented 1 year ago

Shouldn’t be a breaking change if we use types to keep the formats separate and support both. Where are the None’s removed from the output? Just remove that logic.

abidlabs commented 1 year ago

@aliabid94 the problem is that we always remove any None's from the config dictionary or any update before we pass it into the frontend -- see the gr.utils.delete_none() method. I believe we do this because None / null behaves weird in JS (@pngwn might remember more details about this). I don't think we should carve out an exception for Chatbot specifically.
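
For context, a rough sketch of the kind of recursive None-stripping being described (an illustrative approximation, not the actual gr.utils.delete_none implementation):

def delete_none_sketch(obj):
    # Rough approximation of the behaviour described above: recursively drop
    # None values before the payload reaches the frontend. The real
    # gr.utils.delete_none may differ in detail.
    if isinstance(obj, dict):
        return {k: delete_none_sketch(v) for k, v in obj.items() if v is not None}
    if isinstance(obj, (list, tuple)):
        return type(obj)(delete_none_sketch(v) for v in obj if v is not None)
    return obj

# With this behaviour, a chat entry like [None, "I am a bot"] collapses to
# ["I am a bot"], which is the symptom reported above.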

However, I have an idea @dawoodkhan82 -- which is that in the postprocessing method of the Chatbot, we can programmatically replace any entry of None with a special enum such as gr.components._Keywords.NO_VALUE and then in the frontend, we treat this special value as null.

That should allow us to handle these kinds of cases without having to change the data structure. We could even handle the multiple messages from one user in a row case that @aliabid94 is talking about by having successive messages in which the "bot entry" is None.
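
A minimal sketch of that idea (the sentinel below is a stand-in for something like _Keywords.NO_VALUE, and the function is not the actual Chatbot.postprocess signature):

def postprocess_sketch(chat_history):
    # Swap None for a sentinel that survives the None-stripping step; the
    # frontend would then translate the sentinel back into null.
    NO_VALUE = "__NO_VALUE__"  # stand-in for gr.components._Keywords.NO_VALUE
    return [
        [NO_VALUE if message is None else message for message in pair]
        for pair in chat_history
    ]

# Example: [["hi", None]] -> [["hi", "__NO_VALUE__"]]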

aliabid94 commented 1 year ago

I don't understand this utils.delete_none. None gets converted to null in javascript. We need to preserve the Nones. So right now, if a user returns this dict to a gr.JSON component

{"a": 1, "b": None}

the "b" key will be removed? This is incorrect behaviour, the null is part of the JSON output.

I thought the original intention of utils.delete_none was when an update() was called, a value with None meant no update. But this recursive removal of None on all output doesn't make sense to me.

pngwn commented 1 year ago

We can just add some padding to the chatbot to get rid of the label obstruction, that is what we do with other components. Check HighlightedText or JSON for an example.

I don't like the starts_with="bot" kwarg really. I agree with @aliabid94 that the data structure should be enough to control that. I actually don't love the idea of adding kwargs just so we can change the data structure either. I think that the data structure should be:

[
  {"speaker": "bot", "message": "blah"},
  {"speaker": "user", "message": "blah"},
  {"speaker": "user", "message": "blah"},
  {"speaker": "bot", "message": "blah"},
]

But I don't think it is essential that we change it now. If None is being removed from inside a data structure, then that is a bug. Only top-level 'props' need their None properties removed; that should never happen to the data inside a structure held in a prop, which is potentially very problematic. For example:

{
  "props": {
    "prop1": None -> should be removed, not converted to null
    "prop2": [["hi", None], [None, "hi"]] -> [["hi", null], [null, "hi"]]
  }
}

dawoodkhan82 commented 1 year ago

I think for now I'll implement @abidlabs's approach, since it won't require changing the data structure. We can look into making the underlying data structure a list of dicts if it makes sense in the future. The only limitation I can think of with the current data structure is that it doesn't allow for multiple "users", which we haven't heard a need for yet. I didn't realize that removing None in this case was a bug. I'll remove the unnecessary starts_with kwarg.

pngwn commented 1 year ago

If we just fix the bug (#3183), doesn't that mean we can simply pass None and change the frontend?

abidlabs commented 1 year ago

I think so @pngwn. Let me get a fix in for #3183

dawoodkhan82 commented 1 year ago

If you have an initial message (using None as one of the entries of the tuple), then it disappears after the next messages show up

@abidlabs This is because the initial message is never saved into the history state. Would it be fine to expect the developer to pass the same default value into the state object? Or do you have a better suggestion?
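
A small sketch of that workaround, assuming a typical Blocks chat demo: give the Chatbot and the State the same initial history so the greeting survives the first update (the respond function is just a stand-in):

import gradio as gr

initial_history = [(None, "Hi, I'm DialoGPT. Try asking me a question.")]

with gr.Blocks() as demo:
    chatbot = gr.Chatbot(value=initial_history)
    state = gr.State(initial_history)  # same default, so the greeting isn't lost
    msg = gr.Textbox()

    def respond(message, history):
        # Stand-in bot logic; echo the user's message.
        history = history + [(message, "You said: " + message)]
        return "", history, history

    msg.submit(respond, [msg, state], [msg, chatbot, state])

demo.launch()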

abidlabs commented 1 year ago

@abidlabs This is because the initial message is never saved into the history state. Would it be fine to expect the developer to pass the same default value into the state object? Or do you have a better suggestion?

Ah you're right, that seems like a perfectly reasonable thing to do. Thanks @dawoodkhan82 and happy to re-review whenever the PR is ready

dawoodkhan82 commented 1 year ago

3b) For some reason, the markdown / image postprocessing isn't working here^ but it's also broken on main -- do you experience this as well with demo/chatbot_multimodal?

Works for me when running on the built version of the frontend.

abidlabs commented 1 year ago

Let me retest

abidlabs commented 1 year ago

Hmm @dawoodkhan82 it's not working for me locally or on the deployed PR Space from this branch:

[screenshot]

dawoodkhan82 commented 1 year ago

Hmm @dawoodkhan82 it's not working for me locally or on the deployed PR Space from this branch:

[screenshot]

OK, I'll take a look at this as a separate issue. It seems like the chatbot just can't locate the tmp file. But it's also happening on main, right?

abidlabs commented 1 year ago

But it's also happening on main, right?

Yep

abidlabs commented 1 year ago

Looks really good @dawoodkhan82! One small issue I noticed was that if a message is None, then no message bubble appears but there is a blank space where the bubble would have appeared.

See this:

[screenshot]

Or this:

[screenshot]

I don't think any blank space should appear, so that the effect is that of getting multiple messages one after the other.

dawoodkhan82 commented 1 year ago

Oh I see, yeah, good catch. I'll fix this.

abidlabs commented 1 year ago

Side note: the image issue seems to have been fixed!

dawoodkhan82 commented 1 year ago

Huh, how? Was it fixed by another PR?

dawoodkhan82 commented 1 year ago

@abidlabs Fixed the display issue