
ChatInterface rendered inside a Block appears very small in size #7714

Open yvrjsharma opened 6 months ago

yvrjsharma commented 6 months ago

Describe the bug

When a ChatInterface is rendered inside a Gradio Blocks, the resulting chat window is significantly smaller than expected. This makes the chat interface difficult to use and inconvenient for the user.

Have you searched existing issues? 🔎

Reproduction

While rendering ChatInterface inside a block:

import gradio as gr
def chat(msg, hist):
    return "ok"

ci = gr.ChatInterface(chat)
with gr.Blocks() as demo:
    ci.render()
demo.launch()

Implementing ChatInterface directly:

import gradio as gr
def chat(msg, hist):
    return "ok"

ci = gr.ChatInterface(chat)
ci.launch()

Screenshot

While rendering ChatInterface inside a block:

image

Implementing ChatInterface directly:

image

Logs

No response

System Info

Testing on Colab

Severity

I can work around it

dawoodkhan82 commented 5 months ago

@yvrjsharma I think this is expected.

To fix it, you can set fill_height=True on the Blocks:

import gradio as gr
def chat(msg, hist):
    return "ok"

ci = gr.ChatInterface(chat)
with gr.Blocks(fill_height=True) as demo:
    ci.render()
demo.launch()

yvrjsharma commented 5 months ago

Ah, this is perfect @dawoodkhan82 ! Thanks for sharing the solution.

yvrjsharma commented 5 months ago

Actually, I am still experiencing issues with the chat window size while customizing a ChatInterface using the chatbot parameter.

import gradio as gr
def chat(msg, hist):
    return "ok"

chatbot = gr.Chatbot(placeholder='''<img src="https://raw.githubusercontent.com/gradio-app/gradio/main/readme_files/gradio.svg" style="width:30%; opacity:0.5;">''')
ci = gr.ChatInterface(chat, chatbot=chatbot)
with gr.Blocks(fill_height=True) as demo:
    ci.render()
demo.launch()

image

yvrjsharma commented 5 months ago

Hi @dawoodkhan82, I found that fill_height=True only works for gr.ChatInterface. When gr.Chatbot is used on its own or to customize the ChatInterface, the parameter stops working.

Repro:

- Multimodal chatbot with gr.Chatbot (doesn't work):

import time
import gradio as gr

def bot_response(history, message):
    for x in message["files"]:
        history.append([(x,), None])
    if message["text"] is not None:
        history.append([message["text"], None])
    response = "That's cool!"
    history[-1][1] = ""
    for character in response:
        history[-1][1] += character
        time.sleep(0.05)
        yield history

with gr.Blocks(fill_height=True) as demo:
    chatbot = gr.Chatbot(
        [],
        elem_id="chatbot",
        bubble_full_width=False,
    )
    chat_input = gr.MultimodalTextbox(
        interactive=True,
        file_types=["image"],
        placeholder="Enter message or upload file...",
        show_label=False,
    )
    chat_input.submit(bot_response, [chatbot, chat_input], [chatbot])

demo.queue()
if __name__ == "__main__":
    demo.launch(debug=False)

Result:
![image](https://github.com/gradio-app/gradio/assets/48665385/e4f29c50-773c-4165-a601-25b26adf425f)

- Simple Chatbot with gr.Chatbot (doesn't work):
```python
import gradio as gr
import random

with gr.Blocks(fill_height=True) as demo:
    chatbot = gr.Chatbot()
    msg = gr.Textbox()
    clear = gr.ClearButton([msg, chatbot])

    def respond(message, chat_history):
        bot_message = random.choice(["How are you?", "I love you", "I'm very hungry"])
        chat_history.append((message, bot_message))
        return "", chat_history

    msg.submit(respond, [msg, chatbot], [msg, chatbot])

if __name__ == "__main__":
    demo.launch()
```

Result: image

- ChatInterface with a custom gr.Chatbot (doesn't work):

import gradio as gr
def chat(msg, hist):
    return "ok"

chatbot = gr.Chatbot()
with gr.Blocks(fill_height=True) as demo:
    ci = gr.ChatInterface(chat, chatbot=chatbot)
demo.launch()

Results: 
![image](https://github.com/gradio-app/gradio/assets/48665385/6edc515f-8fb4-4ab4-9507-9db9c034b69b)

- ChatInterface (works):
```python
import gradio as gr
def chat(msg, hist):
    return "ok"

ci = gr.ChatInterface(chat)
with gr.Blocks(fill_height=True) as demo:
    ci.render()
demo.launch()
```

Results: image

dawoodkhan82 commented 5 months ago

@yvrjsharma You can add scale=1 to the chatbot instance in addition to fill_height=True in blocks. That should fix the height issue. Although maybe setting fill_height=True should automatically set the scale (like we do in chat_interface). @pngwn Thoughts?
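
For reference, here's a minimal sketch of that combination (the respond echo function is just a placeholder, not code from this thread):

```python
import gradio as gr

with gr.Blocks(fill_height=True) as demo:
    # scale=1 on the Chatbot lets it grow into the vertical space
    # that fill_height=True frees up on the Blocks container.
    chatbot = gr.Chatbot(scale=1)
    msg = gr.Textbox()

    def respond(message, chat_history):
        # placeholder bot: just echo the user's message back
        chat_history.append((message, f"You said: {message}"))
        return "", chat_history

    msg.submit(respond, [msg, chatbot], [msg, chatbot])

demo.launch()
```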

yvrjsharma commented 5 months ago

Sweet, thanks @dawoodkhan82! I was able to expand all the chatbot test cases from above to full screen height by setting scale=1 on the chatbot instances. I think we should update our Chatbot documentation to make this clearer, or fill_height=True could simply handle this automatically, which would be the cleaner solution.
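
For example, a minimal sketch of the custom-chatbot case with that change applied (reusing the placeholder chat function from above):

```python
import gradio as gr

def chat(msg, hist):
    return "ok"

# scale=1 on the Chatbot instance passed to ChatInterface lets it expand
# to match the Blocks container, which has fill_height=True.
chatbot = gr.Chatbot(scale=1)
ci = gr.ChatInterface(chat, chatbot=chatbot)

with gr.Blocks(fill_height=True) as demo:
    ci.render()

demo.launch()
```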

WH-Yoshi commented 1 month ago

Hi, I'm facing this issue too, and @dawoodkhan82's suggestion doesn't apply to my case. What I want to create is a gr.Row() containing the Textbox and the ChatInterface, but the scale=1 parameter doesn't exist on gr.Row(), so the chatbot can't expand to the bottom. I tried every fill_height=True combination possible, but none of them work. Here's my code:

if __name__ == '__main__':
    CSS = """"""

    with gr.Blocks(fill_height=True) as demo:
        with gr.Row(equal_height=False):
            yourself = gr.Textbox(
                scale=2,
            )
            with gr.Column(scale=5):
                gr.ChatInterface(
                    fn=predict,
                    fill_height=True,
                )

    demo.launch()

[EDIT] The workaround is to play with the CSS.

After searching a bit, I found that equal_height=False on gr.Row() adds align-items: flex-start;, which is why the chatbot no longer expands.

I found my workaround by changing equal_height, or by adding some CSS specific to my case:

if __name__ == '__main__':
    CSS = """#row1 {flex-grow: 1; align-items: unset;}
    .form {height: fit-content;}"""  # Applied to the box containing the textbox on the left

    with gr.Blocks(fill_height=True, css=CSS) as demo:
        with gr.Row(equal_height=False, elem_id="row1"):  # Putting equal_height=True will work
            yourself = gr.Textbox(
                scale=2,
            )
            with gr.Column(scale=5):
                gr.ChatInterface(
                    fn=predict,
                    fill_height=True,
                )

    demo.launch()

Maybe this can help toward a cleaner solution.