gradio-app / gradio


Problem with streaming a string which has LaTeX code #5355

Closed redrocket8 closed 1 year ago

redrocket8 commented 1 year ago

Describe the bug

When I run the gradio/chatbot_multimodal demo and a streamed string contains LaTeX code, the rendered equation is not displayed; instead the LaTeX source is streamed character by character like the rest of the text.

Example string: The square root of 4 is 2 $\sqrt{4}=2 $

I have set

chatbot = gr.Chatbot(
        latex_delimiters=[{ "left": "$", "right": "$", "display": False }],
        elem_id="chatbot",
        avatar_images=(None, (os.path.join(os.path.abspath(''), "avatar.png"))),
    )

I have also tried the example string The square root of 4 is 2 $\\sqrt{4}=2 $ with no success.
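For reference, if these example strings are written as Python literals (as in the bot function in the reproduction below), here is a quick sketch of what each backslash variant evaluates to. This is plain Python string escaping, nothing gradio-specific; the latex_delimiters matching only ever sees the evaluated text:

# Plain Python string escaping; the chatbot frontend only receives the evaluated string.
print("$\\sqrt{4}=2$")    # evaluates to: $\sqrt{4}=2$   (single backslash, the usual LaTeX form)
print("$\\\\sqrt{4}=2$")  # evaluates to: $\\sqrt{4}=2$  (double backslash)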

Have you searched existing issues? 🔎

Reproduction

!pip install -q gradio
!wget -q https://github.com/gradio-app/gradio/raw/main/demo/chatbot_multimodal/avatar.png

import gradio as gr
import os
import time

# Chatbot demo with multimodal input (text, markdown, LaTeX, code blocks, image, audio, & video). Plus shows support for streaming text.

def add_text(history, text):
    history = history + [(text, None)]
    return history, gr.update(value="", interactive=False)

def add_file(history, file):
    history = history + [((file.name,), None)]
    return history

def bot(history):
    # The streamed response contains a LaTeX span between the $ delimiters.
    response = "**The** square root of 4 is 2 $\\\\sqrt{4}=2$"
    history[-1][1] = ""
    for character in response:
        history[-1][1] += character
        time.sleep(0.05)
        yield history
    history[-1][1] = ""
    return history

with gr.Blocks() as demo:
    chatbot = gr.Chatbot(
        latex_delimiters=[{ "left": "$", "right": "$", "display": False }],
        elem_id="chatbot",
        avatar_images=(None, (os.path.join(os.path.abspath(''), "avatar.png"))),
    )

    with gr.Row():
        txt = gr.Textbox(
            scale=4,
            show_label=False,
            placeholder="Enter text and press enter, or upload an image",
            container=False,
        )
        btn = gr.UploadButton("📁", file_types=["image", "video", "audio"])

    txt_msg = txt.submit(add_text, [chatbot, txt], [chatbot, txt], queue=False).then(
        bot, chatbot, chatbot
    )
    txt_msg.then(lambda: gr.update(interactive=True), None, [txt], queue=False)
    file_msg = btn.upload(add_file, [chatbot, btn], [chatbot], queue=False).then(
        bot, chatbot, chatbot
    )

demo.queue()
if __name__ == "__main__":
    demo.launch()

Screenshot

No response

Logs

No logs

System Info

Gradio Environment Information:
------------------------------
Operating System: Linux
gradio version: 3.41.2
gradio_client version: 0.5.0

------------------------------------------------
gradio dependencies in your environment:

aiofiles: 23.2.1
altair: 4.2.2
fastapi: 0.103.0
ffmpy: 0.3.1
gradio-client==0.5.0 is not installed.
httpx: 0.24.1
huggingface-hub: 0.16.4
importlib-resources: 6.0.1
jinja2: 3.1.2
markupsafe: 2.1.3
matplotlib: 3.7.1
numpy: 1.23.5
orjson: 3.9.5
packaging: 23.1
pandas: 1.5.3
pillow: 9.4.0
pydantic: 2.2.1
pydub: 0.25.1
python-multipart: 0.0.6
pyyaml: 6.0.1
requests: 2.31.0
semantic-version: 2.10.0
typing-extensions: 4.7.1
uvicorn: 0.23.2
websockets: 11.0.3
authlib; extra == 'oauth' is not installed.
itsdangerous; extra == 'oauth' is not installed.

gradio_client dependencies in your environment:

fsspec: 2023.6.0
httpx: 0.24.1
huggingface-hub: 0.16.4
packaging: 23.1
requests: 2.31.0
typing-extensions: 4.7.1
websockets: 11.0.3

Severity

I can work around it

abidlabs commented 1 year ago

Can confirm the issue! We'll look into it @redrocket8.

Here's a repro showing that the LaTeX renders in the fixed case but not in the streaming case:

import gradio as gr

string = "did you know $1+1=2$?"

def latex():
    for i in range(len(string)):
        yield [("gimme a fact", string[:i+1])]

with gr.Blocks() as demo:
    chatbot = gr.Chatbot([("fixed", string)], latex_delimiters=[{ "left": "$", "right": "$", "display": False }])
    chatbot = gr.Chatbot(latex_delimiters=[{ "left": "$", "right": "$", "display": False }])
    demo.load(latex, None, chatbot)

demo.queue().launch()

redrocket8 commented 1 year ago

Thanks

abidlabs commented 1 year ago

A similar problem happens with gr.Markdown(), but it seems even more severe: any time Markdown is returned from a function (streaming or not), LaTeX is not rendered. See:

import gradio as gr
import time

string = "did you know $1+1=2$?"

def fixed():
    return string

with gr.Blocks() as demo:
    chatbot1 = gr.Markdown(string, latex_delimiters=[{ "left": "$", "right": "$", "display": False }])
    chatbot2 = gr.Markdown(latex_delimiters=[{ "left": "$", "right": "$", "display": False }])
    demo.load(fixed, None, chatbot2)

demo.queue().launch()

redrocket8 commented 1 year ago

Although it is closed, I can't get it working with streaming. Can you give an example to run on Colab?

sbarman25 commented 1 year ago

Can you please push this change if it's fixed? An example showing that it works would be great too.

abidlabs commented 1 year ago

We'll do a release soon -- in the meantime, you can install directly from the PR branch by doing:

pip install https://gradio-builds.s3.amazonaws.com/be42a5ded17e725f3227f70fc28877c1dba76780/gradio-3.41.2-py3-none-any.whl

Please see my messages above for repros you can try to confirm that the issue has been fixed.
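
For anyone wanting a quick check after installing that wheel, here is a minimal sketch adapted from the streaming repro above (it assumes only that the fix is included in that build). Once the closing $ arrives, the streamed span should switch to rendered LaTeX:

import gradio as gr
import time

string = "did you know $1+1=2$?"

def stream():
    # Stream the reply one character at a time, as in the repro above.
    for i in range(len(string)):
        time.sleep(0.05)
        yield [("gimme a fact", string[:i + 1])]

with gr.Blocks() as demo:
    chatbot = gr.Chatbot(latex_delimiters=[{"left": "$", "right": "$", "display": False}])
    demo.load(stream, None, chatbot)

demo.queue().launch()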