zylon-ai / private-gpt

Interact with your documents using the power of GPT, 100% privately, no data leaks
https://docs.privategpt.dev
Apache License 2.0

Is there an easy way to allow the Gradio UI to stretch to fill the size of the web browser instead of only being 40% total height? #1377

Closed Sionwage closed 6 months ago

Sionwage commented 7 months ago

I've been trying to figure out where the Gradio UI is defined in the privateGPT source, so that the last row (the two columns holding the Mode selector and the LLM chat box) can stretch to fill the entire webpage.

Any suggestions on where to look in the privateGPT code?

therohitdas commented 7 months ago

Gradio Code is defined here: https://github.com/imartinez/privateGPT/blob/a3ed14c58f77351dbd5f8f2d7868d1642a44f017/private_gpt/ui/ui.py#L174-L237

I wanted to increase the height to fill the whole screen too. I asked how to do it on the Hugging Face Discord: https://discord.com/channels/879548962464493619/1025174734427656283/threads/1182972600343859300

I tried the code from here, but it did not help me: https://github.com/gradio-app/gradio/issues/4001

    def _build_ui_blocks(self) -> gr.Blocks:
        logger.debug("Creating the UI blocks")
        with gr.Blocks(
            title=UI_TAB_TITLE,
            theme=gr.themes.Soft(primary_hue=slate),
            css=".logo { "
            "display:flex;"
            "background-color: #C7BAFF;"
            "height: 80px;"
            "border-radius: 8px;"
            "align-content: center;"
            "justify-content: center;"
            "align-items: center;"
            "}"
            ".logo img { height: 25% }"
            ".contain { display: flex; flex-direction: column; }"
            "#component-0, #component-3, #component-8 { height: 100% !important; }"
            "#chatbot { flex-grow: 1; overflow: auto;}",
        ) as blocks:
            with gr.Row():
                gr.HTML(f"<div class='logo'/><img src={logo_svg} alt=PrivateGPT></div")

            with gr.Row():
                with gr.Column(scale=3, variant="compact"):
                    mode = gr.Radio(
                        ["Query Docs", "Search in Docs", "LLM Chat"],
                        label="Mode",
                        value="Query Docs",
                    )
                    upload_button = gr.components.UploadButton(
                        "Upload File(s)",
                        type="filepath",
                        file_count="multiple",
                        size="sm",
                    )
                    ingested_dataset = gr.List(
                        self._list_ingested_files,
                        headers=["File name"],
                        label="Ingested Files",
                        interactive=False,
                        render=False,  # Rendered under the button
                    )
                    upload_button.upload(
                        self._upload_file,
                        inputs=upload_button,
                        outputs=ingested_dataset,
                    )
                    ingested_dataset.change(
                        self._list_ingested_files,
                        outputs=ingested_dataset,
                    )
                    ingested_dataset.render()
                with gr.Column(scale=7):
                    _ = gr.ChatInterface(
                        self._chat,
                        chatbot=gr.Chatbot(
                            label=f"LLM: {settings().llm.mode}",
                            show_copy_button=True,
                            render=False,
                            elem_id="chatbot",
                            avatar_images=(
                                None,
                                AVATAR_BOT,
                            ),
                        ),
                        additional_inputs=[mode, upload_button],
                    )
        return blocks

I will add the answer here when I get one.

Sionwage commented 6 months ago

I figured it out!

On the line where it says chatbot=gr.Chatbot( you can add height=800 to set that box to be 800 pixels tall.

with gr.Column(scale=7):
    _ = gr.ChatInterface(
        self._chat,
        chatbot=gr.Chatbot(
            height=800,
            label=f"LLM: {settings().llm.mode}",
            show_copy_button=True,
            render=False,
            elem_id="chatbot",
            avatar_images=(
                None,
                AVATAR_BOT,
            ),
        ),
        additional_inputs=[mode, upload_button],
    )

I'm still trying to figure out how to make this resizable, or how to set the height as a percentage instead.
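
One untested idea for a percentage-style height is a viewport-relative CSS override. This is only a minimal standalone sketch, not privateGPT's actual code: the 60vh value and the #chatbot elem_id are illustrative assumptions, and depending on the Gradio version the chatbot's inner wrapper may need its own override as well.

import gradio as gr

# Sketch: size the chatbot relative to the viewport instead of a fixed pixel
# height. 60vh = 60% of the browser window height; adjust to taste.
css = "#chatbot { height: 60vh !important; }"

with gr.Blocks(css=css) as demo:
    gr.Chatbot(elem_id="chatbot")

demo.launch()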

yvrjsharma commented 6 months ago

I made some small adjustments to your code (the CSS part), @therohitdas, so that the Gradio chatbot in the PrivateGPT layout fills the entire screen. I responded on the Gradio Discord server, but I thought it would be more discoverable for the community in the future, so I'm reposting it here as well.

import gradio as gr


def chat(msg, history, mode, upload_button):
    # gr.ChatInterface calls fn with (message, history, *additional_inputs).
    return "ok"


def list_files(*_args):
    # Stub standing in for PrivateGPT's ingested-files listing in this demo.
    return []


with gr.Blocks(
    title="test",
    css=".contain { display: flex !important; flex-direction: column !important; }"
    "#component-0, #component-3, #component-10, #component-8 { height: 100% !important; }"
    "#chatbot { flex-grow: 1 !important; overflow: auto !important; }"
    "#col { height: 100vh !important; }",
) as blocks:
    with gr.Row():
        gr.HTML("hello")

    with gr.Row(equal_height=False):
        with gr.Column(scale=3):
            mode = gr.Radio(
                ["Query Docs", "Search in Docs", "LLM Chat"],
                label="Mode",
                value="Query Docs",
            )
            upload_button = gr.components.UploadButton(
                "Upload File(s)",
                type="filepath",
                file_count="multiple",
                size="sm",
            )
            ingested_dataset = gr.List(
                headers=["File name"],
                label="Ingested Files",
                interactive=False,
                render=False,  # Rendered under the button
            )
            upload_button.upload(
                list_files,
                inputs=upload_button,
                outputs=ingested_dataset,
            )
            ingested_dataset.change(
                list_files,
                outputs=ingested_dataset,
            )
            ingested_dataset.render()
        with gr.Column(scale=7, elem_id="col"):
            _ = gr.ChatInterface(
                chat,
                chatbot=gr.Chatbot(
                    show_copy_button=True,
                    render=False,
                    elem_id="chatbot",
                ),
                additional_inputs=[mode, upload_button],
            )

blocks.launch()

Sionwage commented 6 months ago

Much better than my kludge!

therohitdas commented 6 months ago

Thank you very much @yvrjsharma, I added your code and it works ✨. Also, thanks for being thoughtful and sharing the code here too. I made a minor change to dynamically calculate the best height for the col:

"#col { height: calc(100vh - 112px - 16px) !important; }"

The output is more responsive this way: [screenshot attached, 2023-12-13]
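
For reference, a trimmed-down sketch of where that rule plugs in: it simply replaces the #col rule in the css string passed to gr.Blocks in the example above. The 112px header height and 16px gap are assumptions and may need tweaking for other layouts.

import gradio as gr

# Same CSS as the full-screen example above, with only the #col rule swapped
# for the calc() version so the column stops just short of the viewport edge.
css = (
    ".contain { display: flex !important; flex-direction: column !important; }"
    "#component-0, #component-3, #component-10, #component-8 { height: 100% !important; }"
    "#chatbot { flex-grow: 1 !important; overflow: auto !important; }"
    "#col { height: calc(100vh - 112px - 16px) !important; }"
)

with gr.Blocks(title="test", css=css) as blocks:
    with gr.Row():
        gr.HTML("hello")
    with gr.Row(equal_height=False):
        with gr.Column(scale=7, elem_id="col"):
            gr.ChatInterface(
                lambda msg, history: "ok",  # dummy chat handler
                chatbot=gr.Chatbot(render=False, elem_id="chatbot"),
            )

blocks.launch()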

codereyinish commented 1 month ago

Hi, is this community still active? I have a question. I also created a chatbot; the problem is that I have to reinstall transformers, Gradio, and PyTorch again. Here is my repo, and you can go through the problem in the README.md file: https://github.com/codereyinish/AICHATBOT. Thank you so much.