gradio-app / gradio

Build and share delightful machine learning apps, all in Python. 🌟 Star to support our work!
http://www.gradio.app
Apache License 2.0

Gradio State object does not work / reset when using `gradio app.py` to watch for changes #6214

Closed Rjdrenth closed 6 months ago

Rjdrenth commented 11 months ago

Describe the bug

When using a Gradio State object in an app (e.g. the hangman example) in conjunction with `gradio hangman.py`, the State object does not properly reset when a change is detected in any of the watched files and an auto-reload is triggered. The next time the state is used, the app raises a KeyError.

Hangman source: As found at https://www.gradio.app/guides/state-in-blocks

Auto reload method: As described in https://www.gradio.app/guides/developing-faster-with-reload-mode


Reproduction

Add the following code from the hangman example to a file hangman.py:

import gradio as gr

secret_word = "gradio"

with gr.Blocks() as demo:    
    used_letters_var = gr.State([])
    with gr.Row() as row:
        with gr.Column():
            input_letter = gr.Textbox(label="Enter letter")
            btn = gr.Button("Guess Letter")
        with gr.Column():
            hangman = gr.Textbox(
                label="Hangman",
                value="_"*len(secret_word)
            )
            used_letters_box = gr.Textbox(label="Used Letters")

    def guess_letter(letter, used_letters):
        used_letters.append(letter)
        answer = "".join([
            (letter if letter in used_letters else "_")
            for letter in secret_word
        ])
        return {
            used_letters_var: used_letters,
            used_letters_box: ", ".join(used_letters),
            hangman: answer
        }
    btn.click(
        guess_letter,
        [input_letter, used_letters_var],
        [used_letters_var, used_letters_box, hangman],
    )
demo.launch()

Then:

  1. Run `gradio hangman.py`, open the app, and type some letters; everything works fine.
  2. Change anything in the code, e.g. "Guess Letter" -> "Take a guess", so that an auto-reload is triggered.
  3. Try guessing a letter and observe the error.
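For context, here is a hypothetical, minimal model of what seems to go wrong (the names `Component`, `build_app`, and the id scheme are illustrative assumptions, not Gradio's actual implementation): blocks are registered in a dict keyed by an auto-incrementing integer id, per-session state remembers those ids, and a reload rebuilds the dict with fresh ids, so the remembered key no longer exists.

```python
# Hypothetical, simplified model of the failure; `Component`, `build_app`,
# and the id scheme are illustrative assumptions, not Gradio's real code.
import itertools

_ids = itertools.count()

class Component:
    """Stand-in for a Gradio block: gets a fresh integer id on creation."""
    def __init__(self, name):
        self._id = next(_ids)
        self.name = name

def build_app():
    """Stands in for executing hangman.py: every (re)build makes new ids."""
    components = [Component("used_letters_var"), Component("hangman")]
    return {c._id: c for c in components}

blocks = build_app()              # first launch
session_key = next(iter(blocks))  # the session remembers a component id

blocks = build_app()              # auto-reload rebuilds the Blocks: new ids

try:
    blocks[session_key]           # stale id: same shape as the logs below
except KeyError as err:
    print(f"KeyError: {err}")
```

Under this model, the session would only work again once its stored ids are refreshed or the state is cleared, which matches the observed behavior.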

Screenshot

Irrelevant

Logs

`gradio hangman.py`
Watching: '<python_path>\Anaconda3\lib\site-packages\gradio', '<file_path>scripts\personal', '<file_path>scripts\personal'

Running on local URL:  http://127.0.0.1:7860

To create a public link, set `share=True` in `launch()`.
Changes detected in: <file_path>scripts\personal\hangman.py
Traceback (most recent call last):
  File "<python_path>\Anaconda3\lib\site-packages\gradio\queueing.py", line 427, in call_prediction
    output = await route_utils.call_process_api(
  File "<python_path>\Anaconda3\lib\site-packages\gradio\route_utils.py", line 232, in call_process_api
    output = await app.get_blocks().process_api(
  File "<python_path>\Anaconda3\lib\site-packages\gradio\blocks.py", line 1495, in process_api
    inputs = self.preprocess_data(fn_index, inputs, state)
  File "<python_path>\Anaconda3\lib\site-packages\gradio\blocks.py", line 1266, in preprocess_data
    processed_input.append(state[input_id])
  File "<python_path>\Anaconda3\lib\site-packages\gradio\state_holder.py", line 46, in __getitem__
    block = self.blocks.blocks[key]
KeyError: 23

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<python_path>\Anaconda3\lib\site-packages\gradio\queueing.py", line 472, in process_events
    response = await self.call_prediction(awake_events, batch)
  File "<python_path>\Anaconda3\lib\site-packages\gradio\queueing.py", line 436, in call_prediction
    raise Exception(str(error) if show_error else None) from error
Exception: 23
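As an aside, the terse final line (`Exception: 23`) follows from how the queue re-raises the error: per the traceback, `queueing.py` wraps it as `Exception(str(error))`, and `str()` of a `KeyError` is just the repr of the missing key. A quick check:

```python
# str() of a KeyError yields only the missing key's repr, which is why the
# re-raised `Exception(str(error))` in queueing.py surfaces as "Exception: 23".
try:
    {}[23]
except KeyError as error:
    wrapped = Exception(str(error))

print(wrapped)  # prints: 23
```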

System Info

gradio environment
Gradio Environment Information:
------------------------------
Operating System: Windows
gradio version: 4.0.2
gradio_client version: 0.7.0

------------------------------------------------
gradio dependencies in your environment:

aiofiles: 23.2.1
altair: 5.1.2
fastapi: 0.104.1
ffmpy: 0.3.1
gradio-client==0.7.0 is not installed.
httpx: 0.25.0
huggingface-hub: 0.18.0
importlib-resources: 6.1.0
jinja2: 3.1.2
markupsafe: 2.1.1
matplotlib: 3.7.0
numpy: 1.23.5
orjson: 3.9.10
packaging: 22.0
pandas: 1.5.3
pillow: 9.4.0
pydantic: 2.4.2
pydub: 0.25.1
python-multipart: 0.0.6
pyyaml: 6.0
requests: 2.28.1
semantic-version: 2.10.0
tomlkit==0.12.0 is not installed.
typer: 0.9.0
typing-extensions: 4.8.0
uvicorn: 0.23.2
websockets: 11.0.3
authlib; extra == 'oauth' is not installed.
itsdangerous; extra == 'oauth' is not installed.

gradio_client dependencies in your environment:

fsspec: 2023.10.0
httpx: 0.25.0
huggingface-hub: 0.18.0
packaging: 22.0
requests: 2.28.1
typing-extensions: 4.8.0
websockets: 11.0.3

Severity

I can work around it

davidADSP commented 10 months ago

Also seeing this bug; it means that `ChatInterface` currently cannot be used with reload functionality. I can't see any workaround other than not using auto-reload.

abidlabs commented 8 months ago

Confirmed that this is still an issue in gradio==4.14.0

miguelwon commented 6 months ago

Any update on this? I'm also having this issue and because of that can't use auto-reload.

abidlabs commented 6 months ago

No update yet. We haven't had the bandwidth to take a look at this; we'll try to fix it when we can, but it may take some time. cc @freddyaboulton for visibility

freddyaboulton commented 6 months ago

Sorry for the delay all! This will be fixed as part of #7684

Rjdrenth commented 6 months ago

Great, looking forward to it!