gradio-app / gradio

Build and share delightful machine learning apps, all in Python. 🌟 Star to support our work!
http://www.gradio.app
Apache License 2.0
30.34k stars 2.26k forks

Successful use of gradio SSE with private hf space, but inference causes error client side #8266

Closed Thomas2419 closed 1 month ago

Thomas2419 commented 1 month ago

Describe the bug

I have a standard Hugging Face private Space built on the text-to-image Gradio template. I am able to successfully use the new SSE support and the recently introduced open_stream() method. Yet when I run inference against my /infer endpoint, my script returns the attached error. The Space appears to run inference in response to the request, so the issue seems to be client-side; no errors are thrown in the Space's logs.

Have you searched existing issues? 🔎

Reproduction


import { Client } from "@gradio/client";

async function main() {
    try {
        // Connect to a specific Gradio app
        const app = await Client.connect("insert private hf", { hf_token: "insert hftoken" });

        // Define the parameters for the inference
        const params = {
            prompt: " ",
            negative_prompt: " ",
            seed: 0,
            randomize_seed: true,
            width: 256,
            height: 256,
            guidance_scale: 1,
            num_inference_steps: 2,
        };

        // Start the stream
        await app.open_stream();

        // Set up event handlers for the stream
        app.stream.onmessage = (event) => {
            const data = JSON.parse(event.data);
            console.log("Received data during stream:", data);
            // Check if the data includes the results
            if (data.results) {
                console.log("Inference result:", data.results);
                // Handle the results, such as extracting the image URL
            }
        };

        app.stream.onerror = (error) => {
            console.error("Stream encountered an error:", error);
        };

        // Send a prediction request
        await app.predict("/infer", params);
        console.log("Prediction request sent.");

    } catch (error) {
        console.error("Error during setup or prediction:", error);
    }
}

main();

Screenshot

No response

Logs

Unexpected error Connection errored out.
Error during setup or prediction: {
  type: 'status',
  stage: 'error',
  message: 'Connection errored out. ',
  queue: true,
  endpoint: '/infer',
  fn_index: 1,
  time: 2024-05-12T00:03:39.862Z
}

Hugging Face Space logs after running the /infer endpoint:
  0%|          | 0/2 [00:00<?, ?it/s]
 50%|█████     | 1/2 [00:07<00:07, 7.69s/it]
100%|██████████| 2/2 [00:14<00:00, 7.05s/it]
100%|██████████| 2/2 [00:14<00:00, 7.14s/it]

System Info

Using a Hugging Face Space with the most up-to-date version of Gradio, as well as the most up-to-date version of the Gradio client on the local side. Otherwise a standard Windows 11 install, with plain JavaScript run via Node from a .js script.

Gradio HF Space environment requirements.txt file:

accelerate
diffusers
invisible_watermark
torch
transformers
xformers

README file contents:

---
title: art
emoji: 🖼
colorFrom: purple
colorTo: red
sdk: gradio
sdk_version: 4.31.0
app_file: app.py
pinned: false
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference

Severity

Blocking usage of gradio

abidlabs commented 1 month ago

Hi @Thomas2419 does the issue you are facing only happen if the Space is private?

Thomas2419 commented 1 month ago

Hello @abidlabs! Thank you for the response. Yes, in my testing, when the Space was public, /infer worked just fine. It was only once I transitioned it to private and ran /infer that the error occurred. Otherwise it successfully returns the images and all.

hannahblair commented 1 month ago

@Thomas2419 thanks for letting us know! I'll take a look into this.

Thomas2419 commented 1 month ago

@hannahblair In case this helps, a quick update: I was told my usage wasn't quite right, so I ran some updated tests that should hopefully reflect the intended usage.


import { Client } from "@gradio/client";

async function testStream() {
    try {
        const app = await Client.connect("space", { hf_token: "tokenW" });

        // Define parameters
        const prompt = " ";
        const negativePrompt = "";
        const seed = Math.floor(Math.random() * 100);
        const randomizeSeed = true;
        const width = 256;
        const height = 256;
        const guidanceScale = 1;
        const numInferenceSteps = 2;

        // Submit the request and handle responses
        const submission = app.submit("/infer", [
            prompt,
            negativePrompt,
            seed,
            randomizeSeed,
            width,
            height,
            guidanceScale,
            numInferenceSteps
        ]).on("data", (data) => {
            console.log("Received data:", data);
        }).on("status", (status) => {
            console.log("Current status:", status);
        });

    } catch (error) {
        console.error('Error during stream setup:', error);
    }
}

// Execute the function
testStream().catch(console.error);
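For scripting, it can help to wrap the event-based `submit()` usage above in a Promise, so the script can simply `await` the final result instead of only logging inside handlers. This is a sketch assuming the chained `.on("data")`/`.on("status")` registration shown above; `waitForResult` and the mock submission are hypothetical helpers for local testing, not part of @gradio/client:

```javascript
// Resolve with the first "data" payload, reject on an error status.
// Assumes an event-emitter-style object like the one returned by
// app.submit(), i.e. .on(event, handler) registration as above.
function waitForResult(submission) {
    return new Promise((resolve, reject) => {
        submission.on("data", (data) => resolve(data));
        submission.on("status", (status) => {
            if (status.stage === "error") {
                reject(new Error(status.message || "Connection errored out."));
            }
        });
    });
}

// Minimal mock submission (hypothetical, stands in for the real
// app.submit("/infer", [...]) return value) so this runs offline.
function mockSubmission(events) {
    const handlers = {};
    const sub = {
        on(event, handler) {
            handlers[event] = handler;
            return sub; // allow chaining like the real client
        },
    };
    // Fire recorded events asynchronously, after handlers are attached.
    setTimeout(() => {
        for (const [event, payload] of events) {
            if (handlers[event]) handlers[event](payload);
        }
    }, 0);
    return sub;
}

async function demo() {
    const ok = mockSubmission([
        ["status", { type: "status", stage: "pending" }],
        ["data", { type: "data", data: [{ orig_name: "image.webp" }] }],
    ]);
    const result = await waitForResult(ok);
    console.log(result.data[0].orig_name); // prints "image.webp"
}

demo();
```

With the real client, the mock would simply be replaced by the object returned from `app.submit("/infer", [...])`.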

And the output:

Current status: {
  type: 'status',
  stage: 'pending',
  queue: true,
  endpoint: '/infer',
  fn_index: 1,
  time: 2024-05-13T16:12:38.119Z
}
Unexpected error Connection errored out.
Current status: {
  type: 'status',
  stage: 'error',
  message: 'Connection errored out. ',
  queue: true,
  endpoint: '/infer',
  fn_index: 1,
  time: 2024-05-13T16:12:38.456Z
}

After swapping the Space to public, the output:


Current status: {
  type: 'status',
  stage: 'pending',
  queue: true,
  endpoint: '/infer',
  fn_index: 1,
  time: 2024-05-13T16:16:34.528Z
}
Current status: {
  type: 'status',
  endpoint: '/infer',
  fn_index: 1,
  time: 2024-05-13T16:16:34.799Z,
  queue: true,
  stage: 'pending',
  code: undefined,
  size: 1,
  position: 0,
  eta: 48.70225167274475,
  success: undefined
}
Current status: {
  type: 'status',
  endpoint: '/infer',
  fn_index: 1,
  time: 2024-05-13T16:16:34.800Z,
  queue: true,
  stage: 'pending',
  code: undefined,
  size: undefined,
  position: 0,
  success: undefined,
  eta: 48.70225167274475
}
Received data: {
  type: 'data',
  time: 2024-05-13T16:16:53.801Z,
  data: [
    {
      path: '/tmp/gradio/dcd8a6847a30ba2167f57ac6a0a0a5b4a61a12ec/image.webp',
      url: '{Removed for privacy}/file=/tmp/gradio/dcd8a6847a30ba2167f57ac6a0a0a5b4a61a12ec/image.webp',
      size: null,
      orig_name: 'image.webp',
      mime_type: null,
      is_stream: false,
      meta: [Object]
    }
  ],
  endpoint: '/infer',
  fn_index: 1
}
Current status: {
  type: 'status',
  time: 2024-05-13T16:16:53.806Z,
  queue: true,
  message: undefined,
  stage: 'complete',
  code: undefined,
  progress_data: undefined,
  endpoint: '/infer',
  fn_index: 1
}
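Given the `data` payload logged above, the generated file's URL can be pulled out with a small helper. A sketch assuming exactly the shape shown in the log; `extractImageUrl` and the example URL below are hypothetical, not part of the client API:

```javascript
// Pull the first file URL out of a Gradio "data" event payload,
// assuming the shape shown in the log above:
// { type: "data", data: [ { path, url, orig_name, ... } ], ... }
function extractImageUrl(payload) {
    if (payload.type !== "data" || !Array.isArray(payload.data)) {
        return null;
    }
    const file = payload.data[0];
    return file && file.url ? file.url : null;
}

// Example with a payload shaped like the logged one (URL is made up):
const payload = {
    type: "data",
    data: [
        {
            path: "/tmp/gradio/abc123/image.webp",
            url: "https://example.hf.space/file=/tmp/gradio/abc123/image.webp",
            orig_name: "image.webp",
        },
    ],
};
console.log(extractImageUrl(payload)); // prints the url of the first entry
```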
Thomas2419 commented 1 month ago

This has been fixed in the most recent Gradio release: @gradio/client@0.19.3 for JavaScript and the overall gradio 4.31.2 release. Thanks to @pngwn for fixing it up!
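To pick up the fix, the version bumps involved look roughly like this (a sketch assuming an npm-based client and the README front matter shown earlier):

```shell
# Client side: upgrade to the fixed JS client release
npm install @gradio/client@0.19.3

# Space side: bump the Gradio version in the README front matter
# to the fixed release, e.g.:
#   sdk_version: 4.31.2
```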