ZhengPeng7 / BiRefNet

[CAAI AIR'24] Bilateral Reference for High-Resolution Dichotomous Image Segmentation
https://www.birefnet.top
MIT License

Error when using .onnx on the web #78

Closed · tidus2102 closed this issue 1 month ago

tidus2102 commented 2 months ago

Hi, I tried to test the .onnx model on the web (CPU) using onnx-community/BiRefNet_T. Here is index.html:

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8" />
  <title>Background Removal</title>
</head>
<body>
<script type="module">
  import { AutoModel, AutoProcessor, RawImage } from 'https://cdn.jsdelivr.net/npm/@huggingface/transformers';

  // Load model and processor
  const model_id = 'onnx-community/BiRefNet_T';
  const model = await AutoModel.from_pretrained(model_id, { dtype: 'fp32' });
  const processor = await AutoProcessor.from_pretrained(model_id);

  // Load image from URL
  const url = 'https://images.pexels.com/photos/5965592/pexels-photo-5965592.jpeg?auto=compress&cs=tinysrgb&w=1024';
  const image = await RawImage.fromURL(url);

  // Pre-process image
  const { pixel_values } = await processor(image);

  // Predict alpha matte
  const { output_image } = await model({ input_image: pixel_values });
  console.log(output_image);
</script>
</body>
</html>

Checking Chrome dev tools > Application tab > Cache storage shows the .onnx file is loaded and cached without problems: [screenshot of Cache storage, 2024-08-28]

But I got an error at the predict line:

const { output_image } = await model({ input_image: pixel_values })

[screenshot of the error]

I tried other input images and got the same issue. Please help check it. Thank you.

ZhengPeng7 commented 2 months ago

Thanks for letting me know about this problem. I'm not familiar with the web app, but I will look into it tomorrow and reply to you then.

ZhengPeng7 commented 2 months ago

Have you tried following the example code in my HF Space demo? You can click Use via API at the bottom of the page and follow the instructions there (with an uploaded image or a URL to an online image):

[screenshots: the Use via API link at the bottom of the Space page and the generated API instructions]
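
If you prefer Python, the same Space can be called through gradio_client. Below is a minimal sketch; the Space id ZhengPeng7/BiRefNet_demo and the /image endpoint are assumptions here, so copy the exact names from the Use via API panel.

# Minimal sketch of calling the HF Space API with gradio_client.
# NOTE: the Space id and api_name are assumptions; take the exact values
# from the "Use via API" panel at the bottom of the Space page.
from gradio_client import Client, handle_file

client = Client("ZhengPeng7/BiRefNet_demo")  # assumed Space id
result = client.predict(
    image=handle_file("input.jpg"),  # local file or a URL to an online image
    api_name="/image",               # assumed endpoint name
)
print(result)  # path to the returned segmentation result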

tidus2102 commented 2 months ago

Thanks for your reply. I already tried your demo Space and it works. I'm only asking about the issue when using the .onnx file in JS (via transformers.js). I haven't tried the BiRefNet .onnx in Python, but I saw that you and someone else have tried it here.

As for .onnx in JS, I tested briaai/RMBG-1.4 and it works without any issues, so I suspect the BiRefNet .onnx file may have some problem. I haven't tried re-creating the .onnx file from the PyTorch weights locally (I'm using a MacBook M1).

ZhengPeng7 commented 2 months ago

Please use the official ONNX files (you can download them from the GitHub release or the GDrive folder given in the README). I'm not saying their ONNX files are wrong; I just don't know whether there is any problem with them.

ZhengPeng7 commented 2 months ago

Hi, Hung, I tried to look into this problem. I also saw that you opened a discussion in the ONNX community's HF repo. I really appreciate it, but I'm really not good at (even a noob at) web deployment (things like transformers.js); very sorry for that. If I'm not too busy with other work in the coming days, I'll try to learn about it and solve it step by step.

Could you send me your demo project, which I could build from scratch, so I can reproduce the error you encountered? Many thanks! If I solve it myself, you'll get notified here. Finally, let me add a need-help label to this issue to see if anyone else who is good at this can help.

tidus2102 commented 2 months ago

Hi, sorry for the late reply; I tried to find out more about this issue after getting your reply.

First, the .onnx file on onnx-community/BiRefNet_T is the same as your .onnx on the GitHub releases.

Second, here is my demo code: https://codesandbox.io/p/sandbox/birefnet-onnx-5c7vv8

NOTE:

Demo run: [screenshots of the demo running]

Check Chrome dev tools > Console tab to see each model's config after it loads. onnx-community/BiRefNet_T .onnx config:

{
    "model_type": "swin",
    "is_encoder_decoder": false,
    "normalized_config": {
        "model_type": "swin",
        "is_encoder_decoder": false
    }
}

briaai/RMBG-1.4 .onnx config:

{
    "model_type": "SegformerForSemanticSegmentation",
    "is_encoder_decoder": false,
    "_name_or_path": "briaai/RMBG-1.4",
    "architectures": [
        "BriaRMBG"
    ],
    "auto_map": {
        "AutoConfig": "MyConfig.RMBGConfig",
        "AutoModelForImageSegmentation": "briarmbg.BriaRMBG"
    },
    "custom_pipelines": {
        "image-segmentation": {
            "impl": "MyPipe.RMBGPipe",
            "pt": [
                "AutoModelForImageSegmentation"
            ],
            "tf": [],
            "type": "image"
        }
    },
    "in_ch": 3,
    "out_ch": 1,
    "torch_dtype": "float32",
    "transformers_version": "4.38.0.dev0",
    "normalized_config": {
        "model_type": "SegformerForSemanticSegmentation",
        "is_encoder_decoder": false
    }
}

Error when running predict with the BiRefNet .onnx: [error screenshot]

Check Chrome dev tools > Application tab > Cache storage > transformers-cache for the .onnx model files loaded from Hugging Face: [screenshot of transformers-cache]

Third, a summary: the BiRefNet .onnx only fails in JS, and there are significant differences between the two models' logged configs. Hope this helps. Thank you.
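
One quick way to confirm how the two .onnx files actually differ in expected input shape is to inspect their graph inputs with the onnx Python package (a small sketch; the file names are placeholders for the downloaded models):

import onnx

# Print each graph input's name and shape; a fixed dimension prints as a
# number, a dynamic dimension prints as its symbolic name.
for path in ["birefnet_t.onnx", "rmbg_1_4.onnx"]:  # placeholder file names
    m = onnx.load(path)
    for inp in m.graph.input:
        dims = [d.dim_param or d.dim_value for d in inp.type.tensor_type.shape.dim]
        print(path, inp.name, dims)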

xenova commented 1 month ago

Hi there 👋 The model (with the example code) runs fine in Node.js, so if I had to guess, it's an out-of-memory issue caused by browser resource constraints. I think there are two fixes we can consider (a rough export sketch follows the list):

  1. Export at a lower input shape (instead of 1024x1024)
  2. Export with dynamic shapes
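
A rough sketch of what such an export could look like is below. To hedge: the models.birefnet import path and constructor arguments follow my reading of this repo, the checkpoint file name is illustrative, and the input/output names mirror the ones the transformers.js demo above uses; the Swin backbone may also refuse fully dynamic spatial shapes.

import torch
from models.birefnet import BiRefNet  # assumed repo-local import path

model = BiRefNet(bb_pretrained=False)  # constructor args are assumptions
model.load_state_dict(torch.load("BiRefNet_T.pth", map_location="cpu"))  # illustrative file name
model.eval()

# Option 1: export at a lower static resolution, e.g. 512x512 instead of 1024x1024.
dummy = torch.randn(1, 3, 512, 512)
torch.onnx.export(
    model, dummy, "birefnet_512.onnx",
    input_names=["input_image"], output_names=["output_image"],
    opset_version=17,
)

# Option 2: export with dynamic batch/height/width so the runtime picks the size.
torch.onnx.export(
    model, dummy, "birefnet_dynamic.onnx",
    input_names=["input_image"], output_names=["output_image"],
    dynamic_axes={
        "input_image": {0: "batch", 2: "height", 3: "width"},
        "output_image": {0: "batch", 2: "height", 3: "width"},
    },
    opset_version=17,
)

Keep in mind that 1024x1024 is the training resolution, so a lower static shape trades segmentation quality for memory.
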
ZhengPeng7 commented 1 month ago

Thanks for your kind explanation, @xenova! BTW, the default and best resolution for most cases is 1024x1024, since that's the resolution used in training. Hi, @tidus2102, does this solve your problem?

tidus2102 commented 1 month ago

@xenova @ZhengPeng7 Thanks for your support. How do I export at a lower input shape (instead of 1024x1024)? Can you give me some sample code?

xenova commented 1 month ago

FYI - the ORT team is looking into this: https://github.com/microsoft/onnxruntime/issues/21968

ZhengPeng7 commented 1 month ago

Hi, @xenova. Could you help me with another thing? I renamed the previous BiRefNet_T to BiRefNet_lite at https://huggingface.co/ZhengPeng7/BiRefNet_lite so that users can understand the meaning more easily. Could you also rename it in transformers.js and https://huggingface.co/onnx-community/BiRefNet_T? Many thanks!!

xenova commented 1 month ago


Updated 👍 the new link is now: https://huggingface.co/onnx-community/BiRefNet_lite

ZhengPeng7 commented 1 month ago

Hi, @tidus2102, if you still face problems there, you can also try this great project by @adha9990 to run BiRefNet-ONNX with JavaScript and Node.js.

tidus2102 commented 1 month ago

Hi @ZhengPeng7, please check this comment to see if there is any solution for the .onnx memory issue (on the web). Thank you.

ZhengPeng7 commented 1 month ago

Hi, @tidus2102. Thanks for letting me know about this problem, but I'm really not very familiar with web applications. Have you tried the birefnet-onnx-sample I mentioned above? If you run into specific problems there, I can try to contact its original author.

tidus2102 commented 1 month ago

Hi @ZhengPeng7, I already tried birefnet-onnx-sample; the .onnx works on Node without any problem. I will keep following up on the web issue in microsoft/onnxruntime#21968. Thank you.

ZhengPeng7 commented 1 month ago

Glad to see that! And thanks to @adha9990's nice work on that.

juntaosun commented 2 weeks ago

@xenova Could you release the onnx model of BiRefNet_lite-2K? https://huggingface.co/ZhengPeng7/BiRefNet_lite-2K