Thanks for letting me know about this problem. I'm not familiar with the web app, but I will look into it tomorrow and reply to you then.
Have you tried following the example code in my HF Space demo? You can click Use via API at the bottom of the page and follow the instructions there (with an uploaded image or a URL to an online image):
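(For reference, the Use via API page generates exact snippets for the Space; a rough JavaScript sketch with @gradio/client is below. The Space id and endpoint name here are assumptions, so copy the real values from that page.)

```js
import { Client, handle_file } from "@gradio/client";

// Space id and endpoint name are assumptions -- take the exact ones
// from the "Use via API" page at the bottom of the demo Space.
const client = await Client.connect("ZhengPeng7/BiRefNet_demo");
const result = await client.predict("/predict", {
  image: handle_file("https://example.com/input.jpg"), // URL or local upload
});
console.log(result.data); // matted image(s) as returned by the Space
```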
Thanks for your reply. I already tried your demo Spaces and they work. I'm only asking about the issue when using the .onnx file in JS (using transformers.js). I haven't tried the BiRefNet .onnx in Python, but I saw that you and someone else have tried it here.
About .onnx in JS: I tested briaai/RMBG-1.4 and it works without any issues, so I'm not sure whether the BiRefNet .onnx file has some problem. I haven't tried recreating the .onnx file from the PyTorch weights on my local machine (a MacBook M1).
Please use the official ONNX files (you can download them from the GitHub release or the GDrive folder given in the README). I'm not saying the other ONNX files are wrong; I just don't know whether there is any problem with them.
Hi, Hung, I tried to look into this problem. I also saw that you raised a discussion in the onnx-community HF repo. I really appreciate it, but I'm really not good at (even a noob at) web deployment (things like transformers.js). Very sorry for that. If I'm not too busy with other work in the coming days, I'll try to learn about it and solve it step by step.
Could you send me your demo project, which I could build from scratch, so I can reproduce the error you encounter? Many thanks! If I solve it myself, you'll get notified here. Finally, let me add a help-wanted badge to this issue to see if anyone else who is good at this can help.
Hi, sorry for the late reply. I tried to find out more about this issue after getting your reply.
First, the .onnx file on onnx-community/BiRefNet_T is the same as your .onnx in the GitHub releases.
Second, here is my demo code: https://codesandbox.io/p/sandbox/birefnet-onnx-5c7vv8
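(For readers who don't open the sandbox: the relevant part reduces to roughly the following, assuming the standard transformers.js loading flow; on v2 the package name may be @xenova/transformers instead.)

```js
import { AutoModel } from "@huggingface/transformers";

// Load both models and print their configs for comparison
// (these are the two JSON dumps shown below).
const birefnet = await AutoModel.from_pretrained("onnx-community/BiRefNet_T");
console.log(JSON.stringify(birefnet.config, null, 2));

const rmbg = await AutoModel.from_pretrained("briaai/RMBG-1.4");
console.log(JSON.stringify(rmbg.config, null, 2));
```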
Demo run: check Chrome dev tools > Console tab to see the model config (for both models) after they load.
onnx-community/BiRefNet_T .onnx config:
{
  "model_type": "swin",
  "is_encoder_decoder": false,
  "normalized_config": {
    "model_type": "swin",
    "is_encoder_decoder": false
  }
}
briaai/RMBG-1.4 .onnx config:
{
  "model_type": "SegformerForSemanticSegmentation",
  "is_encoder_decoder": false,
  "_name_or_path": "briaai/RMBG-1.4",
  "architectures": [
    "BriaRMBG"
  ],
  "auto_map": {
    "AutoConfig": "MyConfig.RMBGConfig",
    "AutoModelForImageSegmentation": "briarmbg.BriaRMBG"
  },
  "custom_pipelines": {
    "image-segmentation": {
      "impl": "MyPipe.RMBGPipe",
      "pt": [
        "AutoModelForImageSegmentation"
      ],
      "tf": [],
      "type": "image"
    }
  },
  "in_ch": 3,
  "out_ch": 1,
  "torch_dtype": "float32",
  "transformers_version": "4.38.0.dev0",
  "normalized_config": {
    "model_type": "SegformerForSemanticSegmentation",
    "is_encoder_decoder": false
  }
}
Error when running predict with the BiRefNet .onnx.
Check Chrome dev tools > Application tab > Cache storage > transformers-cache for the .onnx model files loaded from Hugging Face.
Third, to summarize: the BiRefNet .onnx only errors in JS, and I see significant differences when logging the two models' configs. Hope this helps. Thank you.
Hi there 👋 The model (with the example code) runs fine in Node.js, so if I had to guess, it's an out-of-memory issue due to browser resource constraints. I think there are two fixes we can consider:
Thanks for your kind explanation, @xenova! BTW, the default and best resolution for most cases is 1024x1024, since that's the resolution used in training. Hi, @tidus2102, does this solve your problem?
@xenova @ZhengPeng7 Thanks for your support. How can I export at a lower input shape (instead of 1024x1024)? Can you give me some sample code?
FYI - the ORT team is looking into this: https://github.com/microsoft/onnxruntime/issues/21968
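(In the meantime, one memory-reducing knob worth trying, not necessarily one of the fixes alluded to above: transformers.js v3 exposes a dtype option that loads quantized weights, which shrinks the buffers the browser must allocate. A sketch, assuming the repo ships a q8 export:)

```js
import { AutoModel } from "@huggingface/transformers";

// q8 weights need roughly a quarter of the memory of fp32
// (assuming the repo provides a quantized .onnx file).
const model = await AutoModel.from_pretrained("onnx-community/BiRefNet_T", {
  dtype: "q8",
});
```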
Hi, @xenova. Could you help me with another thing? I renamed the previous BiRefNet_T to BiRefNet_lite, as in https://huggingface.co/ZhengPeng7/BiRefNet_lite, so that users understand the meaning more easily. Could you also rename it in transformers.js and https://huggingface.co/onnx-community/BiRefNet_T? Many thanks!!
Updated 👍 the new link is now: https://huggingface.co/onnx-community/BiRefNet_lite
Hi, @tidus2102, if you still face some problems there, you can also try this great project to run BiRefNet-ONNX with JavaScript and Node.js, made by @adha9990.
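(That project's own code is the reference; the core Node.js pattern with onnxruntime-node is roughly the following. The model filename, tensor names, and the 1024x1024 shape are assumptions based on the default export.)

```js
const ort = require("onnxruntime-node");

async function predictMask(float32Pixels) {
  // float32Pixels: normalized CHW pixel data of length 3 * 1024 * 1024.
  const session = await ort.InferenceSession.create("BiRefNet_lite.onnx");
  // Tensor names depend on the export; inspect them at runtime:
  console.log(session.inputNames, session.outputNames);
  const input = new ort.Tensor("float32", float32Pixels, [1, 3, 1024, 1024]);
  const results = await session.run({ [session.inputNames[0]]: input });
  return results[session.outputNames[0]]; // predicted mask logits
}
```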
Hi @ZhengPeng7, please check this comment to see if there is any solution for the .onnx memory issue (on the web). Thank you.
Hi, @tidus2102. Thanks for letting me know about this problem, but I'm really not very familiar with web applications. Have you tried the birefnet-onnx-sample I mentioned above? If you run into specific problems there, I can try to contact its original author.
Hi @ZhengPeng7, I already tried birefnet-onnx-sample; the .onnx works in Node without any problem. I will keep following up on the web issue in microsoft/onnxruntime#21968. Thank you.
Glad to see that! And thanks to @adha9990's nice work on that.
@xenova Could you release the onnx model of BiRefNet_lite-2K? https://huggingface.co/ZhengPeng7/BiRefNet_lite-2K
Hi, I tried to test the .onnx on the web (CPU) using onnx-community/BiRefNet_T. Here is index.html:
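(The original file isn't reproduced in the thread; a minimal equivalent, assuming the transformers.js CDN build, would look like the sketch below. The input tensor name is an assumption.)

```html
<!DOCTYPE html>
<html>
<body>
<script type="module">
  // CDN/package version is an assumption; adjust to the one actually used.
  import { AutoModel, AutoProcessor, RawImage } from
    "https://cdn.jsdelivr.net/npm/@huggingface/transformers";

  const modelId = "onnx-community/BiRefNet_T";
  const model = await AutoModel.from_pretrained(modelId);
  const processor = await AutoProcessor.from_pretrained(modelId);
  console.log(model.config); // the config shown in the Console tab

  const image = await RawImage.fromURL("https://example.com/input.jpg");
  const { pixel_values } = await processor(image);
  // "input_image" is an assumption; this predict call is where the error occurs.
  const output = await model({ input_image: pixel_values });
  console.log(output);
</script>
</body>
</html>
```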
Check Chrome dev tools > Application tab > Cache storage; the .onnx file is loaded & cached without problems:
But I got an error at the predict line:
I tried other input images and got the same issue. Please help check it. Thank you.