Hi there!
Thanks @john-rocky for sharing your work, this repo is really amazing 🙂
I was trying to convert U2Net to CoreML before stumbling on your blog post. You patch the model afterwards to add the `* 255` conversion and set the output to a grayscale image, whereas in my case I add the scaling in PyTorch before the conversion and use the scaled tensor as the output for `ct.convert`.
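To make the idea concrete, here is a minimal sketch of the scale-inside-the-model approach; the module name, output indexing, and shapes are my assumptions for illustration, not code from the original post:

```python
import torch

# Hypothetical wrapper (names are illustrative): U2Net returns several side
# outputs; we keep the fused map and scale it to [0, 255] inside the graph,
# so the converted model can expose a grayscale image output directly.
class ScaledU2Net(torch.nn.Module):
    def __init__(self, u2net):
        super().__init__()
        self.u2net = u2net

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        d0 = self.u2net(x)[0]   # fused side output, assumed to lie in [0, 1]
        return d0 * 255.0       # scale before conversion instead of patching afterwards

# The traced wrapper is then what goes into coremltools, e.g. (sketch only):
#   traced = torch.jit.trace(ScaledU2Net(u2net).eval(), torch.rand(1, 3, 320, 320))
#   mlmodel = ct.convert(
#       traced,
#       inputs=[ct.ImageType(name="input", shape=(1, 3, 320, 320), scale=1 / 255.0)],
#       outputs=[ct.ImageType(name="mask", color_layout=ct.colorlayout.GRAYSCALE)],
#   )
```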
In both of our models I see some inconsistencies between what I get directly from PyTorch and what I get from the CoreML model. Here's what I use to forward the inputs:
Did you observe similar discrepancies? If not, do you have any idea what I am doing wrong? 👀
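In case it helps with the comparison, a minimal sketch of how the discrepancy can be quantified; the arrays here are synthetic stand-ins, not the actual model outputs, and one possible source of small differences is that a grayscale image output is quantized to 8 bits:

```python
import numpy as np

def max_abs_diff(a: np.ndarray, b: np.ndarray) -> float:
    """Largest absolute per-pixel difference between two saliency maps."""
    return float(np.max(np.abs(a.astype(np.float64) - b.astype(np.float64))))

# Synthetic stand-ins for the two outputs, already scaled to [0, 255].
torch_out = np.linspace(0.0, 255.0, 320 * 320, dtype=np.float32).reshape(320, 320)
coreml_out = np.round(torch_out).astype(np.uint8)  # grayscale image outputs are 8-bit

# Rounding to uint8 alone bounds the difference by ~0.5; anything much larger
# points at a real preprocessing or conversion mismatch.
print(max_abs_diff(torch_out, coreml_out))
```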
**CoreML output (yours):**

**PyTorch output (official repo):**