ENOT-AutoDL / onnx2torch

Convert ONNX models to PyTorch.
Apache License 2.0
595 stars 69 forks source link

SAME_UPPER autopad method not implemented. #144

Open SurenderHarsha opened 1 year ago

SurenderHarsha commented 1 year ago

Hi, I have a custom UNet in ONNX, and when I try to convert the model to PyTorch I see that the method mentioned above is not implemented. Would it be difficult to implement, or is there an alternative solution?

JohnMasoner commented 1 year ago

Hey @SurenderHarsha

Could you provide an ONNX model that reproduces your issue?

Maybe you could try the UNet from the MONAI repo.

SurenderHarsha commented 1 year ago

Hi, I cannot provide the UNet, but I can tell you that each convolutional layer has an auto_pad attribute set to SAME_UPPER. The network was a TensorFlow 1 model that I converted to ONNX using the tf2onnx package, and I am now trying to convert it to PyTorch while making sure the outputs of the two networks do not differ. The network is a custom production model. One more point, in case it helps: the original TensorFlow 1.13 model uses 'same' padding on its convolutional layers.

SurenderHarsha commented 1 year ago

I do see in the padding.py code that auto_pad SAME_UPPER raises a NotImplementedError.
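For reference, the ONNX operator spec defines the SAME_* auto_pad modes so that the output spatial size is ceil(input / stride); the total padding is whatever makes that size possible, with SAME_UPPER placing the extra pixel (when the total is odd) at the end of the axis. A minimal sketch of that arithmetic, with a function name of my own choosing (this is not onnx2torch's API):

```python
import math

def onnx_same_pads(in_size, kernel, stride=1, dilation=1, upper=True):
    """(pad_begin, pad_end) for one spatial axis under ONNX SAME_UPPER/SAME_LOWER.

    Per the ONNX Conv spec, the output size is ceil(in_size / stride) and the
    total padding is whatever achieves that. When the total is odd, SAME_UPPER
    puts the extra pixel at the end of the axis, SAME_LOWER at the beginning.
    """
    effective_kernel = dilation * (kernel - 1) + 1
    out_size = math.ceil(in_size / stride)
    total = max((out_size - 1) * stride + effective_kernel - in_size, 0)
    begin, end = total // 2, total - total // 2
    return (begin, end) if upper else (end, begin)
```

For example, a 3x3 kernel with stride 1 on a length-5 axis needs (1, 1), while a 2x2 kernel with stride 2 needs (0, 1) under SAME_UPPER and (1, 0) under SAME_LOWER.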

JohnMasoner commented 1 year ago

Hey @SurenderHarsha ,

This problem occurs because SAME_UPPER is not implemented.

I need to take the time to reproduce this. (Maybe

SurenderHarsha commented 1 year ago

It seems like you stopped mid-sentence.

SurenderHarsha commented 1 year ago

We made our own custom implementation for auto_pad SAME_UPPER, but it is not generalizable, so I'm closing this issue for now.

LamOne1 commented 1 year ago

@SurenderHarsha could you please share the code? I'll try to reimplement it for my model.

Ralfons-06 commented 3 months ago

I ran into the same problem. Since I assume SAME_UPPER pads the input the same way as 'same' padding, I could fix the issue by changing the following snippet of the utils.padding.onnx_auto_pad_to_torch_padding() function from:

if auto_pad == 'VALID':
    return 0, None

if auto_pad in ('SAME_UPPER', 'SAME_LOWER'):
    raise NotImplementedError(f'"{auto_pad}" auto_pad is not implemented')

to:

if auto_pad == 'VALID':
    return 0, None

if auto_pad == 'SAME_UPPER':
    return "same", None

if auto_pad == 'SAME_LOWER':
    raise NotImplementedError(f'"{auto_pad}" auto_pad is not implemented')

I tested the converted model and it runs perfectly fine.
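One caveat worth noting: PyTorch's padding='same' is not supported for strided convolutions, so this fix should only work when every converted conv has stride 1 (when it does apply, PyTorch places the extra pad at the end of the axis, which matches SAME_UPPER). For strided convs, a more general approach is to compute explicit pads per the ONNX spec and apply them with torch.nn.functional.pad, which expects pads for the last dimension first. A sketch of building that argument (helper name is hypothetical):

```python
import math

def same_upper_to_fpad_args(in_shape, kernel_shape, strides, dilations=None):
    """Build the `pad` argument for torch.nn.functional.pad that reproduces
    ONNX SAME_UPPER over the given spatial shape.

    F.pad takes pads for the *last* spatial dimension first, flattened as
    (last_begin, last_end, second_last_begin, second_last_end, ...).
    """
    dilations = dilations or [1] * len(kernel_shape)
    flat = []
    # Iterate axes in reverse to match F.pad's last-dim-first convention.
    for i, k, s, d in reversed(list(zip(in_shape, kernel_shape, strides, dilations))):
        effective_kernel = d * (k - 1) + 1
        out_size = math.ceil(i / s)
        total = max((out_size - 1) * s + effective_kernel - i, 0)
        flat += [total // 2, total - total // 2]  # SAME_UPPER: extra pad at the end
    return flat
```

For a 5x5 input with a 2x2 kernel and stride 2 on both axes, this yields [0, 1, 0, 1], i.e. one extra row and column of zeros at the bottom and right.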