SurenderHarsha opened this issue 1 year ago
Hey @SurenderHarsha
Could you provide an ONNX model to reproduce your issue?
Maybe you could try the MONAI repo UNET.
Hi, I cannot provide the UNET, but I can tell you that each convolutional layer has an auto_pad attribute set to SAME_UPPER at the end of the node. The network was a TensorFlow 1 network that I converted to ONNX using the tf2onnx package, and now I am looking to convert it to PyTorch while making sure the outputs of the two networks do not differ. The network I use is a custom production model. Another point, in case it helps: our original TensorFlow 1.13 model has 'same' padding on its convolutional layers.
I do see in the padding.py code that auto_pad with SAME_UPPER raises a NotImplementedError.
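In case it helps others hitting this, one way to double-check how the exporter encoded the padding is to inspect the auto_pad attribute of each Conv node with the onnx package. A minimal sketch (the path model.onnx is a placeholder for your own export):

```python
import onnx

model = onnx.load('model.onnx')  # placeholder path for your exported model

for node in model.graph.node:
    if node.op_type == 'Conv':
        # auto_pad is a string attribute; exporters that write explicit
        # `pads` instead leave it at the default NOTSET.
        auto_pad = next(
            (attr.s.decode() for attr in node.attribute if attr.name == 'auto_pad'),
            'NOTSET',
        )
        print(node.name or node.output[0], auto_pad)
```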
Hey @SurenderHarsha,
The reason for this problem is that SAME_UPPER is not implemented.
I need to take the time to reproduce this. (Maybe
It seems like you have stopped mid-sentence.
We made our own custom implementation for auto_pad SAME_UPPER, but it is not generalizable, so I am closing this issue for now.
@SurenderHarsha could you please share the code? I'll try to reimplement it for my model.
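(The original poster's code was never shared here, but for reference, one way to emulate SAME_UPPER that also works for strided convolutions is to compute the asymmetric padding yourself and apply it with F.pad before a convolution that uses padding=0. A rough sketch, assuming 2D convolutions with dilation 1:)

```python
import torch
import torch.nn.functional as F

def same_upper_pad(x: torch.Tensor, kernel_size, stride) -> torch.Tensor:
    """Pad an NCHW tensor the way ONNX auto_pad=SAME_UPPER does.

    When the total padding is odd, the extra pixel goes to the bottom/right,
    which matches TensorFlow's 'same' padding.
    """
    _, _, in_h, in_w = x.shape
    pads = []  # F.pad order for 4D input: (left, right, top, bottom)
    for size, k, s in ((in_w, kernel_size[1], stride[1]),
                       (in_h, kernel_size[0], stride[0])):
        out_size = (size + s - 1) // s                 # ceil(size / stride)
        total = max((out_size - 1) * s + k - size, 0)  # total padding needed
        begin = total // 2                             # smaller half first => SAME_UPPER
        pads.extend([begin, total - begin])
    return F.pad(x, pads)

# Usage: pad explicitly, then convolve with padding=0.
conv = torch.nn.Conv2d(3, 8, kernel_size=3, stride=2, padding=0)
y = conv(same_upper_pad(torch.randn(1, 3, 33, 33), (3, 3), (2, 2)))
print(y.shape)  # torch.Size([1, 8, 17, 17]), i.e. ceil(33 / 2)
```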
Ran into the same problem. Since I assume SAME_UPPER pads the input the same way as PyTorch's 'same' padding, I could fix the issue by changing the following code snippet of the utils.padding.onnx_auto_pad_to_torch_padding() function from:
```python
if auto_pad == 'VALID':
    return 0, None
if auto_pad in ('SAME_UPPER', 'SAME_LOWER'):
    raise NotImplementedError(f'"{auto_pad}" auto_pad is not implemented')
```
to:
```python
if auto_pad == 'VALID':
    return 0, None
if auto_pad == 'SAME_UPPER':
    return 'same', None
if auto_pad == 'SAME_LOWER':
    raise NotImplementedError(f'"{auto_pad}" auto_pad is not implemented')
```
I tested the converted model and it runs perfectly fine.
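One caveat worth noting: PyTorch only accepts padding='same' for stride-1 convolutions, so strided models would still need explicit (possibly asymmetric) padding as sketched above. Either way, it is a good idea to compare the converted model against ONNX Runtime numerically. A rough sketch, assuming a single-input/single-output model at the placeholder path model.onnx and onnx2torch's convert entry point:

```python
import numpy as np
import onnxruntime as ort
import torch
from onnx2torch import convert

onnx_path = 'model.onnx'                                 # placeholder path
x = np.random.randn(1, 3, 256, 256).astype(np.float32)  # adjust to your input shape

# Reference output from ONNX Runtime.
session = ort.InferenceSession(onnx_path, providers=['CPUExecutionProvider'])
input_name = session.get_inputs()[0].name
onnx_out = session.run(None, {input_name: x})[0]

# Output from the converted PyTorch model.
torch_model = convert(onnx_path).eval()
with torch.no_grad():
    torch_out = torch_model(torch.from_numpy(x)).numpy()

print('max abs diff:', np.abs(onnx_out - torch_out).max())
```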
Hi, I have a custom UNET in ONNX, and I see that when I try to convert the model to PyTorch, the method mentioned above is not implemented. Would it be difficult to implement, or is there an alternative solution?