Closed: Mypathissional closed this issue 3 years ago
tf2onnx is designed to convert models for inference only, not training, so it performs some conversions that break training behavior (we also remove dropout layers). But it should be possible to modify the code to do what you want. Try disabling the back_to_back optimizer by removing this line: https://github.com/onnx/tensorflow-onnx/blob/186b9540d705188de34faffd119aa6a4f5b150c0/tf2onnx/optimizer/__init__.py#L36
Hi,
I am trying to convert EfficientNet-V2 from TensorFlow to ONNX and then to another framework. I have noticed that most of the batch normalization layers are fused into the convolutional layers as a result of constant folding. I am wondering whether it is possible to export the batch norm layers as well, since I want to have a trainable model in the other framework.
Any help would be appreciated.
Best regards,
Maria