Open talcs opened 4 years ago
Hi,
Update:

When I changed the x = x.mean([2,3]) Pytorch operation in the network's forward() method into x = self.apool(x) (where self.apool is set in the __init__ method to nn.AdaptiveAvgPool2d((1,1))), then:

- When I converted the model into ONNX, the operation was represented as GlobalAveragePool rather than ReduceMean.
- onnx2keras converted it to GlobalAveragePooling2D, and I could export the model as a TF frozen graph!

The HDF5 Keras model visualized in Netron after importing it from ONNX (with change_ordering=True):
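For reference, a minimal, self-contained sketch of the change described above (the backbone, shapes, and file names are illustrative, not the actual model):

import onnx
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)  # stand-in for the real conv backbone
        self.apool = nn.AdaptiveAvgPool2d((1, 1))               # replaces x.mean([2, 3])
        self.fc = nn.Linear(8, 10)

    def forward(self, x):
        x = self.conv(x)
        x = self.apool(x)                    # exported to ONNX as GlobalAveragePool, not ReduceMean
        return self.fc(torch.flatten(x, 1))

torch.onnx.export(Net(), torch.randn(1, 3, 32, 32), "model.onnx")
print([n.op_type for n in onnx.load("model.onnx").graph.node])  # 'GlobalAveragePool' should appear here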
@talcs could you close this? Seems it was not an issue.
I think I found a solution:
GlobalAveragePooling has a keepdims option, so you can avoid using Lambda layers. Here is the edited convert_global_avg_pool function:
import logging
import keras
from .utils import ensure_tf_type  # same helpers the other onnx2keras converter modules use

def convert_global_avg_pool(node, params, layers, lambda_func, node_name, keras_name):
    """
    Convert GlobalAvgPool layer
    :param node: current operation node
    :param params: operation attributes
    :param layers: available keras layers
    :param lambda_func: function for keras Lambda layer
    :param node_name: internal converter name
    :param keras_name: resulting layer name
    :return: None
    """
    logger = logging.getLogger('onnx2keras.global_avg_pool')
    input_0 = ensure_tf_type(layers[node.input[0]], layers[list(layers)[0]], name="%s_const" % keras_name)
    # keepdims=True keeps the pooled output as Nx C x 1 x 1, so no Lambda layer is needed
    global_pool = keras.layers.GlobalAveragePooling2D(data_format='channels_first', name=keras_name, keepdims=True)
    layers[node_name] = global_pool(input_0)
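Note that the keepdims argument of GlobalAveragePooling2D only exists in fairly recent Keras/TensorFlow releases (around TF 2.6 and later, as far as I know), so the edit above assumes a new enough Keras.

A rough sketch of how the patched converter would be exercised end to end (the input name and file paths below are assumptions):

import onnx
from onnx2keras import onnx_to_keras

onnx_model = onnx.load('model.onnx')
# 'input' must match the actual input name in the ONNX graph
k_model = onnx_to_keras(onnx_model, ['input'], change_ordering=True)
k_model.save('model.h5')  # HDF5 model that can be inspected in Netron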
Hi,
Thank you for this very friendly and beautiful tool.
I was trying to convert a Pytorch model that runs global-average-pooling after the last conv layer and before the first fc layer (in Pytorch's forward method, it is implemented by running the line x = x.mean([2,3])). Note that this is the same as done in Pytorch's official MobileNet-V2 implementation.

The problem is that after converting the Pytorch model into ONNX, onnx2keras converts the average operation into a Lambda layer. The problem with the Lambda layer is that I cannot export it as a Tensorflow frozen graph.
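To make the setup concrete, a minimal sketch of the kind of forward pass described above (layer sizes are illustrative):

import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)  # stand-in for the real conv backbone
        self.fc = nn.Linear(8, 10)

    def forward(self, x):
        x = self.conv(x)
        x = x.mean([2, 3])   # global average pooling; exported to ONNX as ReduceMean
        return self.fc(x)

# The exported graph contains a ReduceMean node, which onnx2keras turns into a Lambda layer.
torch.onnx.export(Net(), torch.randn(1, 3, 32, 32), "model.onnx")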
The ONNX input file visualized in Netron - the operation is ReduceMean:
The Keras HDF5 model after importing it from ONNX using onnx2keras, visualized in Netron - the operation is Lambda with something that looks like base64 encoded content:
Do you think exporting to a TF frozen graph would work if I replaced the x = x.mean([2,3]) command by a Pytorch AdaptiveAvgPool2d layer with a 1x1 output size? Thanks!