gmalivenko / onnx2keras

Convert ONNX model graph to Keras model format.
MIT License

Replace all lambda functions with custom keras layers #4

Open · SleepProgger opened this issue 5 years ago

SleepProgger commented 5 years ago

Using lambdas can lead to the model not being usable on other Python versions/systems when saving and loading the result as an .h5 file (see: https://github.com/keras-team/keras/issues/9595).

I suggest replacing all current Lambda layers with Keras custom layers. This would mean the project would either have to generate a Python file with the custom layer implementations used by the converted model, or ship a single file containing all custom layers (I'd prefer the former). Since custom layers can be registered with Keras, it would also make sense to ship a function that registers all used custom layers, instead of requiring the user to map every custom layer at keras.models.load_model time.

Otherwise users would have to ship the ONNX file and re-run the conversion for every use, which isn't really a nice way to go about it IMO.
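
To make the idea concrete, here is a minimal sketch of what such a generated custom layer and registration helper might look like; the `Clip` layer and the `register_custom_layers` name are illustrative, not part of the project:

    import tensorflow as tf
    from tensorflow import keras

    class Clip(keras.layers.Layer):
        """Custom layer replacing a tf.clip_by_value Lambda."""
        def __init__(self, vmin=0.0, vmax=6.0, **kwargs):
            super().__init__(**kwargs)
            self.vmin = vmin
            self.vmax = vmax

        def call(self, inputs):
            return tf.clip_by_value(inputs, self.vmin, self.vmax)

        def get_config(self):
            # Serializing the constructor arguments is what makes the
            # layer reloadable from an .h5 file without pickled Python code.
            config = super().get_config()
            config.update({"vmin": self.vmin, "vmax": self.vmax})
            return config

    def register_custom_layers():
        # One call shipped alongside the model, so users don't have to pass
        # custom_objects to keras.models.load_model by hand.
        keras.utils.get_custom_objects().update({"Clip": Clip})

With something like this, loading reduces to calling `register_custom_layers()` once before `keras.models.load_model(...)`.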

ibadr commented 3 years ago

Revisiting this, since one of the main use cases of onnx2keras for our group is the pipeline PyTorch/CoreML -> ONNX -> Keras -> TensorFlow.js. However, the latter cannot run Python Lambda functions. What would you recommend in order to avoid Lambda functions altogether?

pedrofrodenas commented 8 months ago

In my case I had a problem with the Clip layer. I realized that clipping with a lower bound of 0 is equivalent to applying a ReLU activation with max_value set to the upper bound, so in operation_layers.py at line 37 I replaced the Lambda with the following:

    # Clip with a lower bound of 0 maps directly onto ReLU with max_value,
    # so no Lambda layer is needed:
    relu_layer = keras.layers.ReLU(max_value=ensure_numpy_type(layers[node.input[2]]).astype(int))

    # Original Lambda-based implementation, kept for reference:
    # def target_layer(x, vmin=0, vmax=ensure_numpy_type(layers[node.input[2]]).astype(int)):
    #     import tensorflow as tf
    #     return tf.clip_by_value(x, vmin, vmax)
    # layer = keras.layers.Lambda(target_layer, name=keras_name)
    # lambda_func[keras_name] = target_layer

    layers[node_name] = relu_layer(input_0)

That avoids using a Lambda layer.
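
For anyone who wants to verify the equivalence, here is a quick sanity check (not part of the patch above; the input values and the bound of 6.0 are arbitrary):

    import numpy as np
    import tensorflow as tf
    from tensorflow import keras

    x = tf.constant([-3.0, -1.0, 0.0, 2.5, 6.0, 7.5, 9.0])

    # ReLU with max_value clamps to [0, max_value], exactly like
    # tf.clip_by_value with a lower bound of 0.
    relu_out = keras.layers.ReLU(max_value=6.0)(x)
    clip_out = tf.clip_by_value(x, 0.0, 6.0)

    print(np.allclose(relu_out.numpy(), clip_out.numpy()))  # True

Note this shortcut only holds when the ONNX Clip node's lower bound is 0; a nonzero lower bound still needs a different replacement, such as a custom layer like the sketch earlier in this thread.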