vidigreen opened this issue 4 years ago
Is it a tf.keras model? Can you share a model (or script) which contains a Conv1D layer with causal padding? Just provide a very simple one so we can do a unit test. Thanks.
It is a tf.keras model. The following is a simple example of the model.
from tensorflow import keras

def build_model(self, input_shape):
    # two dilated causal Conv1D blocks followed by a softmax classifier
    filters = 32
    stride = 1
    kernel_size = 3
    dilation_size = 1
    dropout = 0.2

    input_layer = keras.layers.Input(shape=(input_shape[0], input_shape[1]))
    conv1 = keras.layers.Conv1D(
        filters=filters,
        kernel_size=kernel_size,
        strides=stride,
        padding='causal',
        dilation_rate=dilation_size,
        activation='relu',
        kernel_initializer=keras.initializers.RandomNormal(0, 0.01)
    )(input_layer)
    dropout1 = keras.layers.Dropout(dropout)(conv1)
    conv2 = keras.layers.Conv1D(
        filters=filters,
        kernel_size=kernel_size,
        strides=stride,
        padding='causal',
        dilation_rate=dilation_size,
        activation='relu',
        kernel_initializer=keras.initializers.RandomNormal(0, 0.01)
    )(dropout1)
    dropout2 = keras.layers.Dropout(dropout)(conv2)
    output_layer = keras.layers.Activation('relu')(dropout2)
    output_layer = keras.layers.Flatten()(output_layer)
    output_layer = keras.layers.Dense(3, activation='softmax')(output_layer)
    model = keras.models.Model(inputs=input_layer, outputs=output_layer)
    return model
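For reference, a minimal conversion sketch is below; it assumes keras2onnx (the keras-onnx package), an input shape of (12, 11), and placeholder file names, so adjust to your setup.

# Minimal conversion sketch (assumptions: keras2onnx from keras-onnx,
# input shape (12, 11), placeholder file name 'model.onnx').
import numpy as np
import keras2onnx
import onnxruntime as rt

model = build_model(None, (12, 11))  # `self` is unused above, so None is fine
onnx_model = keras2onnx.convert_keras(model, model.name)
keras2onnx.save_model(onnx_model, 'model.onnx')

# Quick sanity check on random data.
x = np.random.rand(1, 12, 11).astype(np.float32)
sess = rt.InferenceSession('model.onnx')
onnx_out = sess.run(None, {sess.get_inputs()[0].name: x})[0]
keras_out = model.predict(x)
print('max abs diff:', np.abs(onnx_out - keras_out).max())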
Thanks!
Thanks. The model is converted with is_tf_keras=True; I had no problem converting the model. But the issue is that the inference results are quite different between the Keras model and the ONNX model.
Did you see my reply above? My unit test shows the inference results of the given model are the same. It is strange that you say "quite different"; are you talking about the model you pasted above? For that model, the unit test shows identical inference results.
I checked your test, thanks. I trained the model I pasted and tested it with both the Keras model and the ONNX model, but got different results. Sorry if my comment caused confusion. I'm attaching the Keras model model_keras.zip and the corresponding ONNX model model_onnx.zip. The test dataset is X.zip and y.zip. After unzipping the models and data, you can use the following code to check the inference difference.
import numpy as np
import onnxruntime as rt
from tensorflow import keras
from sklearn.metrics import accuracy_score

X_test = np.load('X.npy')
y_test = np.load('y.npy')
y_true = np.array([np.where(r == 1.0)[0][0] for r in y_test])

# Keras inference
model = keras.models.load_model('model.hdf5')
y_pred_keras_prob = model.predict(X_test)
y_pred_keras = np.argmax(y_pred_keras_prob, axis=1)
test_accuracy = accuracy_score(y_true, y_pred_keras)
print(f"Keras Model Classification Accuracy is {test_accuracy}")

# ONNX inference, one sample at a time
output_onnx_model = 'model.onnx'
model = rt.InferenceSession(output_onnx_model)
input_name = model.get_inputs()[0].name
y_pred_onnx_prob = []
for index in range(X_test.shape[0]):
    output = model.run(None, {input_name: X_test[index][np.newaxis, :]})
    y_pred_onnx_prob.append(list(output[0][0]))
y_pred_onnx_prob = np.array(y_pred_onnx_prob)
y_pred_onnx = np.argmax(y_pred_onnx_prob, axis=1)
test_accuracy = accuracy_score(y_true, y_pred_onnx)
print(f"ONNX Classification Accuracy is {test_accuracy}")

# show samples where the two models disagree
for i in range(X_test.shape[0]):
    if not y_pred_keras[i] == y_pred_onnx[i]:
        print(f"{i}: different predictions are {y_pred_keras_prob[i]}:{y_pred_onnx_prob[i]}")
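Accuracy alone can mask or exaggerate the discrepancy, so a direct element-wise comparison of the raw probabilities may be more diagnostic. A minimal sketch, assuming the same file names as above and per-sample batching in case the exported model has a fixed batch dimension (the tolerances are arbitrary):

# Hedged sketch: compare raw probabilities element-wise.
import numpy as np
import onnxruntime as rt
from tensorflow import keras

X_test = np.load('X.npy').astype(np.float32)

keras_model = keras.models.load_model('model.hdf5')
keras_prob = keras_model.predict(X_test)

sess = rt.InferenceSession('model.onnx')
input_name = sess.get_inputs()[0].name
# Feed one sample at a time in case the ONNX model expects batch size 1.
onnx_prob = np.concatenate(
    [sess.run(None, {input_name: X_test[i:i + 1]})[0] for i in range(X_test.shape[0])],
    axis=0,
)

abs_diff = np.abs(keras_prob - onnx_prob)
print('max abs diff:', abs_diff.max())
print('mean abs diff:', abs_diff.mean())
np.testing.assert_allclose(keras_prob, onnx_prob, rtol=1e-3, atol=1e-5)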
Thanks!
It's a similar issue to this one, https://github.com/onnx/keras-onnx/issues/89
Tried keras.models.load_model, but the attached keras model fails to load with the following error:
args = (<keras.engine.input_layer.InputLayer object at 0x0000025139A44BA8>,)
kwargs = {'batch_input_shape': [None, 12, 11], 'dtype': 'float32', 'name': 'input_4', 'ragged': False, ...}
object_name = 'InputLayer', converted = []
old_name = 'input_dtype', new_name = 'dtype'

E   TypeError: __init__() got an unexpected keyword argument 'ragged'
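That 'ragged' keyword is written by tf.keras (TF 2.x) when saving, and standalone Keras' InputLayer does not accept it, which would explain the failure. A minimal sketch, assuming the attached file unzips to model.hdf5, is to load it with the matching tf.keras API instead:

# Load with tf.keras so the saved InputLayer config (including 'ragged') is understood.
# File name model.hdf5 is an assumption based on the script above.
from tensorflow import keras

model = keras.models.load_model('model.hdf5')
model.summary()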
I have successfully converted my Keras model, which uses Conv1D layers with "causal" padding, to an ONNX model, but the inference results from the original Keras model and the ONNX model are quite different. While debugging the Keras model I found that if everything else is kept the same and only the "causal" padding is changed to "same" padding, the issue goes away. Is there any trick to using keras-onnx to convert Conv1D layers with "causal" padding?
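In case it helps narrow this down, here is a hedged equivalence check (not the converter's code; the (12, 11) input shape is a placeholder): Conv1D with padding='causal' should match a manual left pad of (kernel_size - 1) * dilation_rate zeros on the time axis followed by a 'valid' convolution with the same weights, which is the asymmetric padding the exported ONNX graph needs to reproduce.

# Hedged check in Keras itself: causal conv vs. explicit left pad + valid conv.
import numpy as np
from tensorflow import keras

kernel_size, dilation = 3, 1
pad = (kernel_size - 1) * dilation

inp1 = keras.layers.Input(shape=(12, 11))
causal = keras.layers.Conv1D(32, kernel_size, padding='causal',
                             dilation_rate=dilation)(inp1)
causal_model = keras.models.Model(inp1, causal)

inp2 = keras.layers.Input(shape=(12, 11))
padded = keras.layers.ZeroPadding1D(padding=(pad, 0))(inp2)
valid = keras.layers.Conv1D(32, kernel_size, padding='valid',
                            dilation_rate=dilation)(padded)
valid_model = keras.models.Model(inp2, valid)
valid_model.layers[-1].set_weights(causal_model.layers[-1].get_weights())

x = np.random.rand(2, 12, 11).astype(np.float32)
print(np.abs(causal_model.predict(x) - valid_model.predict(x)).max())  # expect ~0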
I'm using keras from tf2.0. I have installed keras-onnx from source using:

pip install git+https://github.com/microsoft/onnxconverter-common
pip install git+https://github.com/onnx/keras-onnx
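If it is useful for reproducing, a quick environment dump (assuming the usual __version__ attributes are present in these packages) would be:

# Print package versions to include in the report; __version__ attributes assumed present.
import tensorflow as tf
import keras2onnx
import onnxruntime

print('tensorflow:', tf.__version__)
print('keras2onnx:', keras2onnx.__version__)
print('onnxruntime:', onnxruntime.__version__)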