Open jikechao opened 1 year ago
```python
import tvm
import tvm.relay as relay
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers, models

input_shape = (1, 2)
input_data = np.random.randint(10, size=input_shape)

x = layers.Input(shape=input_shape[1:], dtype='int32')
layer = keras.layers.Embedding(10, 4)
layer.set_weights(layer.get_weights())
y = layer(x)
model = models.Model(x, y)
model.summary()
res_keras = model(input_data)

shape_dict = {'input_1': input_shape}
mod, params = relay.frontend.from_keras(model, shape_dict)
print(mod)
with tvm.transform.PassContext(opt_level=3):
    model = relay.build_module.create_executor(
        "graph", mod, tvm.cpu(0), 'llvm', params
    ).evaluate()

test_x_tvm = input_data
res_tvm = model(tvm.nd.array(test_x_tvm.astype('int32'))).numpy()

np.testing.assert_allclose(res_keras, res_tvm, atol=1e-3, rtol=1e-3)
```
This bug is triggered when the input dtype is 'int32'. If we use dtype='float32' instead, TVM and Keras produce the same inference results.
For the layer Embedding, TVM and Keras have different inference results based on the same input data.
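For reference, the expected semantics of an Embedding layer are a plain table lookup, so both backends should return identical rows for the same integer indices. A minimal NumPy sketch of that expectation (hypothetical weights and indices, not the ones from the reproduction script):

```python
import numpy as np

# An Embedding layer maps index i to row i of its weight matrix.
rng = np.random.default_rng(0)
weights = rng.standard_normal((10, 4)).astype('float32')  # (vocab=10, dim=4)

input_data = np.array([[3, 7]], dtype='int32')  # shape (1, 2), like the repro
embedded = weights[input_data]                  # shape (1, 2, 4)

# Keras and the TVM-compiled graph should both return exactly these rows
# for the same int32 indices; any divergence is the bug reported here.
print(embedded.shape)
```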
Expected behavior
The inference result for TVM and Keras should be consistent.
Actual behavior
TVM's inference result for the Embedding layer differs from Keras's output on the same int32 input data.
Environment
Any environment details, such as: Operating System, TVM version, etc
Steps to reproduce
Run the script at the top of this issue; the final np.testing.assert_allclose check fails.
Triage
cc @shingjan