lovodkin93 opened 3 years ago
Hi @lovodkin93 , can you share a colab so we can reproduce easily?
Also please let us know which TF and TF-MOT versions you're using.
This is a known issue when you use tf.nn.relu instead of tf.keras.layers.ReLU: it gets converted to a TFLambdaOp, which has trouble with the current QAT API.
Would you please use tf.keras.layers.ReLU if it's okay?
So I tried to replace every x = Activation(activations.relu)(x) line with x = Activation(ReLU)(x), and now I get the following error (which is the reason I used x = Activation(activations.relu)(x) in the first place):
tensorflow.python.framework.errors_impl.OperatorNotAllowedInGraphError: using a `tf.Tensor` as a Python `bool` is not allowed in Graph execution. Use Eager execution or decorate this function with @tf.function.
Do you happen to know why this might occur? Thanks!
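For what it's worth, Activation expects an activation function (or its string name), not a layer class, so Activation(ReLU) passes the class itself into the graph, which may be what triggers the error above. A rough sketch of using the ReLU layer directly instead, assuming a functional-style model (layer sizes here are hypothetical):

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(8,))
x = tf.keras.layers.Dense(16)(inputs)
# Use the ReLU layer directly rather than Activation(activations.relu)
# or Activation(ReLU) -- Activation wants a function/string, not a class.
x = tf.keras.layers.ReLU()(x)
outputs = tf.keras.layers.Dense(1)(x)
model = tf.keras.Model(inputs, outputs)
```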
Hi @Xhark ,
Is it possible that this same error might occur with tensorflow.keras.activations.sigmoid? If so, what replacement should I use, since there is no tf.keras.layers.sigmoid or similar? Perhaps a tf.keras.layers.Lambda?
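One option worth trying (not confirmed by the maintainers in this thread): the string form of Activation is itself a proper Keras layer, so no Lambda wrapping is involved. A minimal sketch:

```python
import tensorflow as tf

# There is no tf.keras.layers.Sigmoid, but Activation('sigmoid') is a
# regular Keras layer, unlike a raw call to activations.sigmoid.
x = tf.keras.Input(shape=(4,))
y = tf.keras.layers.Activation('sigmoid')(x)
model = tf.keras.Model(x, y)
```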
I had a similar issue with tf.split. I could work around the error by wrapping tf.split in a Keras layer:
```python
import tensorflow as tf
from tensorflow import keras


@keras.saving.register_keras_serializable(package="MyLayers", name="SplitLayer")
class SplitLayer(keras.layers.Layer):
    def __init__(self, num_or_size_splits, axis, **kwargs):
        super(SplitLayer, self).__init__(**kwargs)
        self.num_or_size_splits = num_or_size_splits
        self.axis = axis

    def call(self, inputs):
        # Delegate to tf.split inside a proper Keras layer so the op is
        # not wrapped in a Lambda/TFOpLambda layer.
        return tf.split(inputs, self.num_or_size_splits, axis=self.axis)

    def get_config(self):
        # Include the constructor arguments so the layer can be
        # serialized and reloaded.
        config = super(SplitLayer, self).get_config()
        config.update({
            'num_or_size_splits': self.num_or_size_splits,
            'axis': self.axis,
        })
        return config
```
tf version: 2.15.1, keras version: 2.15.0, tfmot version: 0.7.5
Hello, I am trying to perform QAT on a ResNet50 network with BN layers, and I keep getting the following error:
I tried to isolate each of the BN layers, and it appears they all cause the same error.
Here is the code I am trying to run:
What am I missing?