james77777778 opened this issue 1 year ago
**Describe the bug**
ReLU6 cannot be fused into Conv2D (Conv2D+BN+ReLU6) after QAT when using `keras.layers.Activation('relu6')`.
The following work fine:
- `keras.layers.ReLU(6)`
- `keras.layers.Activation('relu')`
- `keras.layers.ReLU()`
So the workaround is to stick to `keras.layers.ReLU(6)`.
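A minimal sketch of the workaround (the layer sizes and input shape here are illustrative, not taken from the issue's Colab): build the Conv2D+BN+ReLU6 block with `keras.layers.ReLU(6)` instead of `keras.layers.Activation('relu6')`, so that QAT can fuse it.

```python
import tensorflow as tf


def make_model():
    # Conv2D + BatchNorm + ReLU6 block. Using ReLU(6) here is the
    # workaround: Activation('relu6') is not recognized for fusion.
    return tf.keras.Sequential([
        tf.keras.Input(shape=(32, 32, 3)),
        tf.keras.layers.Conv2D(8, 3, padding="same", use_bias=False),
        tf.keras.layers.BatchNormalization(),
        tf.keras.layers.ReLU(6),  # instead of Activation('relu6')
    ])


model = make_model()
# QAT would then be applied with, e.g.,
# tfmot.quantization.keras.quantize_model(model), after which the
# Conv2D+BN+ReLU6 pattern is expected to fuse.
```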
**System information**
- TensorFlow version (installed from source or binary): 2.9.2 (Colab default)
- TensorFlow Model Optimization version (installed from source or binary): 0.7.3 (pip default)
- Python version: 3.8 (Colab default)
**Describe the expected behavior**
`keras.layers.Activation('relu6')` should be fused into Conv2D, the same as `keras.layers.Activation('relu')`.
**Describe the current behavior**
`keras.layers.Activation('relu6')` fails to be fused into Conv2D.
**Code to reproduce the issue**
Colab: https://colab.research.google.com/drive/1tuGvsuBsUiWUdks_i9glXgdUoUcFSXqI
**Screenshots**
![image](https://user-images.githubusercontent.com/20734616/205236155-7e5e303d-cb6d-4db1-9911-6c7b3b6f1aee.png)
`keras.layers.Activation('relu6')` vs. `keras.layers.ReLU(6)`
**Additional context**
It is convenient for users to configure a model's activation with `keras.layers.Activation(...)`. It might take some time to notice this unexpected behavior when using `keras.layers.Activation('relu6')`.

Thanks!
Thanks for reporting the bug. @Xhark Could you take a look?