tensorflow / model-optimization

A toolkit to optimize ML models for deployment for Keras and TensorFlow, including quantization and pruning.
https://www.tensorflow.org/model_optimization
Apache License 2.0

Fail to fuse ReLU6 after QAT by using keras.layers.Activation('relu6') #1030

Open james77777778 opened 1 year ago

james77777778 commented 1 year ago

Describe the bug ReLU6 cannot be fused into Conv2D (Conv2D+BN+ReLU6) after QAT when using keras.layers.Activation('relu6').

Using keras.layers.ReLU(6) instead works fine, so the workaround is to stick to keras.layers.ReLU(6).

System information

TensorFlow version (installed from source or binary): 2.9.2 (colab default)

TensorFlow Model Optimization version (installed from source or binary): 0.7.3 (pip default)

Python version: 3.8 (colab default)

Describe the expected behavior keras.layers.Activation('relu6') should be fused into Conv2D, just as keras.layers.Activation('relu') is.

Describe the current behavior keras.layers.Activation('relu6') fails to be fused into Conv2D.

Code to reproduce the issue Colab: https://colab.research.google.com/drive/1tuGvsuBsUiWUdks_i9glXgdUoUcFSXqI

Screenshots keras.layers.Activation('relu6') [screenshot]

keras.layers.ReLU(6) [screenshot]

Additional context It is convenient for users to configure a model's activation via keras.layers.Activation(...), and the strange behavior of keras.layers.Activation('relu6') can take some time to detect.

Thanks!

rino20 commented 1 year ago

Thanks for reporting the bug. @Xhark Could you take a look?