Open SundaeOreo opened 1 year ago
@quic-akhobare could you please help answer this question. Thanks
Hi @SundaeOreo - sorry for the late response. Are you still facing this issue? Also, can you give us an example code snippet to reproduce the problem, e.g. using a Keras zoo model?
From our tests, BN fold works as expected. @quic-ernst - adding you to monitor the response.
@SundaeOreo There were some updates to the Keras Batch Norm Fold in AIMET 1.25.0. Could you please try that version to see if you are getting the same error? AIMET 1.25.0 is available here.
aimet version: 1.23
python version: 3.8
tensorflow version: 2.4
I'm glad to see that the AIMET project supports TensorFlow 2.4 Keras, so I attempted a model conversion with Keras and found the following issue:
I checked the source code and found that when the model has freshly initialized weights, AIMET folds the BN layers, but when a pre-trained model is loaded, AIMET does not fold them. May I ask why it is set up this way? With this behavior, the quantization-parameter JSON file output by AIMET QAT for a pre-trained model still contains the BN layers' gamma, beta, and other values, yet even with the overwrite option, the quantize-dlc conversion cannot write the gamma, beta, and equivalent values into the quantized DLC model.
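For context, BN folding rewrites the batch-norm scale and shift into the preceding conv layer's weight and bias, which is why a correctly folded model should not carry gamma/beta in its quantization parameters. Below is a minimal NumPy sketch of that arithmetic (the function name and layouts are illustrative, not AIMET's API; Keras kernel layout `(kh, kw, c_in, c_out)` is assumed):

```python
import numpy as np

def fold_bn_into_conv(w, b, gamma, beta, mean, var, eps=1e-3):
    """Fold BatchNorm params into the preceding conv's weight and bias.

    w: conv kernel of shape (kh, kw, c_in, c_out) (Keras layout)
    b: conv bias of shape (c_out,)
    After folding, conv + BN is equivalent to a single conv, so gamma/beta
    no longer need to appear in the exported quantization parameters.
    """
    scale = gamma / np.sqrt(var + eps)   # per-output-channel BN scale
    w_f = w * scale                      # broadcasts over the c_out axis
    b_f = (b - mean) * scale + beta
    return w_f, b_f

# Tiny numeric check: conv followed by BN equals the folded conv.
rng = np.random.default_rng(0)
c_out = 4
w = rng.standard_normal((1, 1, 3, c_out))
b = rng.standard_normal(c_out)
gamma, beta = rng.standard_normal(c_out), rng.standard_normal(c_out)
mean, var = rng.standard_normal(c_out), rng.random(c_out) + 0.5

x = rng.standard_normal(3)               # one pixel, 3 input channels
conv_out = x @ w[0, 0] + b               # 1x1 conv at a single position
bn_out = gamma * (conv_out - mean) / np.sqrt(var + 1e-3) + beta

w_f, b_f = fold_bn_into_conv(w, b, gamma, beta, mean, var)
folded_out = x @ w_f[0, 0] + b_f
assert np.allclose(bn_out, folded_out)
```

If folding is skipped for a pre-trained model, the BN parameters survive into the exported JSON, which matches the symptom described above.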
Code source: .../anaconda3/envs/aimet/lib/python3.8/site-packages/aimet_tensorflow/keras/cross_layer_equalization.py
```python
@staticmethod
def is_folded_batch_normalization(layer: tf.keras.layers.Layer) -> bool:
    """ Method to check if layer is folded batchnorm or not
```
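The snippet above is truncated, but the idea behind such a check can be sketched: after folding, the BN layer is typically reset to an identity transform (gamma all ones, beta all zeros), so "already folded" can be detected by inspecting those parameters. This is a hedged NumPy illustration of that idea, not AIMET's actual implementation; `looks_folded` and its arguments are hypothetical names:

```python
import numpy as np

def looks_folded(gamma: np.ndarray, beta: np.ndarray) -> bool:
    """Return True if a BN layer's params indicate an identity (folded) BN."""
    return bool(np.all(gamma == 1.0) and np.all(beta == 0.0))

print(looks_folded(np.ones(4), np.zeros(4)))       # identity BN -> True
print(looks_folded(np.full(4, 0.9), np.zeros(4)))  # still scales -> False
```

A check like this would explain the reported behavior: freshly initialized BN layers start near identity and pass the test, while pre-trained BN layers carry non-trivial gamma/beta and are skipped.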