A toolkit to optimize ML models for deployment for Keras and TensorFlow, including quantization and pruning.
Is there a way to do symmetric quantization for activations in post-training quantization #801
Open
Lemiron24 opened 2 years ago
From this document, https://tensorflow.google.cn/lite/performance/quantization_spec, we can see that post-training quantization uses asymmetric quantization for activations and symmetric quantization for weights.
I followed this guide, https://tensorflow.google.cn/lite/performance/post_training_integer_quant, and full int8 quantization of my network works fine; as stated above, the weights are symmetric and the activations are asymmetric.
My question is: I want the activations to be symmetric as well. How can I do that?
Thanks!
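For context, the difference between the two schemes the question refers to can be sketched in plain Python. Symmetric int8 quantization pins the zero point to 0 (what the TFLite spec uses for weights), while asymmetric quantization shifts the zero point so the full int8 range covers [min, max] (what it uses for activations). This is an illustrative sketch of the parameter math only, not TFLite's actual converter or kernel code:

```python
def symmetric_params(values, qmax=127):
    # Symmetric int8: scale from the max magnitude, zero point pinned to 0.
    scale = max(abs(v) for v in values) / qmax
    return scale, 0

def asymmetric_params(values, qmin=-128, qmax=127):
    # Asymmetric int8: scale covers the full [min, max] range;
    # the zero point is the quantized value representing real 0.
    lo, hi = min(values), max(values)
    scale = (hi - lo) / (qmax - qmin)
    zero_point = round(qmin - lo / scale)
    return scale, zero_point

def quantize(values, scale, zero_point, qmin=-128, qmax=127):
    # Map real values to clamped int8 codes.
    return [max(qmin, min(qmax, round(v / scale) + zero_point)) for v in values]

acts = [0.0, 0.5, 1.0, 2.0]              # e.g. ReLU outputs, all non-negative
s_scale, s_zp = symmetric_params(acts)    # s_zp == 0
a_scale, a_zp = asymmetric_params(acts)   # a_zp == -128
```

The sketch also shows why the spec makes this choice: for non-negative activations (e.g. after ReLU), a symmetric scheme with zero point 0 wastes the entire negative half of the int8 range, while the asymmetric zero point of -128 lets all 256 codes cover [0, 2]. Forcing symmetric activations is therefore a precision trade-off, not just a configuration flag.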