megvii-research / FQ-ViT

[IJCAI 2022] FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer
Apache License 2.0

A question about Log2 Quantization when activations or weights equal zero #13

Closed · youdutaidi closed this issue 2 years ago

youdutaidi commented 2 years ago

I appreciate the Log2 Quantization method in FQ-ViT, but I have a question: when activations or weights equal zero, we cannot compute log2(0), and when they are negative, log2 is not defined either. It seems the code does not handle this problem.

lz02k commented 2 years ago

Actually, the Log2 Quantization method is only applied to the Softmax activation, whose values lie in the range (0, 1), so zero and negative inputs do not occur.
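
For illustration, here is a minimal PyTorch sketch of log2 quantization applied to softmax outputs. The function names, the 4-bit default, and the eps guard are assumptions for this example, not the repository's exact code. Because softmax values are strictly in (0, 1), -log2(x) is always non-negative and finite; the eps clamp only protects against numerical underflow to exactly 0.

```python
import torch

def log2_quantize(attn, bits=4, eps=1e-8):
    """Illustrative log2 quantization of softmax attention scores.

    Softmax outputs lie in (0, 1), so -log2(x) is non-negative and
    finite; eps only guards against underflow to exactly zero.
    """
    qmax = 2 ** bits - 1
    # Map x in (0, 1) to a non-negative integer code.
    codes = torch.round(-torch.log2(attn.clamp(min=eps)))
    return codes.clamp(0, qmax)

def log2_dequantize(codes):
    """Recover an approximate attention value as a power of two."""
    return 2.0 ** (-codes)

# Usage: quantize the softmax output of an attention map.
scores = torch.randn(2, 4, 4)           # raw attention logits
attn = torch.softmax(scores, dim=-1)    # values strictly in (0, 1)
codes = log2_quantize(attn, bits=4)
approx = log2_dequantize(codes)
```

Since every softmax output is positive, the only edge case is underflow toward 0, which the clamp maps to the largest code (i.e. the smallest representable power of two) rather than producing an invalid log.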