Open qdlmcfresh opened 8 months ago
https://github.com/google/qkeras/blob/dc0bce96c269750ca21f8fc0c44864011d7cdc43/qkeras/quantizers.py#L1287 states that leaving `alpha` at the default value of `None` should result in `alpha` being set to 1, but that is not the case, which leads to unexpected behavior during quantization: many of the quantized values end up out of range.
Manually setting `alpha` to 1 in the constructor leads to the expected results after quantization.
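A minimal sketch of the comparison that exposes the difference, using `quantized_bits` as an example of a quantizer that takes an `alpha` parameter (the quantizer at the linked line may differ; the bit widths and input values here are arbitrary, chosen only for illustration):

```python
# Compare leaving alpha at its default (None) with setting alpha=1 explicitly.
import numpy as np
import tensorflow as tf
from qkeras import quantized_bits

x = tf.constant(np.linspace(-2.0, 2.0, 9), dtype=tf.float32)

q_default = quantized_bits(bits=4, integer=0)           # alpha left at None
q_alpha_1 = quantized_bits(bits=4, integer=0, alpha=1)  # alpha set explicitly

print("input:     ", x.numpy())
print("alpha=None:", q_default(x).numpy())
print("alpha=1:   ", q_alpha_1(x).numpy())
```

With `alpha=1` the outputs stay within the expected quantization range, while the default shows the out-of-range values described above.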