Mxbonn / INQ-pytorch

A PyTorch implementation of "Incremental Network Quantization: Towards Lossless CNNs with Low-Precision Weights"

The logic is wrong in 'example' code? #8

Closed cool-ic closed 5 years ago

cool-ic commented 5 years ago

Sorry to bother you, but I think the logic of the example is not right.

The example runs quantization every epoch, and every quantization operation advances the iterative_step. So every epoch the iterative_step moves to the next level, which I think is wrong.

It is fine to quantize every epoch (with the same Ts), but doing so should not change the iterative_step. The iterative_step should only advance once all epochs of the current iterative_step are done; then it can move to the next level and a new set of epochs begins.
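To make the intended schedule concrete, here is a minimal sketch of the loop structure described above: quantization advances the iterative_step once per outer step, and the remaining full-precision weights are then retrained for several epochs before the next step. The function and argument names (`run_inq_schedule`, `quantize_step`, `train_one_epoch`, the fractions) are illustrative assumptions, not this repository's API.

```python
# Hypothetical sketch of the corrected INQ schedule. The real repo's
# quantization scheduler and training loop are stand-ins here.

def run_inq_schedule(quantize_step, train_one_epoch,
                     fractions=(0.5, 0.75, 0.875, 1.0),
                     epochs_per_step=3):
    """Advance the iterative_step once, then retrain for a full set of epochs."""
    for fraction in fractions:
        quantize_step(fraction)            # advance iterative_step once per step
        for _ in range(epochs_per_step):   # all epochs run at this same step
            train_one_epoch()

# Toy usage with counters standing in for real quantization/training:
calls = {"quantize": [], "epochs": 0}
run_inq_schedule(lambda f: calls["quantize"].append(f),
                 lambda: calls.__setitem__("epochs", calls["epochs"] + 1))
# quantize_step runs 4 times (once per fraction); training runs 4 * 3 = 12 epochs
```

The buggy version corresponds to calling `quantize_step` inside the epoch loop, which would advance the step every epoch instead of once per set of epochs.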

Mxbonn commented 5 years ago

You're correct! Somehow I messed up the example code. I pushed a fix to the repository. The reported accuracy and pretrained model are still correct, however, as I used a different file to generate them.

Thanks for spotting the mistake!