facebookresearch / kill-the-bits

Code for: "And the bit goes down: Revisiting the quantization of neural networks"

About the results on semi-supervised ResNet-50 #38

una-dinosauria closed this issue 3 years ago

una-dinosauria commented 3 years ago

Hello @pierrestock,

Congratulations on your amazing work! We have been working on vector quantization of neural networks as well, and recently published our findings at https://arxiv.org/abs/2010.15703.

As we mention in our paper, we have trouble reproducing the accuracy of the uncompressed semi-supervised ResNet-50 model reported in your paper:

[Screenshot: table from the paper reporting the semi-supervised ResNet-50 results.]

After downloading the model from https://github.com/facebookresearch/semi-supervised-ImageNet1K-models, we obtain 78.72% top-1 accuracy instead of the reported 79.3%.
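For reference, here is a minimal sketch of the kind of evaluation we run (the `resnet50_ssl` torch.hub entry point, torchvision's standard single-crop ImageNet preprocessing, and the dataset path are assumptions on our side, not taken from your code):

```python
import torch
import torchvision.transforms as T
from torch.utils.data import DataLoader
from torchvision.datasets import ImageNet

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Load the released semi-supervised ResNet-50 (entry point name assumed
# from the repository's hubconf).
model = torch.hub.load(
    'facebookresearch/semi-supervised-ImageNet1K-models', 'resnet50_ssl'
).to(device).eval()

# Standard single-crop ImageNet evaluation: resize to 256, center-crop to 224.
preprocess = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
val_set = ImageNet('/path/to/imagenet', split='val', transform=preprocess)
val_loader = DataLoader(val_set, batch_size=256, num_workers=8)

correct = total = 0
with torch.no_grad():
    for images, targets in val_loader:
        logits = model(images.to(device))
        correct += (logits.argmax(dim=1).cpu() == targets).sum().item()
        total += targets.size(0)

print(f'top-1 accuracy: {100.0 * correct / total:.2f}%')
```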

I know that you are not the author of that paper, but we were wondering if you could please verify whether you actually obtained the reported accuracy with the uncompressed model. If so, could you please share that model with us? If you did not, could you please let us know so we can amend our paper (and you can amend yours)?

Cheers,

pierrestock commented 3 years ago

Hi @una-dinosauria,

Thanks for your interest in our work and congratulations on your paper! I particularly like the idea of grouping vectors together to ease the quantization process. In the same vein, we have another technique that trains the uncompressed network with quantization noise to introduce redundancy, see here. On a side note, we also worked on functional equivalence (rescalings and permutations) here.
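To give a rough flavour of the idea, here is a minimal, generic sketch (plain int8 quantization of a random subset of weights with a straight-through estimator; this is only an illustration of the principle, not our actual implementation):

```python
import torch

def fake_int8(w):
    # Simple symmetric int8 quantization of a weight tensor (illustrative only).
    scale = w.abs().max().clamp(min=1e-12) / 127.0
    return torch.round(w / scale).clamp(-127, 127) * scale

def quant_noise(w, p=0.1):
    # During training, quantize a random fraction p of the weights and keep
    # the rest in full precision; the straight-through estimator below lets
    # gradients flow to the full-precision weights.
    mask = (torch.rand_like(w) < p).float()
    w_noisy = mask * fake_int8(w) + (1 - mask) * w
    return w + (w_noisy - w).detach()
```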

Thanks also for your message. The pretrained network we used to obtain our result is apparently not the same as the one that was open-sourced. Note that, in terms of timeline, our paper was published before the semi-supervised team open-sourced its models, so we had access to an internal model that, for some reason, was never released. I will amend the paper to state that the model was internal and not released, in order to explain the discrepancy. In the future, I will use the published models to favour a fair comparison with the state of the art.

Hope this helps and see you in the quantization world!

Pierre

una-dinosauria commented 3 years ago

Thanks for your quick response, @pierrestock, and for the pointers to your work -- very cool stuff!