Open · WenWeiZhao opened this issue 1 year ago
Hi~ Great work! I'd like to ask: does RepGhost suffer a serious accuracy loss after INT8 quantization? Or how do you handle quantization problems? Thanks~

Nice question! As is well known, a re-parameterized model is less quantization-friendly than a regular one. In our work, we have not yet evaluated the quantization performance of RepGhostNet. It would be interesting to work on this.

On the other hand, we note that:
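The point that re-parameterized models tend to be less quantization-friendly can be illustrated with a small simulation. The sketch below is not RepGhostNet's actual kernels or quantizer — the shapes, the identity tap of 1.0, and the symmetric per-tensor INT8 scheme are all illustrative assumptions. The idea: fusing an identity shortcut into a conv kernel widens its dynamic range, which inflates the per-tensor scale and costs precision on the many small weights.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor INT8 quantization: w ~= scale * q, q in [-127, 127]."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def quant_error(w):
    """Mean absolute error after an INT8 quantize/dequantize round trip."""
    q, scale = quantize_int8(w)
    return np.abs(w - q.astype(np.float32) * scale).mean()

rng = np.random.default_rng(0)

# A 16x16x3x3 conv kernel with small, well-spread weights (plain branch).
conv_w = rng.normal(0.0, 0.05, size=(16, 16, 3, 3)).astype(np.float32)

# Re-parameterization folds the identity shortcut into the kernel by adding
# 1.0 at the center tap of each channel's own filter (illustrative value).
# This single large tap stretches the tensor's dynamic range.
rep_w = conv_w.copy()
for c in range(16):
    rep_w[c, c, 1, 1] += 1.0

print(f"plain kernel mean abs quant error: {quant_error(conv_w):.6f}")
print(f"fused kernel mean abs quant error: {quant_error(rep_w):.6f}")
```

With per-tensor quantization the scale is set by the largest absolute weight, so the fused kernel's error is noticeably higher; per-channel scales or quantization-aware fine-tuning are common ways to recover some of that loss.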