@CVPR2018: Efficient unrolling of iterative matrix square-root normalized ConvNets, implemented in PyTorch (including code for B-CNN, compact bilinear pooling, etc.), for training from scratch & finetuning.
Hi, I am reading your iSQRT paper and I think it is quite interesting. However, I am confused about the usage of SVMs.
You wrote "After finetuning, the outputs of iSQRT-COV layer are ℓ2−normalized before inputted to train k one-vs-all linear SVMs with hyperparameter C = 1" in your paper, but I didn't find this step in your released code.
@CupidJay We use the SVM classifier for fair comparison with existing methods. In fact, it improves performance only very marginally (<=0.1%), so the SVM is not required.
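For anyone who wants to reproduce this step anyway, here is a minimal sketch of what the paper describes: ℓ2-normalize the extracted features, then train k one-vs-all linear SVMs with C = 1. This is not the authors' code; it assumes scikit-learn's `LinearSVC` (whose default multi-class mode is one-vs-rest) and uses random arrays as placeholders for the iSQRT-COV layer outputs.

```python
import numpy as np
from sklearn.svm import LinearSVC

# Placeholder features standing in for iSQRT-COV layer outputs
# (in practice: extract them from the finetuned network).
rng = np.random.default_rng(0)
train_feats = rng.standard_normal((100, 512)).astype(np.float32)
train_labels = rng.integers(0, 5, size=100)  # 5 hypothetical classes

# l2-normalize each feature vector, as described in the paper.
train_feats /= np.linalg.norm(train_feats, axis=1, keepdims=True)

# LinearSVC with C=1 trains one linear SVM per class (one-vs-rest),
# matching the paper's "k one-vs-all linear SVMs with C = 1".
clf = LinearSVC(C=1.0)
clf.fit(train_feats, train_labels)
preds = clf.predict(train_feats)
```

In the paper's setting the features would come from the finetuned network's iSQRT-COV layer rather than a random generator, and the trained SVMs would replace the network's softmax classifier at test time.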