Hi, I see that you apply torch.norm to the output of the neural network. Is the purpose of this operation to prevent inf values in the computation of the contrastive loss?

Hi Menghua,
The normalization is done to compute the multi-resolution similarity of features: it ensures that each layer contributes equally to the concatenated multi-resolution feature when it is used for the SimCLR similarity. The normalize-then-concatenate step is described in Section 4.3 of our paper (https://arxiv.org/pdf/2112.01402.pdf), and Section D of the Appendix reports ablations using features without normalization.
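For illustration, here is a minimal PyTorch sketch of that idea: L2-normalize each layer's features before concatenating, so that every resolution contributes equally to a SimCLR-style cosine similarity. The function name `normalize_and_concat`, the tensor shapes, and the two-layer setup are hypothetical, not the repository's actual API.

```python
import torch
import torch.nn.functional as F

def normalize_and_concat(feats):
    """L2-normalize each layer's features, then concatenate.

    `feats` is a list of per-layer feature tensors of shape (B, D_i).
    Normalizing each layer to unit norm ensures every resolution
    contributes equally to the concatenated multi-resolution feature.
    (Hypothetical helper for illustration only.)
    """
    normed = [F.normalize(f, dim=1) for f in feats]  # each row has unit norm
    return torch.cat(normed, dim=1)

# Toy usage: two layers with different dimensionalities.
f1 = torch.randn(8, 128)   # e.g. coarse-resolution features
f2 = torch.randn(8, 512)   # e.g. fine-resolution features
z = normalize_and_concat([f1, f2])

# SimCLR-style pairwise cosine similarity over the batch;
# re-normalizing the concatenation keeps similarities in [-1, 1].
z = F.normalize(z, dim=1)
sim = z @ z.T
```

Without the per-layer normalization, a layer whose features happen to have a large norm would dominate the concatenated vector and hence the similarity, which is what the ablation in Section D of the Appendix examines.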