Closed ronghuaiyang closed 5 years ago
Yes, they are correct. Make sure you normalize twice: (1) normalize the features before averaging (2) normalize the new weight vector after averaging.
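The two normalization steps above can be sketched in PyTorch (the thread's target framework); this is a minimal illustration, not code from the repo, and the feature tensor here is made up:

```python
import torch
import torch.nn.functional as F

# Hypothetical features for one novel class: (n_samples, embed_dim).
feats = torch.randn(5, 64)

# (1) L2-normalize each feature vector BEFORE averaging.
feats = F.normalize(feats, p=2, dim=1)

# (2) Average, then re-normalize the result: the mean of unit
# vectors is generally not itself a unit vector.
w = F.normalize(feats.mean(dim=0), p=2, dim=0)

print(w.norm().item())  # ~1.0
```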
Thank you very much for the reply!
Hi seasonSH, another question: you said to (2) normalize the new weight vector after averaging. Do I need to normalize the whole weight matrix, or only the weight vectors for the classes in the minibatch before updating them?
Thanks for the excellent work! I am trying to implement it in PyTorch. For the weight imprinting, I set the learning rate of the classifier layer to 0, then use the following steps: (1) compute the features and normalize them to serve as the weight vectors; (2) replace the old weight vectors; (3) compute the loss with AM-softmax; (4) backpropagate the gradients and update all weights except the classifier layer.
Is this correct? I am not familiar with TensorFlow, so I don't understand your code very well. Please give me some advice, thanks!
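The four steps above could look roughly like the following in PyTorch. This is a hedged sketch, not the repo's implementation: the backbone, class/data pairs, and AM-softmax hyperparameters (`s`, `m`) are all hypothetical stand-ins.

```python
import torch
import torch.nn.functional as F

embed_dim, num_classes = 64, 10
embedder = torch.nn.Linear(32, embed_dim)            # stand-in for the real backbone
classifier = torch.nn.Linear(embed_dim, num_classes, bias=False)

# Steps 1-2: imprint. Normalize features, average, re-normalize,
# and write the result into the classifier's weight row for that class.
with torch.no_grad():
    for cls, batch in [(0, torch.randn(5, 32))]:     # hypothetical (label, data) pairs
        feats = F.normalize(embedder(batch), dim=1)
        classifier.weight[cls] = F.normalize(feats.mean(dim=0), dim=0)

# Step 3: an AM-softmax-style loss on cosine similarities
# (additive margin m on the target class, scale s).
def am_softmax_loss(feats, labels, s=30.0, m=0.35):
    cos = F.normalize(feats, dim=1) @ F.normalize(classifier.weight, dim=1).t()
    cos = cos - m * F.one_hot(labels, num_classes)
    return F.cross_entropy(s * cos, labels)

# Step 4: only the backbone's parameters go to the optimizer, which is
# equivalent to a learning rate of 0 for the classifier layer.
optimizer = torch.optim.SGD(embedder.parameters(), lr=0.01)

x, y = torch.randn(4, 32), torch.tensor([0, 1, 2, 3])
loss = am_softmax_loss(embedder(x), y)
loss.backward()
optimizer.step()
```

Note that freezing via the optimizer (or `requires_grad_(False)`) answers the "learning rate 0" part of the question directly; the imprinted rows stay exactly as written.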