YirongMao / softmax_variants

PyTorch code for softmax variants: center loss, cosface loss, large-margin gaussian mixture, COCOLoss, ring loss

How to understand the centers shown in the LMCL_loss? #4

Open taylover-pei opened 5 years ago

taylover-pei commented 5 years ago

You have done great work.

But I have a question about the LMCL_loss. How should I understand the centers shown below?

[screenshot: the `centers` parameter defined in LMCL_loss]

I think they stand for the parameters of the last fully connected layer, but I wonder how to update them. In your code they seem to be fixed values, are they?

Looking forward to your reply.

YirongMao commented 5 years ago

Here, the centers are an nn.Parameter(). They will be updated by SGD or an SGD variant. More specifically, the gradient of the centers is computed via `loss.backward()`, as shown in https://github.com/YirongMao/softmax_variants/blob/master/train_mnist_LMCL.py#L71
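To illustrate the mechanism (this is a minimal sketch, not the repo's exact LMCL code; the class and variable names are made up), any tensor registered as nn.Parameter receives a gradient from loss.backward() and is moved by the optimizer step, just like ordinary layer weights:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyCosineHead(nn.Module):
    """Hypothetical stand-in for the LMCL head: one learnable center per class."""
    def __init__(self, feat_dim=2, num_classes=3):
        super().__init__()
        # centers registered as a learnable parameter, NOT a fixed buffer
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, feats):
        # cosine similarity between normalized features and normalized centers
        f = F.normalize(feats, dim=1)
        c = F.normalize(self.centers, dim=1)
        return f @ c.t()

torch.manual_seed(0)
head = ToyCosineHead()
opt = torch.optim.SGD(head.parameters(), lr=0.1)

feats = torch.randn(8, 2)
labels = torch.randint(0, 3, (8,))

before = head.centers.detach().clone()
loss = F.cross_entropy(head(feats), labels)
opt.zero_grad()
loss.backward()   # fills head.centers.grad
opt.step()        # SGD moves the centers

print(head.centers.grad is not None)           # gradient was computed
print(not torch.equal(before, head.centers))   # centers actually changed
```

Running this prints `True` twice: the centers get a gradient and are updated, which is exactly why they are not fixed values in the training script.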

You can also print net.parameters() to check that the centers are being optimized, as in line https://github.com/YirongMao/softmax_variants/blob/master/train_mnist_LMCL.py#L117