CoinCheung / pytorch-loss

label-smooth, amsoftmax, partial-fc, focal-loss, triplet-loss, lovasz-softmax. Maybe useful

AM-softmax implementation details #5

Closed — sweetTT closed this 4 years ago

sweetTT commented 4 years ago

When using the AM-softmax loss, do we need to add another fully connected layer before it?

Such as: avg_pooling --> fc --> amsoftmax?

CoinCheung commented 4 years ago

That would depend on the dimension of your features. The AM-softmax loss is used to train embedding networks; if your desired embedding dimension differs from the output dimension of the avg pooling, you should add another fc layer to adjust the embedding dimension.
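
To illustrate, here is a minimal sketch of the avg_pooling --> fc --> amsoftmax pipeline from the question. The `AMSoftmaxLoss` below is a generic implementation of the AM-softmax formulation for illustration, not necessarily the module shipped in this repo (whose name and signature may differ), and the dimensions (2048-d pooled features, 512-d embeddings, 10 classes) are arbitrary examples:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AMSoftmaxLoss(nn.Module):
    """Generic AM-softmax: cosine logits with additive margin m and scale s."""
    def __init__(self, emb_dim, n_classes, m=0.35, s=30.0):
        super().__init__()
        self.m, self.s = m, s
        self.weight = nn.Parameter(torch.empty(n_classes, emb_dim))
        nn.init.xavier_normal_(self.weight)

    def forward(self, emb, labels):
        # cosine similarity between L2-normalized embeddings and class weights
        cos = F.linear(F.normalize(emb), F.normalize(self.weight))
        # subtract the margin from the target-class logit only
        margin = torch.zeros_like(cos).scatter_(1, labels.unsqueeze(1), self.m)
        return F.cross_entropy(self.s * (cos - margin), labels)

# avg_pooling --> fc --> amsoftmax; the fc is only needed because the pooled
# dimension (2048) differs from the desired embedding dimension (512).
feat = torch.randn(8, 2048, 7, 7)                   # hypothetical backbone output
pooled = F.adaptive_avg_pool2d(feat, 1).flatten(1)  # (8, 2048)
fc = nn.Linear(2048, 512)                           # adjusts embedding dimension
emb = fc(pooled)                                    # (8, 512)

criterion = AMSoftmaxLoss(emb_dim=512, n_classes=10)
loss = criterion(emb, torch.randint(0, 10, (8,)))
```

If the pooled features already have the embedding dimension you want, the extra fc layer can be dropped and the pooled output fed to the loss directly.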

CoinCheung commented 4 years ago

I am closing this. You can still leave a message if you have more to discuss.