Closed menjarleev closed 4 years ago
Can you provide some pseudo code showing how to use the loss? Further, the regularization loss can be implemented elsewhere. In AbstractLossGenerator, we can focus on only the loss related to the triplets.
Loss and LossGenerator have different implementations for MXNet and PyTorch. #161
For developers who want to extend this functionality further: only LossGenerator is used in general_model.py, inside forward:
loss, log = self.loss_gen.get_total_loss(pos_score, neg_score)
where self.loss_gen is initialized in __init__().
Users who want to train the model with a different loss function just need to set a few flags:
# choose from ['Hinge', 'Logistic', 'Softplus', 'Logsigmoid', 'BCE' ]
--loss_genre
# value for negative label, choose from [0, -1]
--neg_label
# hyper-parameter for hinge loss
--margin or -m
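The flags above could map onto loss callables roughly like this (the dispatch function and the loss implementations are assumptions for illustration; only the genre names come from the flag list):

```python
import math

def hinge(score, label, margin=1.0):
    # Hinge loss with the margin from --margin / -m.
    return max(0.0, margin - label * score)

def logistic(score, label):
    # Logistic loss, stable via log1p.
    return math.log1p(math.exp(-label * score))

# Hypothetical registry keyed by the --loss_genre values.
LOSS_GENRES = {
    'Hinge': hinge,
    'Logistic': logistic,
}

def make_loss(loss_genre, **kwargs):
    if loss_genre not in LOSS_GENRES:
        raise ValueError(f'unknown loss genre: {loss_genre}')
    fn = LOSS_GENRES[loss_genre]
    return lambda score, label: fn(score, label, **kwargs)
```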
I would like to add support for more loss functions, including ranking loss, logistic loss, cross-entropy loss, etc. Since we support both PyTorch and MXNet backends, I will abstract the loss function into an abstract factory for further extension (I can do the PyTorch side).
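One way the abstract-factory idea could look, as a sketch: the triplet logic lives in a shared base class, and each backend (PyTorch, MXNet) subclasses it with its own loss primitives. All class and method names here are assumptions; a plain-float "backend" stands in for a real framework one:

```python
from abc import ABC, abstractmethod
import math

class BaseLossGenerator(ABC):
    """Abstract factory sketch: backends supply the per-score loss,
    while the triplet-level combination stays backend-agnostic."""

    @abstractmethod
    def point_loss(self, score, label):
        """Loss for one score; a PyTorch or MXNet subclass overrides this."""

    def get_total_loss(self, pos_score, neg_score):
        pos_loss = self.point_loss(pos_score, 1.0)
        neg_loss = self.point_loss(neg_score, -1.0)
        loss = (pos_loss + neg_loss) / 2
        return loss, {'pos_loss': pos_loss, 'neg_loss': neg_loss, 'loss': loss}

class PlainLogsigmoidGenerator(BaseLossGenerator):
    """Stand-in backend using plain floats instead of framework tensors."""
    def point_loss(self, score, label):
        return math.log1p(math.exp(-label * score))
```

This keeps regularization out of the generator, matching the suggestion above that AbstractLossGenerator focus only on the triplet loss.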