Is your feature request related to a problem? Please describe.
To train a classification model, we can compute the loss in two ways:
1. calculate log_softmax and pass it to NLLLoss (moreh_nll_loss is available)
2. pass logits to cross entropy loss
Cross entropy will reduce the number of programs we need to execute and improve performance, since it can run as a single fused op (see the sketch below).
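For reference, the two paths are numerically equivalent; here is a quick PyTorch illustration of the API shape being asked for (PyTorch is used only to demonstrate the semantics, not the proposed ttnn signatures):

```python
import torch
import torch.nn.functional as F

logits = torch.randn(8, 10)            # (batch, num_classes)
targets = torch.randint(0, 10, (8,))   # class indices

# Path 1: two ops -- log_softmax followed by NLL loss
loss_two_step = F.nll_loss(F.log_softmax(logits, dim=-1), targets)

# Path 2: one fused op -- cross entropy directly on raw logits
loss_fused = F.cross_entropy(logits, targets)

assert torch.allclose(loss_two_step, loss_fused)
```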
Describe the solution you'd like
Introduce cross_entropy_loss and cross_entropy_loss_backward that take logits and targets (class indices, int32_t or uint32_t).
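A minimal NumPy sketch of the intended semantics (function names are taken from this proposal; the actual kernel signatures and reduction options would differ). Note that the backward reduces to a single softmax plus a subtraction at the target indices, which is why fusing it is cheap:

```python
import numpy as np

def cross_entropy_loss(logits, targets):
    """Mean negative log-likelihood of log_softmax(logits) at the target indices."""
    shifted = logits - logits.max(axis=-1, keepdims=True)  # numerical stability
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    return -log_probs[np.arange(len(targets)), targets].mean()

def cross_entropy_loss_backward(logits, targets):
    """Gradient w.r.t. logits: (softmax(logits) - one_hot(targets)) / batch."""
    shifted = logits - logits.max(axis=-1, keepdims=True)
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=-1, keepdims=True)
    probs[np.arange(len(targets)), targets] -= 1.0
    return probs / len(targets)
```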
Describe alternatives you've considered
- Use of the existing NLLLoss
- Use of cross entropy with one-hot encoding implemented as a composite op (sketched below)
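For comparison, the one-hot composite alternative would look roughly like this (again expressed in PyTorch for illustration); it materializes a one-hot tensor and chains several ops, which is what the fused op would avoid:

```python
import torch
import torch.nn.functional as F

logits = torch.randn(8, 10)
targets = torch.randint(0, 10, (8,))

# Composite alternative: one-hot encode targets, then reduce against log_softmax.
one_hot = F.one_hot(targets, num_classes=logits.shape[-1]).to(logits.dtype)
loss = -(one_hot * F.log_softmax(logits, dim=-1)).sum(dim=-1).mean()
```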