Closed · SWu closed 1 month ago
FWIW, it appears that the current `cross_entropy_with_logits` is actually what PyTorch calls `NLLLoss` (negative log-likelihood loss), so a workaround to actually get cross-entropy-with-logits behavior is `cross_entropy_with_logits(log_softmax(y_pred), y_target)`.
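To illustrate the discrepancy, here is a minimal NumPy sketch (not TVM's actual code; the function names here are mine) of the two computations. The operator as currently implemented reduces `-sum(pred * target) / batch`, which is only a cross-entropy if `pred` already holds log-probabilities; a true logits version must apply `log_softmax` first:

```python
import numpy as np

def log_softmax(x, axis=-1):
    # numerically stable log-softmax
    x = x - x.max(axis=axis, keepdims=True)
    return x - np.log(np.exp(x).sum(axis=axis, keepdims=True))

def nll_style(pred, target):
    # what the operator currently computes: correct only when
    # `pred` already contains log-probabilities (PyTorch's NLLLoss)
    return -np.sum(pred * target) / pred.shape[0]

def true_cross_entropy_with_logits(logits, target):
    # expected semantics: normalize the logits before the reduction
    return nll_style(log_softmax(logits), target)

logits = np.array([[1.0, 2.0, 3.0]])
target = np.array([[0.0, 0.0, 1.0]])  # one-hot

print(nll_style(logits, target))                   # treats raw logits as log-probs: wrong
print(true_cross_entropy_with_logits(logits, target))
```

The workaround above is exactly the second function: pre-apply `log_softmax` so the existing NLL-style reduction yields the intended cross-entropy.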
bump
This is still incorrect in latest mainline: https://github.com/apache/tvm/blob/main/python/tvm/relay/op/nn/_nn.py#L1012
Can we at least delete this operator in the meantime, to avoid confusing people who use it expecting it to be correct?
The implementation of `cross_entropy_with_logits` seems to be incorrect: https://github.com/apache/tvm/blob/main/python/tvm/relay/op/nn/_nn.py#L912

It should be something like:
However, if I naively try to make the change above, I get the following error when trying to compile a model using it: