Open LouiValley opened 1 year ago
I am trying to use `SoftTargetCrossEntropy` as my loss function:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftTargetCrossEntropy(nn.Module):
    def __init__(self):
        super(SoftTargetCrossEntropy, self).__init__()

    def forward(self, x: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        loss = torch.sum(-target * F.log_softmax(x, dim=-1), dim=-1)
        return loss.mean()
```
It takes the label (`target`) and the model's output (`x`). The shape of `x` is `[Batch, Class]`, but my `target` has shape `[Batch]`, so the shapes don't match and the element-wise multiplication fails. How can I deal with this?
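One way to resolve this, assuming `target` holds integer class indices, is to expand it to a `[Batch, Class]` tensor of per-class probabilities before passing it to the loss. A minimal sketch using `F.one_hot` (the variable names here are illustrative):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftTargetCrossEntropy(nn.Module):
    def forward(self, x: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        loss = torch.sum(-target * F.log_softmax(x, dim=-1), dim=-1)
        return loss.mean()

batch, num_classes = 4, 10
logits = torch.randn(batch, num_classes)          # shape [Batch, Class]
hard_labels = torch.randint(0, num_classes, (batch,))  # shape [Batch]

# Expand hard labels to one-hot soft targets: shape [Batch, Class]
soft_targets = F.one_hot(hard_labels, num_classes=num_classes).float()

criterion = SoftTargetCrossEntropy()
loss = criterion(logits, soft_targets)

# With pure one-hot targets this reduces to standard cross-entropy
assert torch.allclose(loss, F.cross_entropy(logits, hard_labels))
```

With plain one-hot targets the result equals ordinary `F.cross_entropy`; the soft variant only pays off when the targets are genuinely soft, e.g. smoothed labels or Mixup/CutMix-blended labels, which already arrive as `[Batch, Class]` distributions.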