LifeBeyondExpectations opened 6 years ago
focal_loss_alt() is a better implementation: it uses a simpler formulation than F.binary_cross_entropy(). Besides, in torch 0.4.1, binary_cross_entropy() does not backpropagate through the weight argument, which is crucial for the focal loss since the weight includes the modulating factor.
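To illustrate the point about the weight argument: a minimal sketch of the standard focal loss (Lin et al., "Focal Loss for Dense Object Detection") that keeps the modulating factor inside the computation graph. This is not the repo's implementation, just an assumed reference formulation; `focal_loss` and its parameters are hypothetical names.

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    # Sketch of the standard focal loss. The modulating factor
    # (1 - p_t) ** gamma stays in the autograd graph, so gradients flow
    # through it -- unlike passing it as the `weight` argument of
    # F.binary_cross_entropy, which treats the weight as a constant.
    p = torch.sigmoid(logits)
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction='none')
    p_t = p * targets + (1 - p) * (1 - targets)        # prob. of the true class
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
    return (alpha_t * (1 - p_t) ** gamma * ce).sum()
```

With gamma=0 this reduces to alpha-weighted binary cross-entropy, which is an easy sanity check for any implementation.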
It seems the focal_loss_alt() function doesn't use the gamma parameter. Is that an oversight, or is it implicitly taken care of and I'm missing something?
I was wondering if there is a reference for this alternate loss?
Why are there two versions of the focal loss in `class FocalLoss(nn.Module)`? URL: https://github.com/kuangliu/pytorch-retinanet/blob/2199fd9711fd787ae409800a499db73e6d466fd7/loss.py