Closed auroua closed 5 years ago
Hi,
Thanks for your interest. |S| and |B|_sum in the denominator are just for normalizing the loss. Basically, it means the loss is averaged over the number of peaks and the number of classes in S. In PyTorch, the loss has arguments to control whether to average or not. Please refer to https://pytorch.org/docs/stable/nn.html#torch.nn.BCEWithLogitsLoss
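To illustrate, here is a minimal sketch (plain Python, with hypothetical logits and targets) of why dividing the summed BCE loss by the number of peaks and classes is just mean reduction, i.e. the same behavior as `reduction='mean'` in `torch.nn.BCEWithLogitsLoss`:

```python
import math

def bce_with_logits(logit, target):
    # Numerically stable binary cross-entropy with logits for one element,
    # matching the formula used by torch.nn.BCEWithLogitsLoss.
    return max(logit, 0.0) - logit * target + math.log(1.0 + math.exp(-abs(logit)))

# Hypothetical per-peak class logits: |S| = 3 peaks, |B| = 2 classes.
logits  = [[2.0, -1.0], [0.5, 0.3], [-2.0, 1.5]]
targets = [[1.0,  0.0], [1.0, 0.0], [ 0.0, 1.0]]

# Sum the loss over every (peak, class) pair.
total = sum(bce_with_logits(l, t)
            for row_l, row_t in zip(logits, targets)
            for l, t in zip(row_l, row_t))

num_peaks, num_classes = len(logits), len(logits[0])

# Dividing by the peak and class counts turns the sum into an average,
# so the denominator only rescales the loss; it does not change which
# logits are pushed up or down by the gradient.
mean_loss = total / (num_peaks * num_classes)
```

The same value drops out of `BCEWithLogitsLoss(reduction='mean')` applied to the full logit tensor, which is why the normalization is handled by the reduction argument rather than by extra code.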
I cannot understand the L_sp+ loss. The loss is computed between the pseudo ground-truth and the generated density map D_c. The BCE loss forces the density map to peak at the pseudo ground-truth peak points. How should I understand the effect of |S| and |B|_sum in the denominator?