jfzhang95 / pytorch-deeplab-xception

DeepLab v3+ model in PyTorch. Support different backbones.
MIT License

Batch average over Loss #55

Open AlanStark opened 5 years ago

AlanStark commented 5 years ago

Hello @jfzhang95,

In Loss.py, I noticed that the loss is averaged over the batch, as below:

`if self.batch_average: loss /= n`

However, I think the loss has already been averaged by setting `size_average=True`. Is there any specific concern behind this?
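For context, here is a minimal sketch (not the repo's exact code; the criterion and shapes below are assumed) showing that `reduction='mean'`, the modern equivalent of `size_average=True`, already averages over every pixel in the batch, so the extra division by `n` makes the value `n` times smaller:

```python
import torch
import torch.nn as nn

# hypothetical batch size, class count, and spatial size
n, c, h, w = 4, 21, 8, 8
logit = torch.randn(n, c, h, w)
target = torch.randint(0, c, (n, h, w))

# reduction='mean' averages over all n*h*w pixels, like size_average=True did
criterion = nn.CrossEntropyLoss(reduction='mean')
loss = criterion(logit, target)

print(loss.item())          # per-pixel average
print((loss / n).item())    # what batch_average=True adds on top: n times smaller
```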

In addition, when `batch_average` is True, the model has trouble converging in my project. I suspect the reason is that the loss becomes far too small.

Thanks in advance.

dhpollack commented 5 years ago

I noticed this as well. Also, the `size_average` parameter has been deprecated in favor of `reduction="mean"`.
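For reference, a hedged sketch of how the criterion could be built with the non-deprecated argument (the `make_criterion` helper and its defaults are hypothetical, not the repo's actual code):

```python
import torch.nn as nn

def make_criterion(size_average=True, weight=None, ignore_index=255):
    # reduction='mean' replaces size_average=True; 'sum' replaces size_average=False
    reduction = 'mean' if size_average else 'sum'
    return nn.CrossEntropyLoss(weight=weight,
                               ignore_index=ignore_index,
                               reduction=reduction)
```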