vacancy / Synchronized-BatchNorm-PyTorch

Synchronized Batch Normalization implementation in PyTorch.
MIT License

`np.fmax(npa, 1e-5)` does not consider that `npa` may be negative #13

Closed. acgtyrant closed this issue 6 years ago.

acgtyrant commented 6 years ago

See https://github.com/vacancy/Synchronized-BatchNorm-PyTorch/blob/master/sync_batchnorm/unittest.py#L28

If `npa` is entirely negative, then `np.fmax(npa, 1e-5)` returns 1e-5 even when the absolute value of `npa` is greater than 1e-5.
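
A minimal sketch of the failure mode, assuming `np.fmax(npa, 1e-5)` is used as the denominator of a relative-error computation (the surrounding expression is illustrative, not the repository's exact code):

```python
import numpy as np

npa = np.array([-10.0])    # reference values, entirely negative
npb = np.array([-10.001])  # nearly identical values under test

# Element-wise fmax clamps anything below 1e-5 up to 1e-5,
# including large-magnitude negative values such as -10.0.
denom = np.fmax(npa, 1e-5)                  # -> array([1.e-05]), not 10.0
relerr = (np.abs(npa - npb) / denom).max()
print(relerr)                               # ~100.0 for a truly tiny difference
```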

acgtyrant commented 6 years ago

By the way, is `np.fmax(npa, 1e-5)` there to handle the case where `npa` is all zeros?

vacancy commented 6 years ago

Thanks for the report. I think it should be `np.fmax(np.abs(npa), 1e-5).max()`. I will fix it later.
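
For reference, a hedged sketch of the corrected denominator inside an `assertTensorClose`-style relative-error check (the helper name and surrounding code are assumptions; only the `np.fmax(np.abs(npa), 1e-5)` expression comes from the comment above). With `np.abs` applied first, the 1e-5 floor only kicks in near zero, which also answers the earlier question about all-zero `npa`:

```python
import numpy as np

def relative_error(npa, npb, eps=1e-5):
    # Hypothetical helper: clamp the *magnitude* of the reference, so the
    # eps floor guards against division by zero without distorting
    # negative reference values.
    denom = np.fmax(np.abs(npa), eps)
    return (np.abs(npa - npb) / denom).max()

npa = np.array([-10.0])
npb = np.array([-10.001])
print(relative_error(npa, npb))  # ~1e-4, as expected for a close match
```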