facebookresearch / pytorch_GAN_zoo

A mix of GAN implementations including progressive growing
BSD 3-Clause "New" or "Revised" License

GDPPLoss can become NaN #89

Closed Quasimondo closed 4 years ago

Quasimondo commented 5 years ago

It's probably a rare edge case, but I noticed that in the normalization function a division by zero can occur if minV == maxV. In that case the returned loss becomes NaN and will probably poison the rest of the model, even if it only happens once.
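For illustration, a minimal standalone reproduction of the failure (the tensor values here are made up, just to trigger the edge case):

import torch

# If all eigenvalues are identical, min == max and the current
# normalization computes 0 / 0, which yields NaN.
eigVals = torch.full((4,), 0.5)
minV, maxV = torch.min(eigVals), torch.max(eigVals)
print((eigVals - minV) / (maxV - minV))  # tensor([nan, nan, nan, nan])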

So I'd propose adding a safeguard (I tried adding a small epsilon first, but that led to wrong results):

import torch

def normalize_min_max(eigVals):
    # Guard against the degenerate case where all eigenvalues are equal,
    # which would otherwise divide zero by zero and return NaN.
    minV, maxV = torch.min(eigVals), torch.max(eigVals)
    if minV == maxV:
        return eigVals
    return (eigVals - minV) / (maxV - minV)
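A quick sanity check of the safeguarded version (hypothetical inputs, using the function defined above):

print(normalize_min_max(torch.tensor([1.0, 2.0, 3.0])))  # tensor([0.0000, 0.5000, 1.0000])
print(normalize_min_max(torch.full((3,), 2.0)))          # tensor([2., 2., 2.]) -- constant input passes through, no NaN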
Molugan commented 4 years ago

Sorry for the delay. Good point indeed, I should make a PR.