chaiyujin / glow-pytorch

PyTorch implementation of the OpenAI paper "Glow: Generative Flow with Invertible 1×1 Convolutions"
MIT License

Infinite values in generated images #22

Open isharifi opened 4 years ago

isharifi commented 4 years ago

For generating new images, I sample z from a normal distribution with zero mean and 0.6 standard deviation and feed it to the network with reverse=True. But many of the generated images contain plenty of values greater than 1, and even Inf values! How can I handle this issue? What is the problem?
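A minimal sketch of the sampling I'm describing, assuming the model's forward accepts eps_std and reverse keyword arguments (as in the snippets later in this thread); the helper name sample_images is mine:

```python
import torch

def sample_images(glow: torch.nn.Module, std: float = 0.6) -> torch.Tensor:
    # Hypothetical helper; assumes glow(z=None, eps_std=..., reverse=True)
    # draws z ~ N(0, std^2) internally and inverts the flow to images.
    glow.eval()
    with torch.no_grad():
        images = glow(z=None, eps_std=std, reverse=True)
    # Flag any non-finite pixels before clamping/saving.
    if not torch.isfinite(images).all():
        print("warning: generated images contain Inf/NaN values")
    return images
```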

Thanks.

chaiyujin commented 4 years ago

Sorry, I'm not sure what caused the problem. What do the generated images with abnormal values look like?

isharifi commented 4 years ago

Actually, I found the part where the value explosion occurs. It happens in modules.py at line 53, when the input is scaled by torch.exp(logs). The Inf values usually appear around layer 80 during the forward pass (reverse=True). The generated image with negative Inf values then looks something like this (clamped to [0, 1]): [image]

As a result, the gradient in the backward pass becomes Inf too, so training becomes impossible.
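One common mitigation for this kind of overflow, sketched below under the assumption that the scaling looks like x * torch.exp(logs); the clamp bound is my guess, not the repo's code:

```python
import torch

LOGS_BOUND = 5.0  # assumed bound; exp(+/-5) stays far from float32 overflow

def safe_exp_scale(x: torch.Tensor, logs: torch.Tensor) -> torch.Tensor:
    # Clamp the log-scale before exponentiating so torch.exp(logs)
    # can never produce Inf, at the cost of capping the scale range.
    logs = torch.clamp(logs, min=-LOGS_BOUND, max=LOGS_BOUND)
    return x * torch.exp(logs)
```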

karasepid commented 3 years ago

I am getting similarly high values. @isharifi have you been able to resolve the issue?

isharifi commented 3 years ago

> I am getting similarly high values. @isharifi have you been able to resolve the issue?

Unfortunately, the problem has not been resolved. @chaiyujin, do you have any idea?

chaiyujin commented 3 years ago

@isharifi Sorry for the delay; I'm busy with my own project. I think it may be caused by some invalid numerical operation.
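One way to narrow down which operation goes non-finite first (a debugging sketch, not code from this repo) is to hook every submodule and report Inf/NaN outputs by name:

```python
import torch

def watch_for_nonfinite(model: torch.nn.Module) -> None:
    # Register a forward hook on every submodule; every module whose output
    # contains Inf/NaN is reported, so the first line printed during a pass
    # is the earliest offender in execution order.
    def make_hook(name: str):
        def hook(module, inputs, output):
            if torch.is_tensor(output) and not torch.isfinite(output).all():
                print(f"non-finite values produced by: {name}")
        return hook

    for name, module in model.named_modules():
        module.register_forward_hook(make_hook(name))
```

Calling watch_for_nonfinite(glow) before the reverse pass should point at the offending layer (around layer 80, per the report above).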

tenpercent commented 3 years ago

A numerical division-by-zero issue can arise in https://github.com/chaiyujin/glow-pytorch/blob/487a6b149295f4ec4b36e408f63604c593ff2031/glow/models.py#L93 when the sigmoid output contains zero elements.

For me, the following code snippet triggers the division by zero (running unconditional generation):

```python
torch.cuda.manual_seed_all(16)
glow = glow.to('cuda')
glow(reverse=True)
```

Or on CPU:

```python
torch.manual_seed(37)
glow = glow.to('cpu')
glow(reverse=True)
```

I couldn't reproduce it when running conditional generation, though. A possible fix would be to add a small value elementwise to the scale before the division.
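A sketch of that fix, assuming the scale at the linked line comes from a sigmoid (variable names here are placeholders, and the epsilon value is a guess):

```python
import torch

EPS = 1e-6  # assumed magnitude; just large enough to keep the division finite

def reverse_affine_coupling(z: torch.Tensor, shift: torch.Tensor,
                            raw_scale: torch.Tensor) -> torch.Tensor:
    # Offset the sigmoid output so the scale is strictly positive,
    # making the division in the reverse pass safe. The key change
    # versus the linked line is the added + EPS.
    scale = torch.sigmoid(raw_scale + 2.0) + EPS
    return z / scale - shift
```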

isharifi commented 3 years ago

> A numerical division-by-zero issue can arise in
>
> https://github.com/chaiyujin/glow-pytorch/blob/487a6b149295f4ec4b36e408f63604c593ff2031/glow/models.py#L93
>
> when the sigmoid output contains zero elements. For me, the following code snippet triggers the division by zero (running unconditional generation):
>
> ```python
> torch.cuda.manual_seed_all(16)
> glow = glow.to('cuda')
> glow(reverse=True)
> ```
>
> Or on CPU:
>
> ```python
> torch.manual_seed(37)
> glow = glow.to('cpu')
> glow(reverse=True)
> ```
>
> I couldn't reproduce it when running conditional generation, though. A possible fix would be to add a small value elementwise to the scale before the division.

Thanks. I will check if it solves the problem and let you know the result.