backkon opened this issue 3 years ago
I haven't encountered this problem. Here is my log:
[0/20][0/32] Loss_D: 2.2277 Loss_G: 15.5083 D(x): 0.5273 D(G(z)): 0.6209 / 0.0000
[0/20][10/32] Loss_D: 0.2962 Loss_G: 46.8450 D(x): 0.9128 D(G(z)): 0.0000 / 0.0000
[0/20][20/32] Loss_D: 0.0020 Loss_G: 52.9141 D(x): 0.9982 D(G(z)): 0.0000 / 0.0000
[0/20][30/32] Loss_D: 0.2574 Loss_G: 51.1907 D(x): 0.9840 D(G(z)): 0.0000 / 0.0000
[1/20][0/32] Loss_D: 0.0000 Loss_G: 51.1631 D(x): 1.0000 D(G(z)): 0.0000 / 0.0000
[1/20][10/32] Loss_D: 0.0005 Loss_G: 51.2719 D(x): 0.9995 D(G(z)): 0.0000 / 0.0000
[1/20][20/32] Loss_D: 0.0001 Loss_G: 50.5107 D(x): 0.9999 D(G(z)): 0.0000 / 0.0000
[1/20][30/32] Loss_D: 0.0000 Loss_G: 50.6817 D(x): 1.0000 D(G(z)): 0.0000 / 0.0000
[2/20][0/32] Loss_D: 0.0000 Loss_G: 50.4814 D(x): 1.0000 D(G(z)): 0.0000 / 0.0000
[2/20][10/32] Loss_D: 0.0000 Loss_G: 50.3472 D(x): 1.0000 D(G(z)): 0.0000 / 0.0000
[2/20][20/32] Loss_D: 0.0000 Loss_G: 50.2126 D(x): 1.0000 D(G(z)): 0.0000 / 0.0000
Perhaps you can try to run more epochs.
Your log is about the same as mine. Why is D(G(z)) 0.0000 / 0.0000? Shouldn't the second value of D(G(z)) be getting closer and closer to 1?
`D(G(z))` denotes the predicted probability for generated (fake) images, and `D(x)` denotes the predicted probability for real images. The discriminator tries to distinguish real images from fake ones, so `D(x)` should get closer to 1 while `D(G(z))` gets closer to 0. I got the results shown below:
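For reference, here is a minimal sketch of where those logged numbers typically come from in a standard DCGAN training loop with BCE loss (the toy module sizes and names below are illustrative, not this repo's actual models). The two `D(G(z))` values are the discriminator's mean output on fakes, measured once during the discriminator update and again during the generator update:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
nz = 8  # latent vector size (illustrative)

D = nn.Sequential(nn.Linear(16, 1), nn.Sigmoid())  # toy discriminator
G = nn.Sequential(nn.Linear(nz, 16), nn.Tanh())    # toy generator
criterion = nn.BCELoss()
optD = torch.optim.Adam(D.parameters(), lr=2e-4)
optG = torch.optim.Adam(G.parameters(), lr=2e-4)

real = torch.randn(4, 16)   # stand-in for a batch of real images
noise = torch.randn(4, nz)

# --- discriminator step: push D(x) toward 1, D(G(z)) toward 0 ---
optD.zero_grad()
out_real = D(real).view(-1)
D_x = out_real.mean().item()            # the logged D(x)
loss_real = criterion(out_real, torch.ones(4))
fake = G(noise)
out_fake = D(fake.detach()).view(-1)
D_G_z1 = out_fake.mean().item()         # first D(G(z)), during the D step
loss_fake = criterion(out_fake, torch.zeros(4))
loss_D = loss_real + loss_fake          # the logged Loss_D
loss_D.backward()
optD.step()

# --- generator step: push D(G(z)) toward 1 ---
optG.zero_grad()
out_fake2 = D(fake).view(-1)
D_G_z2 = out_fake2.mean().item()        # second D(G(z)), during the G step
loss_G = criterion(out_fake2, torch.ones(4))  # G wants D to say "real"
loss_G.backward()
optG.step()

print(f"Loss_D: {loss_D.item():.4f} Loss_G: {loss_G.item():.4f} "
      f"D(x): {D_x:.4f} D(G(z)): {D_G_z1:.4f} / {D_G_z2:.4f}")
```

Under this reading, a log where `D(x)` sits at 1.0000 and both `D(G(z))` values sit at 0.0000 means the discriminator classifies everything perfectly, so `Loss_D` collapses to 0 and the generator's loss stays very large.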
Wait, I will try to find out what's happening.
Why do I get the log shown below?
The outputs have always been noise. I am running the source code exactly as provided; where is the problem?