dansuh17 / segan-pytorch

SEGAN pytorch implementation https://arxiv.org/abs/1703.09452
GNU General Public License v3.0

D loss is very low but G loss is very high #32

Open XCYu-0903 opened 3 years ago

XCYu-0903 commented 3 years ago

Around epoch 4 and beyond, the loss of D dropped below 0.001 while the loss of G stayed very high, around 100. I don't know whether the cause is that I changed the batch_size to 32. I'd appreciate your answers, thanks!

hajiejue commented 3 years ago

> Around epoch 4 and beyond, the loss of D dropped below 0.001 while the loss of G stayed very high, around 100. I don't know whether the cause is that I changed the batch_size to 32. I'd appreciate your answers, thanks!

Have you solved this problem? I have the same problem. My batch_size is 80.
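For context on why the two losses can sit on such different scales: the SEGAN paper uses LSGAN objectives, and its generator loss adds an L1 reconstruction penalty weighted by λ = 100, so a G loss near 100 can simply mean the L1 term dominates rather than that training diverged. Below is a minimal sketch of those two losses (the function names, toy tensors, and the exact reduction are my own illustration, not this repo's code):

```python
import torch

def d_loss_lsgan(d_real, d_fake):
    # LSGAN discriminator loss: push D(clean) toward 1, D(enhanced) toward 0.
    # A value near 0 means D separates real from enhanced almost perfectly.
    return 0.5 * ((d_real - 1) ** 2).mean() + 0.5 * (d_fake ** 2).mean()

def g_loss_lsgan(d_fake, enhanced, clean, l1_weight=100.0):
    # LSGAN generator loss plus the L1 reconstruction penalty.
    # SEGAN weights the L1 term by lambda = 100, so early in training
    # this term alone can put the G loss in the tens or hundreds.
    adv = 0.5 * ((d_fake - 1) ** 2).mean()
    l1 = (enhanced - clean).abs().mean()
    return adv + l1_weight * l1

# Toy tensors standing in for discriminator outputs and 1-second waveforms.
torch.manual_seed(0)
d_real = torch.full((8, 1), 0.9)       # D confident on clean speech
d_fake = torch.full((8, 1), 0.1)       # D confident enhanced speech is fake
enhanced = torch.rand(8, 16384)
clean = torch.rand(8, 16384)

print(d_loss_lsgan(d_real, d_fake).item())          # small: D dominates
print(g_loss_lsgan(d_fake, enhanced, clean).item()) # large: L1 term dominates
```

If the adversarial part of the G loss (the `adv` term alone) is also exploding, that would instead point to the discriminator overpowering the generator; common mitigations are lowering D's learning rate or updating D less often per G step. Logging the adversarial and L1 components separately is the quickest way to tell the two situations apart.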