sumorday opened this issue 2 years ago
Likely newer versions of PyTorch forbid a negative learning rate; that's the problem.
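If that is the case, one minimal workaround sketch (using a hypothetical `nn.Linear` stand-in, since the real `netD` isn't shown in this thread): keep `lr` positive and flip the sign of the discriminator loss instead, which gives the same update as a negative learning rate.

```python
import torch
from torch import nn, optim

# Hypothetical tiny discriminator standing in for the real netD
netD = nn.Linear(4, 1)

# Recent PyTorch versions reject a negative lr at construction time:
try:
    optim.Adam(netD.parameters(), lr=-1e-4, betas=(0.5, 0.999))
    negative_lr_ok = True
except ValueError:
    negative_lr_ok = False  # raises "Invalid learning rate: -0.0001"

# Workaround: keep lr positive and negate the loss before backward().
# Minimizing -loss produces the same parameter update as gradient
# ascent on loss, which is what the negative lr was meant to do.
optimizerD = optim.Adam(netD.parameters(), lr=1e-4, betas=(0.5, 0.999))
x = torch.randn(8, 4)
loss = netD(x).mean()
optimizerD.zero_grad()
(-loss).backward()
optimizerD.step()
```

Alternatively, recent PyTorch releases accept `maximize=True` in `optim.Adam`, which flips the update direction without touching the loss; check that your installed version supports it before relying on it.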
On Tue, 21 Sep 2021, 02:12, sumorday wrote:
Hey, when I use a negative learning rate I get a bug: "invalid learning rate". What happened?

> Yup, just added a minus to the learning rate, and that's where you get adversarial training :)

```python
optimizerD = optim.Adam(netD.parameters(), lr=-1e-4, betas=(0.5, 0.999))
```

In the discriminator we need to do gradient descent, right? But an error happened.
One thing that calls for special attention: the transform was changed from the original code:

```python
transform = transforms.Compose([
    transforms.Pad(2),
    transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
])
```
to:

```python
transform = transforms.Compose([
    transforms.Pad(2),
    transforms.ToTensor(),
    transforms.Normalize([0.5], [0.5]),
])
```
Lmao, so sad. Then what can I do?