Closed Johnson-yue closed 5 years ago
I had the same error.
The original paper says
loss_g = loss_adv + loss_gdpp
But in this implementation, loss_adv is backwarded first and then loss_gdpp separately. Is that right?
Doing the backward in different places doesn't change the loss: gradients from separate backward() calls accumulate, so the total gradient is the same as backwarding the summed loss once.
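A minimal sketch of that accumulation point (not the repo's actual code; the scalar losses here just stand in for loss_adv and loss_gdpp). Each expression rebuilds its own small graph, so no retain_graph is needed in this toy case:

```python
import torch

w = torch.tensor(3.0, requires_grad=True)

# Two separate backward() calls: gradients accumulate into w.grad.
(w ** 2).backward()   # contributes d/dw w^2 = 2w = 6
(2 * w).backward()    # contributes d/dw 2w  = 2
grad_separate = w.grad.item()  # 6 + 2 = 8.0

# Reset and backward the summed loss once.
w.grad = None
(w ** 2 + 2 * w).backward()
grad_summed = w.grad.item()    # also 8.0

assert grad_separate == grad_summed
```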
Hi, I tried to train PGAN with GDPP on the CIFAR-10 dataset using this command:
But I got this error:
RuntimeError: Trying to backward through the graph a second time, but the buffers have already been freed. Specify retain_graph=True when calling backward the first time
When I checked the code: after backward is called on lossGFake, the computation graph is released. So when backward is then called on the GDPP loss, there is no graph left to traverse, and this error occurs.
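The error above can be reproduced and fixed in a small sketch. This is not the repo's actual code; the two scalar losses below just stand in for lossGFake and the GDPP loss, which share one graph through the generator output. Either keep the graph alive with retain_graph=True on the first backward, or sum the losses and backward once:

```python
import torch

w = torch.tensor(1.0, requires_grad=True)
fake = w * 2.0                 # stands in for the generator output
loss_adv = (fake - 1.0) ** 2   # stands in for lossGFake
loss_gdpp = fake ** 2          # stands in for the GDPP loss

# Fix 1: retain the shared graph for the second backward.
loss_adv.backward(retain_graph=True)
loss_gdpp.backward()           # gradients accumulate in w.grad

# Fix 2 (equivalent total gradient, and cheaper): sum first,
# then a single backward, as in loss_g = loss_adv + loss_gdpp:
# (loss_adv + loss_gdpp).backward()
```

Without retain_graph=True, the second backward() raises exactly the "Trying to backward through the graph a second time" RuntimeError, because the first call frees the buffers of the shared graph.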