cc-hpc-itwm / UpConv

Repo for our CVPR Paper: Watch your Up-Convolution: CNN Based Generative Deep Neural Networks are Failing to Reproduce Spectral Distributions
GNU General Public License v3.0

About back propagation of loss_freq #5

Closed hjbiao09 closed 4 years ago

hjbiao09 commented 4 years ago

Hello. Using a spectral loss is a nice idea, but I have a question about your code. The spectral loss should be connected to the generator, yet in train_spectrum.py the line "img_numpy = gen_imgs[t,:,:,:].cpu().detach().numpy()" prevents backpropagation. Can you explain how the spectral loss works in your code?

RicardDurall commented 4 years ago

Hello, the idea is that we use the spectral information to influence the generator. To do that, we introduce a new term called loss_freq, which measures how similar the two spectrum curves are. Once we know this value, we use it as a weighting factor in the update step. Note that we do not backpropagate through the FFT (that is why the detach is there).
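
For anyone else reading this thread, here is a minimal sketch of the pattern being described: the spectral profile is computed in numpy outside the autograd graph (hence the `.detach()`), and the resulting `loss_freq` is used only as a scalar weight on the regular generator loss. The helper names (`azimuthal_average`, `spectral_profile`, `generator_step`) and the exact distance used for `loss_freq` are illustrative assumptions, not the repo's actual code.

```python
import numpy as np
import torch

def azimuthal_average(magnitude):
    """Radially average a 2D spectrum magnitude into a 1D profile.
    (Hypothetical helper; the repo's own implementation may differ.)"""
    h, w = magnitude.shape
    y, x = np.indices((h, w))
    r = np.hypot(x - w // 2, y - h // 2).astype(int)
    total = np.bincount(r.ravel(), weights=magnitude.ravel())
    count = np.bincount(r.ravel())
    return total / np.maximum(count, 1)

def spectral_profile(img):
    """1D spectrum profile of one image (numpy, no gradients involved)."""
    gray = img.mean(axis=0)                       # average over channels
    fft = np.fft.fftshift(np.fft.fft2(gray))
    return azimuthal_average(np.abs(fft))

def generator_step(gen_imgs, real_profile, g_loss_adv):
    """Weight the adversarial generator loss by a spectrum mismatch term."""
    profiles = []
    for t in range(gen_imgs.shape[0]):
        # Detach: loss_freq is computed outside the autograd graph.
        img_numpy = gen_imgs[t, :, :, :].cpu().detach().numpy()
        profiles.append(spectral_profile(img_numpy))
    fake_profile = np.mean(profiles, axis=0)
    # Distance between the two spectrum curves (illustrative choice).
    loss_freq = float(np.mean((fake_profile - real_profile) ** 2))
    # loss_freq is a plain scalar, so gradients never flow through the FFT;
    # it only re-scales the ordinary generator loss.
    return g_loss_adv * (1.0 + loss_freq)
```

The key point of the exchange is visible here: because `loss_freq` is a detached scalar, the FFT itself is never differentiated; the generator only feels the spectral mismatch through the re-weighted update step.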

hjbiao09 commented 4 years ago

Thank you for the quick reply. Now I understand how it works.

RicardDurall commented 4 years ago

You are welcome! Please close the issue if you think it is solved. Thank you.