Closed: fnzhan closed this issue 4 years ago
Yes, the losses can become extremely large (up to 10^6 or 10^8).
The value of the individual loss in GANs (especially for WGAN) has no meaning.
You may focus on the w_dist instead
(I should have logged it on the command line as well, though).
As long as its value trends downward, the model is training correctly.
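For reference, here is a minimal PyTorch-style sketch (not the repo's actual logging code; the function name and tensors are illustrative) of the Wasserstein-distance estimate that is worth monitoring:

```python
# Minimal sketch, assuming a WGAN critic: the quantity to monitor is the
# gap E[D(real)] - E[D(fake)], which estimates the Wasserstein distance.
import torch

def w_dist_estimate(d_real: torch.Tensor, d_fake: torch.Tensor) -> float:
    """Return the estimated Wasserstein distance from critic outputs."""
    return (d_real.mean() - d_fake.mean()).item()

# Toy example: in a healthy run this value trends downward over training.
d_real = torch.tensor([2.0, 3.0, 4.0])  # critic scores on real images
d_fake = torch.tensor([0.0, 1.0, 2.0])  # critic scores on generated images
print(w_dist_estimate(d_real, d_fake))  # 2.0
```

Unlike the raw generator/discriminator losses, this gap is (roughly) comparable across iterations, which is why it is a better training signal.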
You may also look at the gp_slopes
(in the histogram).
After some warmup iterations, the distribution of the gradient slopes should look consistent across iterations.
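A hedged sketch of how such gradient-penalty slopes are typically computed in WGAN-GP (the function and critic here are illustrative, not the repo's code): the slopes are the gradient norms of the critic at random interpolates between real and fake samples, and after warmup they should concentrate near 1.

```python
# Minimal WGAN-GP-style sketch, assuming a PyTorch critic that maps
# a batch of inputs to one score per sample.
import torch

def gp_slopes(critic, real: torch.Tensor, fake: torch.Tensor) -> torch.Tensor:
    """Gradient norms ("slopes") of the critic at random interpolates."""
    alpha = torch.rand(real.size(0), 1)                       # per-sample mix
    interp = (alpha * real + (1 - alpha) * fake).requires_grad_(True)
    out = critic(interp)
    grads = torch.autograd.grad(out.sum(), interp, create_graph=True)[0]
    return grads.flatten(1).norm(2, dim=1)                    # one slope/sample

# Toy linear critic D(x) = x @ w, so every slope equals ||w|| = 5.
w = torch.tensor([[3.0], [4.0]])
critic = lambda x: x @ w
slopes = gp_slopes(critic, torch.randn(8, 2), torch.randn(8, 2))
print(slopes)  # all entries are 5.0
```

Plotting these per-sample slopes as a histogram over iterations is what lets you check that the distribution stabilizes.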
As far as I remember, you can start to get reasonable results at around 300~500 epochs, and the images shown in the paper were obtained from a model trained for roughly 2000 epochs. The number of epochs may look crazy, but since the resolution of both the generator and the discriminator is low, and we only have 30k images, it is actually reasonable.
Lastly, due to some unfortunate events, the checkpoint was damaged, and I probably won't release an official one.
Hi @hubert0527, I changed the panorama training set and left all other settings unchanged. The loss becomes very large, as shown in the figure. Do you have any idea why? Also, do you have any plans to provide a pre-trained model on Matterport3D? Looking forward to your response.