Closed — dypromise closed this issue 4 years ago
This can happen sometimes... It also happens to me. Your model will eventually converge as the learning rate decays (after epochs 50 and 100 in the default settings). It happens because the initial learning rate is relatively high, but that high initial learning rate is needed to get good final results.
You can try to decay the learning rate earlier (like in epochs 25 & 75) to make the model converge faster. This can be changed here: https://github.com/royorel/Lifespan_Age_Transformation_Synthesis/blob/29f5afade1376ef54ce446d11497c22a888d28f9/options/train_options.py#L25
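The earlier-decay suggestion above amounts to a step schedule where the learning rate drops at fixed milestone epochs. A minimal sketch of that idea in plain Python (the base learning rate, milestones, and decay factor here are illustrative assumptions, not the repo's actual option values):

```python
def lr_at_epoch(epoch, base_lr=2e-4, milestones=(25, 75), gamma=0.1):
    """Step decay: multiply the learning rate by gamma at each milestone epoch.

    base_lr, milestones, and gamma are placeholder values for illustration;
    the actual defaults live in options/train_options.py.
    """
    lr = base_lr
    for milestone in milestones:
        if epoch >= milestone:
            lr *= gamma
    return lr

# Before the first milestone the lr stays at base_lr,
# then drops by 10x at epoch 25 and again at epoch 75.
print(lr_at_epoch(10))  # 0.0002
print(lr_at_epoch(30))  # 2e-05
```

Moving the milestones earlier (25 and 75 instead of 50 and 100) trades some of the high-lr exploration phase for faster convergence.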
Hi, royorel! Thank you for sharing this excellent work! I want to reproduce the pretrained model's results, but I've encountered some issues. I use the FFHQ 1024x1024 data, aligned by the method in the 'download_ffhq_aging' script, and produced the parsing images with your deeplab script. I train the model on the female data, using the parsings to mask out the background, but it doesn't converge at all before epoch 50: loss_D_real and loss_D_fake decrease to 0 quickly. What do you think the reason is? Hoping for your suggestions. The loss and training images: