Closed · VeloDC closed this issue 4 years ago
I’m so sorry for the late response and the inconvenience. I had time to look back at my code and the paper, and the 1e-5 might just be a typo on my part. Could you, just in case, test a learning rate of 2e-4 (the default value) instead of 1e-5?
@VeloDC Did you reproduce the results? I still couldn't reproduce them with lr=2e-4.
@PhoneSix Sorry for the late reply. Yes, we used 2e-4 and reproduced the results. Visually, the transferred images were sometimes a bit different, but with 2e-4 and 10+10 epochs we obtained the same mAP numbers.
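For reference, the run that reproduced looked roughly like the sketch below. The dataset path and experiment name are placeholders, and the epoch flags differ between versions of pytorch-CycleGAN-and-pix2pix (older releases use `--niter`/`--niter_decay` instead of `--n_epochs`/`--n_epochs_decay`):

```python
# Hypothetical invocation of pytorch-CycleGAN-and-pix2pix training,
# shown via subprocess for illustration; run from the repo root.
import subprocess

subprocess.run([
    "python", "train.py",
    "--dataroot", "./datasets/voc2clipart",   # placeholder dataset path
    "--name", "voc2clipart_cyclegan",         # placeholder experiment name
    "--model", "cycle_gan",
    "--lr", "2e-4",             # the default; the paper's 1e-5 appears to be a typo
    "--n_epochs", "10",         # 10 epochs at constant lr ...
    "--n_epochs_decay", "10",   # ... then 10 epochs of linear decay to zero
], check=True)
```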
@VeloDC Thanks!
I am trying to replicate your DT step (VOC2<clipart/comic/watercolor>) for my research, but the quality of the transferred images I obtain from CycleGAN is substantially inferior to that of the ones you provide, which impacts the final results.
I am using this implementation of CycleGAN, which should be the same one you used according to the README.
I am using the full trainval sets of VOC2007 + VOC2012 (trainA) and one of clipart1k, comic2k, and watercolor2k (trainB).
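Concretely, I build the unaligned dataset layout that pytorch-CycleGAN-and-pix2pix expects (`datasets/<name>/trainA` and `trainB`) along these lines; the paths below are hypothetical, adjust them to your local download locations:

```python
# Sketch of the dataset preparation, assuming VOC and clipart1k have
# already been downloaded; only trainval images are copied into trainA.
import shutil
from pathlib import Path

def voc_trainval_images(voc_root):
    """Image paths listed in a VOC split's ImageSets/Main/trainval.txt."""
    voc_root = Path(voc_root)
    ids = (voc_root / "ImageSets" / "Main" / "trainval.txt").read_text().split()
    return [voc_root / "JPEGImages" / f"{i}.jpg" for i in ids]

def fill(dst, images):
    dst.mkdir(parents=True, exist_ok=True)
    for img in images:
        shutil.copy(img, dst / img.name)

root = Path("datasets/voc2clipart")
# trainA: trainval images of VOC2007 + VOC2012
fill(root / "trainA",
     voc_trainval_images("VOCdevkit/VOC2007")
     + voc_trainval_images("VOCdevkit/VOC2012"))
# trainB: one of clipart1k / comic2k / watercolor2k
fill(root / "trainB", Path("clipart/JPEGImages").glob("*.jpg"))
```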
I follow the hyperparameters reported in the paper: I run CycleGAN for 20 epochs (10 at a constant learning rate + 10 with linear decay to zero) with an initial learning rate of 1e-5 using the Adam optimizer. All other hyperparameters are left at their default values.
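By "10 + 10" I mean a constant learning rate for the first 10 epochs, then linear decay to zero over the last 10. This is a paraphrased sketch of the `LambdaLR` schedule that codebase builds, not the exact source; the dummy optimizer is only there to make it runnable:

```python
import torch
from torch.optim import Adam
from torch.optim.lr_scheduler import LambdaLR

n_epochs, n_epochs_decay, base_lr = 10, 10, 1e-5  # values described above

# Dummy one-parameter optimizer, just to attach the scheduler to.
optimizer = Adam([torch.zeros(1, requires_grad=True)], lr=base_lr)

def lambda_rule(epoch):
    # 1.0 (i.e. constant base_lr) for the first n_epochs, then a
    # linearly shrinking factor over the following n_epochs_decay epochs.
    return 1.0 - max(0, epoch + 1 - n_epochs) / float(n_epochs_decay + 1)

scheduler = LambdaLR(optimizer, lr_lambda=lambda_rule)
```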
Perhaps I'm missing something, maybe I'm using the wrong datasets or the wrong set of hyperparameters. Are the steps I described consistent with yours?