dakami opened this issue 7 years ago
Item 3 seems to be the reason: you don't have to train the model for all 5,000 epochs to get decent results.
Maybe update the docs? What's a good epoch count for this dataset?
I'm training the model on the celebA data and it takes about 12 minutes for a single epoch. I have a GeForce GTX 1050 (4 GB).
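For scale, a quick back-of-the-envelope calculation shows why the default epoch count matters at that speed (the numbers are just the ones reported above):

```python
# Rough training-time estimate at the reported speed.
minutes_per_epoch = 12      # observed on a GTX 1050 4GB
epochs = 5000               # the default in the face-training setup

total_minutes = minutes_per_epoch * epochs
total_hours = total_minutes / 60
total_days = total_hours / 24

print(f"{total_hours:.0f} hours (~{total_days:.0f} days)")
```

At 12 minutes per epoch, the full 5,000 epochs would take about 1,000 hours, roughly 41 days, so stopping far earlier is the practical option.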
Training faces is set up for 5,000 epochs, which on my Pascal Titan X requires a few minutes per epoch. There are three things I can imagine have gone wrong:
1 (and most likely): The dataset I'm using comes from the Google Drive link and has more files than you were training against.
2: The images in the dataset need to be downsampled.
3: The epoch default is too high.
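For point 2, a minimal sketch of downsampling a folder of images before training, assuming Pillow is installed (the directory paths and target size are placeholders, not part of this project):

```python
import os
from PIL import Image

def downsample_images(src_dir, dst_dir, size=(64, 64)):
    """Resize every JPEG/PNG in src_dir to `size` and save into dst_dir."""
    os.makedirs(dst_dir, exist_ok=True)
    for name in os.listdir(src_dir):
        if not name.lower().endswith((".jpg", ".jpeg", ".png")):
            continue  # skip non-image files
        with Image.open(os.path.join(src_dir, name)) as img:
            # LANCZOS gives good quality when shrinking
            img.convert("RGB").resize(size, Image.LANCZOS).save(
                os.path.join(dst_dir, name))
```

Running this once over the celebA folder and pointing training at the downsampled copy should cut the per-epoch time considerably.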
Or maybe something else?