I notice that the shuffle buffer size is 10000 and is not exposed in the gin configurations. Since the original ImageNet dataset is sorted by class, such a small buffer means the images are effectively fed to the unconditional GANs with class information (consecutive batches are dominated by a few classes). Could that contradict unconditional training, especially for SSGAN and S3GAN?
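For reference, a minimal tf.data sketch of what I mean, assuming TFRecord shards and placeholder names rather than the repo's actual input pipeline; shuffling the file order and enlarging the buffer is what I'd expect to break the class ordering:

```python
import tensorflow as tf

# Hypothetical value, not the repo's setting: large enough to span many classes.
SHUFFLE_BUFFER_SIZE = 100_000

def make_input_fn(file_pattern, batch_size):
    # Shuffle shard order first so a single class-sorted shard is not read in isolation.
    files = tf.data.Dataset.list_files(file_pattern, shuffle=True)
    dataset = files.interleave(
        tf.data.TFRecordDataset,
        cycle_length=16,
        num_parallel_calls=tf.data.experimental.AUTOTUNE)
    # With ImageNet stored sorted by class (~1300 images per class), a 10000-element
    # buffer only mixes a handful of classes at a time; a much larger buffer
    # decorrelates batches from the class ordering.
    dataset = dataset.shuffle(SHUFFLE_BUFFER_SIZE)
    dataset = dataset.repeat()
    dataset = dataset.batch(batch_size, drop_remainder=True)
    return dataset.prefetch(tf.data.experimental.AUTOTUNE)
```

If the buffer size (or a pre-shuffled/sharded record layout) were configurable via gin, it would be easier to check whether this affects SSGAN/S3GAN results.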