Dok11 closed this 3 years ago
Am I right that dataset_aug_prob is for both the generator and the discriminator, while aug_prob is for the discriminator only? And why does dataset_aug_prob default to 0? Is that the recommended value?
The recommendation from the following paper is to use differentiable augmentations.
S. Zhao, Z. Liu, J. Lin, J.-Y. Zhu, and S. Han. Differentiable augmentation for data-efficient GAN training. CoRR, abs/2006.10738, 2020.
From the abstract:
The performance of generative adversarial networks (GANs) heavily deteriorates given a limited amount of training data. This is mainly because the discriminator is memorizing the exact training set. To combat it, we propose Differentiable Augmentation (DiffAugment), a simple method that improves the data efficiency of GANs by imposing various types of differentiable augmentations on both real and fake samples. Previous attempts to directly augment the training data manipulate the distribution of real images, yielding little benefit; DiffAugment enables us to adopt the differentiable augmentation for the generated samples, effectively stabilizes training, and leads to better convergence.
If it's so helpful, why is the default value zero? https://github.com/lucidrains/lightweight-gan/blob/main/lightweight_gan/cli.py#L100
@Dok11 what woc is saying is that the new paper finds dataset augmentations are not helpful
We should be using the differentiable one (the one you have been working on) as much as possible
Thank you, now that's clear! =)
So the dataset one augments the real images non-differentiably at load time, while the other augments everything going into the discriminator, differentiably, generated or not
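To make the distinction concrete, here is a minimal stdlib-only sketch of where the two probabilities act in a training loop. The function and image representations are hypothetical stand-ins, not the library's actual API: in the real PyTorch implementation the DiffAugment ops are tensor operations that stay on the autograd graph, which is what lets gradients flow back to the generator through the augmented fakes.

```python
import random

def diff_augment(batch, prob, rng=random):
    """Stand-in for a differentiable augmentation (aug_prob): with
    probability `prob`, apply a brightness shift to every image in the
    batch. DiffAugment proper uses color/translation/cutout as tensor
    ops, so gradients pass through them."""
    if rng.random() < prob:
        shift = rng.uniform(-0.2, 0.2)
        return [[p + shift for p in img] for img in batch]
    return batch

def load_reals(images, dataset_aug_prob, rng=random):
    """Stand-in for dataset_aug_prob: a non-differentiable augmentation
    done once at load time, on the real images only (here a horizontal
    flip). Per the paper, this 'yields little benefit' -- hence the
    default of 0."""
    return [img[::-1] if rng.random() < dataset_aug_prob else img
            for img in images]

def discriminator_inputs(reals, fakes, aug_prob, rng=random):
    """aug_prob is applied to BOTH real and generated samples, just
    before the discriminator sees them."""
    return (diff_augment(reals, aug_prob, rng),
            diff_augment(fakes, aug_prob, rng))
```

The key structural point is the last function: the differentiable augmentation sits inside the training step and touches both streams, whereas the dataset augmentation only ever touches the reals, outside the computation graph.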