Currently the augmentation for ResNet is just RandomCropResize + Flip. I think more complex augmentations should be allowed, for example CutMix/Mixup.
Here is my reasoning: the computing power of the cluster already includes the computing power of the CPU, and augmentations like Mixup run in the CPU-side input pipeline rather than on the accelerator. In practice, when training ResNet with TF, adding Mixup does not change the training speed.
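For reference, here is a minimal sketch of how Mixup could be added to a batched tf.data pipeline, assuming one-hot labels; the function name `mixup` and the hyperparameter `alpha` are illustrative (alpha = 0.2 is a common choice):

```python
import tensorflow as tf

def mixup(images, labels, alpha=0.2):
    """Mix each example with a randomly shuffled partner from the same batch."""
    batch_size = tf.shape(images)[0]
    # Sample one mixing coefficient per example from Beta(alpha, alpha),
    # built from two Gamma draws since tf.random has no Beta sampler.
    gamma_1 = tf.random.gamma([batch_size], alpha)
    gamma_2 = tf.random.gamma([batch_size], alpha)
    lam = gamma_1 / (gamma_1 + gamma_2)
    lam_img = tf.reshape(lam, [-1, 1, 1, 1])
    lam_lab = tf.reshape(lam, [-1, 1])
    # Convex combination of each example with its shuffled partner.
    indices = tf.random.shuffle(tf.range(batch_size))
    mixed_images = lam_img * images + (1.0 - lam_img) * tf.gather(images, indices)
    mixed_labels = lam_lab * labels + (1.0 - lam_lab) * tf.gather(labels, indices)
    return mixed_images, mixed_labels

# Applied after batching, so the mixing runs on the CPU input pipeline,
# overlapped with the accelerator step:
# dataset = dataset.batch(256).map(mixup, num_parallel_calls=tf.data.AUTOTUNE)
```

Because the map runs asynchronously on the CPU and is prefetched ahead of the training step, it adds no accelerator work, which is why the measured step time stays the same.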