mlcommons / training_policies

Issues related to MLPerf™ training policies, including rules and suggested changes
https://mlcommons.org/en/groups/training
Apache License 2.0

data augmentation for resnet #397

Open ChenMinQi opened 3 years ago

ChenMinQi commented 3 years ago

Proposal to consider:

Currently the allowed augmentation for ResNet is just RandomResizedCrop + horizontal flip. I think more sophisticated augmentations should be allowed, for example cutmix/mixup. Here is my reasoning: the cluster's compute budget already includes the compute of the CPU. When training ResNet with TF, adding mixup does not change the training speed, since the extra work happens in the CPU input pipeline rather than on the accelerator. A sketch follows below.
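For concreteness, here is a minimal sketch of batch-level mixup in TensorFlow. The function name `mixup`, the `alpha` value, and the assumption of one-hot labels are illustrative choices, not part of any MLPerf rule or reference implementation:

```python
import tensorflow as tf

def mixup(images, labels, alpha=0.2):
    """Mixup a batch: blend each example with a partner drawn from a shuffled copy.

    images: float tensor [B, H, W, C]; labels: one-hot float tensor [B, num_classes].
    alpha controls the Beta(alpha, alpha) distribution; 0.2 is a common ImageNet choice.
    """
    batch_size = tf.shape(images)[0]
    # Sample one mixing coefficient per example from Beta(alpha, alpha),
    # built from two Gamma draws since tf.random has no direct Beta sampler.
    g1 = tf.random.gamma([batch_size], alpha)
    g2 = tf.random.gamma([batch_size], alpha)
    lam = g1 / (g1 + g2)
    # Shuffle the batch indices to pick mixing partners.
    idx = tf.random.shuffle(tf.range(batch_size))
    lam_img = tf.reshape(lam, [-1, 1, 1, 1])
    lam_lab = tf.reshape(lam, [-1, 1])
    mixed_images = lam_img * images + (1.0 - lam_img) * tf.gather(images, idx)
    mixed_labels = lam_lab * labels + (1.0 - lam_lab) * tf.gather(labels, idx)
    return mixed_images, mixed_labels
```

Applied as a `dataset.batch(...).map(mixup)` step in a `tf.data` pipeline, this runs on the host CPU alongside the existing decode/crop/flip work, which is why (in my measurements) it adds no accelerator time.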