changlin31 / DNA

(CVPR 2020) Block-wisely Supervised Neural Architecture Search with Knowledge Distillation

Do you use Cutout in training and the transfer learning process? #6

Closed 5663015 closed 4 years ago

5663015 commented 4 years ago

Could you show us the details of the data augmentation? I have some trouble reproducing transfer learning and DNA_c. Thank you very much!

jiefengpeng commented 4 years ago

We disabled Cutout in ImageNet training and enabled it in transfer learning. We use AutoAugment for data augmentation when training on ImageNet. More details can be found in: Cubuk, E. D., Zoph, B., Mane, D., Vasudevan, V., and Le, Q. V. AutoAugment: Learning Augmentation Policies from Data. CVPR, 2019. There are also some tips on transfer learning in https://github.com/jiefengpeng/DNA/issues/3#issuecomment-564472641
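For reference, a minimal sketch of such an ImageNet training pipeline, assuming torchvision >= 0.10 (which ships a built-in AutoAugment with the learned ImageNet policy); the 224 crop size is an assumption and this is not necessarily the exact implementation used in the repo:

```python
import torchvision.transforms as T
from torchvision.transforms import AutoAugment, AutoAugmentPolicy

IMAGENET_DEFAULT_MEAN = (0.485, 0.456, 0.406)
IMAGENET_DEFAULT_STD = (0.229, 0.224, 0.225)

# ImageNet training: AutoAugment enabled, Cutout disabled (per this thread).
imagenet_train_transform = T.Compose([
    T.RandomResizedCrop(224),                         # assumed input resolution
    T.RandomHorizontalFlip(),
    AutoAugment(policy=AutoAugmentPolicy.IMAGENET),   # learned ImageNet policy
    T.ToTensor(),
    T.Normalize(IMAGENET_DEFAULT_MEAN, IMAGENET_DEFAULT_STD),
])
```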

5663015 commented 4 years ago

Is AutoAugment also enabled in transfer learning? @jiefengpeng

jiefengpeng commented 4 years ago

Not yet. We use regular augmentations (normalize, random crop, resize, horizontal flip, and color jitter), since the transfer datasets are simple, but AutoAugment may also help in transfer learning. One more thing: make sure the mean and std in normalization are:

IMAGENET_DEFAULT_MEAN = (0.485, 0.456, 0.406)
IMAGENET_DEFAULT_STD = (0.229, 0.224, 0.225)
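A minimal sketch of such a transfer learning pipeline with Cutout enabled, assuming a simple square-patch Cutout applied after normalization; the resize/crop sizes, ColorJitter strengths, and Cutout length below are illustrative assumptions, not the authors' exact hyperparameters:

```python
import torch
import torchvision.transforms as T

IMAGENET_DEFAULT_MEAN = (0.485, 0.456, 0.406)
IMAGENET_DEFAULT_STD = (0.229, 0.224, 0.225)

class Cutout:
    """Zero out one random square patch of side `length` (in pixels).

    Operates on a CHW tensor, so place it after ToTensor()/Normalize().
    """
    def __init__(self, length=16):  # assumed patch size
        self.length = length

    def __call__(self, img):
        _, h, w = img.shape
        y = torch.randint(h, (1,)).item()
        x = torch.randint(w, (1,)).item()
        y1, y2 = max(0, y - self.length // 2), min(h, y + self.length // 2)
        x1, x2 = max(0, x - self.length // 2), min(w, x + self.length // 2)
        img[:, y1:y2, x1:x2] = 0.0
        return img

# Transfer learning: regular augmentations plus Cutout (per this thread).
transfer_train_transform = T.Compose([
    T.Resize(256),                # assumed sizes
    T.RandomCrop(224),
    T.RandomHorizontalFlip(),
    T.ColorJitter(brightness=0.4, contrast=0.4, saturation=0.4),  # assumed strengths
    T.ToTensor(),
    T.Normalize(IMAGENET_DEFAULT_MEAN, IMAGENET_DEFAULT_STD),
    Cutout(length=16),
])
```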