
The official code for the paper "Delving Deep into Label Smoothing", IEEE TIP 2021
https://arxiv.org/abs/2011.12562
MIT License

Online Label Smoothing

The code for the paper "Delving Deep into Label Smoothing"
I have only cleaned up the code for the fine-grained datasets. Since I am not currently at school, I have not tested it, so if you find any bugs, please feel free to contact me (zhangchbin AT gmail DOT com).
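
For context, online label smoothing (OLS) replaces the fixed uniform soft target of standard label smoothing with a per-class soft-label distribution that is accumulated from the model's own predictions on correctly classified samples and refreshed once per epoch. The class below is only a minimal sketch of that idea, not the implementation in this repository; all names are placeholders, and the paper additionally combines this soft-label term with the standard hard-label cross-entropy, which the sketch omits.

```python
import torch
import torch.nn.functional as F

class OnlineLabelSmoothingSketch:
    """Illustrative sketch of online label smoothing (not this repo's code).

    Keeps one soft-label distribution per class, accumulated from the model's
    softmax outputs on correctly classified samples, refreshed every epoch.
    """

    def __init__(self, num_classes, device="cpu"):
        self.num_classes = num_classes
        # Soft labels used as targets during the current epoch; start uniform.
        self.soft_labels = torch.full((num_classes, num_classes),
                                      1.0 / num_classes, device=device)
        # Accumulators that will become the next epoch's soft labels.
        self._sum = torch.zeros(num_classes, num_classes, device=device)
        self._count = torch.zeros(num_classes, device=device)

    def loss(self, logits, targets):
        # Cross-entropy against the soft labels learned in the previous epoch.
        log_probs = F.log_softmax(logits, dim=1)
        soft_targets = self.soft_labels[targets]          # shape [B, C]
        return -(soft_targets * log_probs).sum(dim=1).mean()

    @torch.no_grad()
    def accumulate(self, logits, targets):
        # Record predictions of correctly classified samples, per target class.
        probs = F.softmax(logits, dim=1)
        correct = probs.argmax(dim=1) == targets
        for cls in targets[correct].unique():
            mask = correct & (targets == cls)
            self._sum[cls] += probs[mask].sum(dim=0)
            self._count[cls] += mask.sum()

    @torch.no_grad()
    def next_epoch(self):
        # Turn the accumulators into the soft labels for the next epoch.
        seen = self._count > 0
        self.soft_labels[seen] = self._sum[seen] / self._count[seen].unsqueeze(1)
        self._sum.zero_()
        self._count.zero_()
```

In a training loop, `loss` and `accumulate` would be called on each batch and `next_epoch` at the end of every epoch.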

Citation

@ARTICLE{zhang2021delving,
  author={Zhang, Chang-Bin and Jiang, Peng-Tao and Hou, Qibin and Wei, Yunchao and Han, Qi and Li, Zhen and Cheng, Ming-Ming},
  journal={IEEE Transactions on Image Processing}, 
  title={Delving Deep into Label Smoothing}, 
  year={2021},
  volume={30},
  number={},
  pages={5984-5996},
  doi={10.1109/TIP.2021.3089942}}

Performance

Classification for fine-grained datasets

(Results are shown as an image in the original README.)

Model ensemble on CIFAR-100

(Results are shown as an image in the original README.)

TODO

Requirements

pytorch >= 1.0
torchvision
numpy
tensorboardX
apex
tqdm
efficientnet_pytorch
SAN network

Data Preparation

Download all datasets into the data directory. Note that we modify the train/validation splits of the datasets, as specified by the files in the data directory.
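
The exact split-file format is defined by those files and is not reproduced here. Purely as a hypothetical illustration, a split file listing `relative/image/path.jpg label` pairs could be wrapped in a dataset like the one below; every name in it is made up for the example.

```python
import os
from PIL import Image
from torch.utils.data import Dataset

class SplitListDataset(Dataset):
    """Hypothetical loader for a split file of `path label` lines.

    Purely illustrative; check the files in the data directory for the
    split format actually used by this repository.
    """

    def __init__(self, root, list_file, transform=None):
        self.root = root
        self.transform = transform
        with open(list_file) as f:
            # Each non-empty line: an image path relative to `root`, then an integer label.
            self.samples = [(path, int(label))
                            for path, label in (line.split() for line in f if line.strip())]

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        path, label = self.samples[idx]
        image = Image.open(os.path.join(self.root, path)).convert("RGB")
        if self.transform is not None:
            image = self.transform(image)
        return image, label
```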

Train and Validate

  1. Download the ImageNet-pretrained models to the checkpoint directory:
    MobileNet-v2, ResNet-50, Res2Net

  2. Train the model with online label smoothing:

    CUDA_VISIBLE_DEVICES=1 python main.py \
    --mode train \
    --pretrained_model ./checkpoint/mobilenet_v2-b0353104.pth \
    --epochs 100 \
    --lr 0.01 \
    --arch mobilenetv2 \
    --dataset cub \
    --method ols \
    --batch_size 64

    (Optional) Test the model ensemble performance (see the prediction-averaging sketch after this section):

    python main.py \
    --mode ensemble \
    --ensemble 'runs/mobilenetv2_cub_ols/20.pth' 'runs/mobilenetv2_cub_ols/60.pth' \
    --epochs 100 \
    --lr 0.01 \
    --arch mobilenetv2 \
    --dataset aircraft \
    --method ols \
    --batch_size 64
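
The `--mode ensemble` option combines several checkpoints of the same architecture. One common way to do this is to average their softmax outputs, as in the sketch below; this is only an illustration, not the repository's actual implementation, and `base_model`, the checkpoint format, and the data loader are placeholder assumptions.

```python
import copy
import torch
import torch.nn.functional as F

@torch.no_grad()
def ensemble_accuracy(base_model, checkpoint_paths, loader, device="cuda"):
    """Average the softmax outputs of several checkpoints of one architecture (sketch)."""
    models = []
    for path in checkpoint_paths:
        model = copy.deepcopy(base_model).to(device).eval()
        # Assumption: each .pth file holds a plain state_dict;
        # adapt this to the checkpoint format actually saved by main.py.
        model.load_state_dict(torch.load(path, map_location=device))
        models.append(model)

    correct, total = 0, 0
    for images, targets in loader:
        images, targets = images.to(device), targets.to(device)
        # Average the per-model class probabilities, then take the argmax.
        probs = torch.stack([F.softmax(m(images), dim=1) for m in models]).mean(dim=0)
        correct += (probs.argmax(dim=1) == targets).sum().item()
        total += targets.size(0)
    return correct / total
```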

Train on CIFAR

```
cd cifar
sh train_cifar_imagenetresnet34.sh
sh train_cifar_resnext29_2.sh
```

Other related implementations

Thanks to the re-implementations by Kurumi233 and ankandrew.