LiJunnan1992 / DivideMix

Code for paper: DivideMix: Learning with Noisy Labels as Semi-supervised Learning
MIT License

About batch size on CIFAR #42

Closed kai-wen-yang closed 2 years ago

kai-wen-yang commented 2 years ago

The paper says the batch size for CIFAR is 128, but the code initializes the batch size to 64:

https://github.com/LiJunnan1992/DivideMix/blob/d9d3058fa69a952463b896f84730378cdee6ec39/Train_cifar.py#L17

Since no training scripts are provided, I am confused: does 128 mean two augmented batches of 64? Should I set --batch_size 128 when I train on CIFAR?

LiJunnan1992 commented 2 years ago

Hi, you can use batch_size=64, because each batch contains 64 labeled samples and 64 unlabeled samples, which makes the total size 128. Thanks.
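
To make the arithmetic concrete, here is a minimal sketch (with stand-in lists rather than DivideMix's actual data loaders) of why `--batch_size 64` produces an effective batch of 128 in a semi-supervised training step:

```python
# Hypothetical illustration, not DivideMix's actual code: each training
# iteration draws one labeled batch and one unlabeled batch of equal size,
# so the effective batch seen by the model is 2 * batch_size.
batch_size = 64  # the default set by --batch_size in Train_cifar.py

labeled_batch = list(range(batch_size))    # stand-in for 64 labeled samples
unlabeled_batch = list(range(batch_size))  # stand-in for 64 unlabeled samples

effective_batch = len(labeled_batch) + len(unlabeled_batch)
print(effective_batch)  # 128
```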

kai-wen-yang commented 2 years ago

Thanks for the reply.