Closed kai-wen-yang closed 2 years ago
The paper says that the batch size for CIFAR is 128, but the code initializes the batch size to 64:
https://github.com/LiJunnan1992/DivideMix/blob/d9d3058fa69a952463b896f84730378cdee6ec39/Train_cifar.py#L17
Since no training scripts are provided, I am confused: does 128 mean two batches of 64 after augmentation? Should I set --batch_size 128 when I train on CIFAR?
Hi, you can use batch_size=64, because each batch contains 64 labeled samples and 64 unlabeled samples, making the total batch size 128. Thanks
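To make the arithmetic concrete, here is a minimal PyTorch sketch of how a batch_size=64 setting yields 128 samples per training step when a labeled and an unlabeled loader are drawn together. The datasets below are random stand-ins, not DivideMix's actual CIFAR loaders (which live in its dataloader_cifar module), so this only illustrates the batch-size bookkeeping, not the real pipeline:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical stand-ins for the labeled/unlabeled CIFAR splits.
labeled_set = TensorDataset(
    torch.randn(256, 3, 32, 32), torch.randint(0, 10, (256,))
)
unlabeled_set = TensorDataset(torch.randn(256, 3, 32, 32))

batch_size = 64  # the value passed via --batch_size

labeled_loader = DataLoader(labeled_set, batch_size=batch_size, shuffle=True)
unlabeled_loader = DataLoader(unlabeled_set, batch_size=batch_size, shuffle=True)

# One training step draws one batch from each loader.
x_labeled, y_labeled = next(iter(labeled_loader))
(x_unlabeled,) = next(iter(unlabeled_loader))

# 64 labeled + 64 unlabeled = 128 samples processed per step.
inputs = torch.cat([x_labeled, x_unlabeled], dim=0)
print(inputs.shape[0])  # 128
```

So the paper's "batch size 128" is the combined count per step, while the --batch_size flag controls each half.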
Thanks for the reply.