SHI-Labs / Semi-Supervised-Transfer-Learning

[CVPR 2021] Adaptive Consistency Regularization for Semi-Supervised Transfer Learning
http://arxiv.org/abs/2103.02193
MIT License

GPU requirements and training time #7

Closed · dongfengxijian closed this issue 3 years ago

dongfengxijian commented 3 years ago

How many GPUs and how much time do I need to train the framework? Thanks

Walleclipse commented 3 years ago

Hi, the default CIFAR-10 experiments need ~5.3 GB of GPU memory, and the default CUB-200 experiments need ~26.3 GB (because of the larger model and larger image size). Two Tesla V100 cards are enough for the experiments. When training from the pre-trained model, an experiment only takes ~3 hours; training from scratch may take more than 10 hours to reach better results.
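If you want to check whether your machine meets these figures before launching a run, a minimal sketch like the one below sums the memory of the visible GPUs with PyTorch and compares it against the numbers quoted above. The thresholds come from this reply; the helper name and script itself are illustrative, not part of the repository.

```python
import torch

# Approximate GPU memory needed (GB) for the default experiments,
# taken from the maintainer's reply above.
REQUIRED_GB = {"cifar10": 5.3, "cub200": 26.3}


def enough_gpu_memory(experiment: str) -> bool:
    """Return True if the combined memory of all visible GPUs covers the experiment."""
    if not torch.cuda.is_available():
        return False
    total_gb = sum(
        torch.cuda.get_device_properties(i).total_memory / 1024**3
        for i in range(torch.cuda.device_count())
    )
    return total_gb >= REQUIRED_GB[experiment]


if __name__ == "__main__":
    for name in REQUIRED_GB:
        status = "fits" if enough_gpu_memory(name) else "does not fit"
        print(f"{name}: {status} on the visible GPUs")
```

Note that the CUB-200 figure is the total across devices, so two 16 GB V100s (32 GB combined) are sufficient when the run is spread over both cards.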

dongfengxijian commented 3 years ago

@Walleclipse Thank you for your reply!