yxgeee / MMT

[ICLR-2020] Mutual Mean-Teaching: Pseudo Label Refinery for Unsupervised Domain Adaptation on Person Re-identification.
https://yxgeee.github.io/projects/mmt
MIT License

About GPUs and batch size #41

Closed CCA8290 closed 3 years ago

CCA8290 commented 3 years ago

Hello there, you mentioned that 16 images per GPU works better. When I only use one GPU for training, how should I set the batch size and the number of instances?
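
For reference, a minimal sketch of how the batch could be scaled down to one GPU, assuming the `examples/mmt_train_kmeans.py` entry point and the `-b` / `--num-instances` flag names from the repo's example scripts; the value of 4 instances per identity is an assumption, not something confirmed in this thread:

```sh
# Hedged sketch for single-GPU training: keep the per-GPU load at the
# suggested 16 images by setting the total batch size to 16.
# Script path and flag names (-b, --num-instances) are assumptions taken
# from the repo's example scripts; 4 instances per identity is an assumed
# setting, which would give 16 / 4 = 4 identities per mini-batch.
CUDA_VISIBLE_DEVICES=0 \
python examples/mmt_train_kmeans.py -dt market1501 -a resnet50 \
    -b 16 --num-instances 4
```

The idea behind this sketch is simply to keep the per-GPU image count constant: if 4 GPUs use a total batch of 64 (16 per GPU), one GPU would use 16, with the instance count left so that batch size remains divisible by it.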