lhoyer / MIC

[CVPR23] Official Implementation of MIC: Masked Image Consistency for Context-Enhanced Domain Adaptation

Questions about distributed training #46

Closed RyanQR closed 1 year ago

RyanQR commented 1 year ago

I noticed that your project only uses one GPU. Does your code support distributed training?

Victory8858 commented 1 year ago

I also have the same question, thank you!

lhoyer commented 1 year ago

We only trained MIC on a single GPU and did not use distributed training.

kimkj38 commented 1 year ago

There is a distributed option (args.launcher) in tools/train.py at line 106. Does your code nevertheless not support distributed training?
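For reference, the `--launcher` flag is inherited from the MMSegmentation/mmcv code that MIC builds on; since the authors only trained on a single GPU, the distributed code path is untested here. Below is a minimal, hypothetical sketch (assuming standard mmcv conventions, not MIC's exact tools/train.py) of how such a flag is typically wired up:

```python
# Hypothetical sketch of how a launcher flag is commonly handled in
# MMSegmentation-style train scripts. Names follow mmcv conventions;
# MIC's actual tools/train.py may differ, and its distributed path is untested.
import argparse

import torch
from mmcv.runner import init_dist


def parse_args():
    parser = argparse.ArgumentParser(description='Train a segmentor')
    parser.add_argument(
        '--launcher',
        choices=['none', 'pytorch', 'slurm', 'mpi'],
        default='none',
        help='job launcher')
    parser.add_argument('--local_rank', type=int, default=0)
    return parser.parse_args()


def main():
    args = parse_args()
    if args.launcher == 'none':
        # Single-GPU, non-distributed training (the setting MIC was trained in).
        distributed = False
    else:
        # Initialize the process group; each spawned process then handles one GPU.
        distributed = True
        init_dist(args.launcher, backend='nccl')
        rank = torch.distributed.get_rank()
        print(f'Distributed training initialized, rank {rank}')
    # ... the remaining pipeline would wrap the model in
    # MMDistributedDataParallel when `distributed` is True.


if __name__ == '__main__':
    main()
```

In mmcv-based repos, the pytorch launcher is usually started with something like `python -m torch.distributed.launch --nproc_per_node=<N> tools/train.py <CONFIG> --launcher pytorch`, but since the MIC authors did not use or test distributed training, this path may not work out of the box.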