EhabWilson opened 1 week ago
It is common to use a DataLoader with a DistributedSampler when training on multiple GPUs. So why doesn't `examples/simple_trainer.py` use a distributed sampler? Is there a reason for that?
I'm not very familiar with `DistributedSampler`. What's the benefit of using it?
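For context, here is a minimal sketch of what `DistributedSampler` provides: each rank is handed a disjoint shard of the dataset, so the GPUs don't train on overlapping samples. The sketch below instantiates the sampler with explicit `num_replicas`/`rank` values (so it runs without an actual process group); in real DDP training those default to the world size and current rank.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

dataset = TensorDataset(torch.arange(8))

# Simulate 2 GPUs: each rank's sampler yields a disjoint subset of indices.
# shuffle=False makes the split deterministic (rank r gets indices r, r+2, ...).
shards = []
for rank in range(2):
    sampler = DistributedSampler(dataset, num_replicas=2, rank=rank, shuffle=False)
    loader = DataLoader(dataset, batch_size=2, sampler=sampler)
    shards.append(sorted(int(x) for (batch,) in loader for x in batch))

print(shards)  # [[0, 2, 4, 6], [1, 3, 5, 7]]
```

With `shuffle=True` you would also call `sampler.set_epoch(epoch)` at the start of each epoch so every rank reshuffles with the same seed and the shards stay disjoint. Without a distributed sampler, every GPU iterates over the full dataset, which duplicates work rather than splitting it.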