Closed: justachetan closed this issue 2 months ago
Yes, all data goes to all GPUs. It's not really intentional; a `DistributedSampler` should be used, but I think the effect would be minor. I guess if you're sending the entire ScanNet dataset to each GPU it will be annoyingly heavy. Btw, I just used every 10th frame of ScanNet IIRC (it has been a while since I used the code).
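For reference, a minimal sketch of what switching to `DistributedSampler` could look like (the dataset and loader names here are placeholders, not the actual `roma_indoor.py` code):

```python
# Hypothetical sketch: sharding the dataset across GPUs with DistributedSampler
# instead of replicating it on every rank. Requires torch.distributed to be
# initialized (e.g. via torchrun) so the sampler can read rank/world_size.
import torch
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

dataset = TensorDataset(torch.arange(1000))  # stand-in for the ScanNet dataset

# Each rank gets a disjoint ~len(dataset)/world_size slice of the indices.
sampler = DistributedSampler(dataset, shuffle=True)
loader = DataLoader(dataset, batch_size=8, sampler=sampler, num_workers=4)

for epoch in range(10):
    sampler.set_epoch(epoch)  # reshuffle the shards each epoch
    for batch in loader:
        ...  # training step
```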
Thanks! I am not sure what happens when a non-distributed sampler is used with distributed training. I am assuming each replica of the dataloader gets a different seed, so the order of samples is different across devices?
I assume so. I never set any seeds, and I haven't observed issues relating to repeated samples.
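If you want the per-rank behavior to be explicit rather than relying on default seeding, one option (just a sketch, not something the repo does) is to give `WeightedRandomSampler` a per-rank generator:

```python
# Sketch under the assumption that each process knows its rank: seeding the
# sampler's generator per rank so GPUs draw different (possibly overlapping)
# sample streams. Unlike DistributedSampler, this does not shard the data.
import torch
import torch.distributed as dist
from torch.utils.data import WeightedRandomSampler

weights = torch.ones(1000)  # placeholder per-sample weights
rank = dist.get_rank() if dist.is_initialized() else 0

gen = torch.Generator()
gen.manual_seed(1234 + rank)  # a different random stream on each GPU

sampler = WeightedRandomSampler(weights, num_samples=len(weights),
                                replacement=True, generator=gen)
```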
Thank you!
Hi,
I was going through the training code in `experiments/roma_indoor.py` and it seems that you have used a non-distributed sampler (`WeightedRandomSampler`) instead of `DistributedSampler`. I believe this means that the entire dataset will be replicated and passed to each model replica instead of being sharded across them. I just wanted to confirm this and ask whether it is intentional. Thanks!