Closed: BruceYu-Bit closed this issue 3 years ago
On a single machine, I simply used model = torch.nn.DataParallel(model), and it works. But this doesn't scale very well.
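For reference, here is a minimal, self-contained sketch of that DataParallel setup; the toy model, optimizer, and dummy batch are placeholders for illustration, not code from this repository:

```python
# Sketch of single-machine multi-GPU training with torch.nn.DataParallel.
# The model, optimizer, and data below are stand-ins, not the repo's own code.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
if torch.cuda.device_count() > 1:
    # Replicates the module on every visible GPU and splits each batch across them.
    model = nn.DataParallel(model)
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()

# Dummy batch just to show the training step. Outputs and gradients are gathered
# on the default device, which is part of why this approach scales poorly.
x = torch.randn(8, 32, device=device)
y = torch.randint(0, 10, (8,), device=device)

optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
```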
Thanks! Now I have a new question: is there a pretrained model with a smaller patch size, like 4 or 8? Can you provide one?
About 8x8 patches, please see #3. Multi-GPU training should work out of the box with Slurm, but maybe you have another use case that does not work as is. I am closing this issue, but feel free to reopen it if you still run into training issues.
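For anyone looking for the multi-GPU path, below is a hedged sketch of how a Slurm-launched process can initialize torch.distributed and wrap the model in DistributedDataParallel. The Slurm environment variables are standard, but the entry point, model, and exact launch recipe used by this repository may differ:

```python
# Sketch: distributed setup for a job launched with Slurm (one process per GPU,
# e.g. srun with --ntasks-per-node equal to the GPU count). The placeholder model
# and the assumption that MASTER_ADDR/MASTER_PORT are exported are illustrative.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def init_distributed():
    # Slurm exports one process per task; map it to a global rank and a local GPU.
    rank = int(os.environ["SLURM_PROCID"])
    world_size = int(os.environ["SLURM_NTASKS"])
    local_rank = int(os.environ["SLURM_LOCALID"])

    torch.cuda.set_device(local_rank)
    dist.init_process_group(
        backend="nccl",
        init_method="env://",  # expects MASTER_ADDR / MASTER_PORT in the environment
        rank=rank,
        world_size=world_size,
    )
    return local_rank

if __name__ == "__main__":
    local_rank = init_distributed()
    model = torch.nn.Linear(32, 10).cuda(local_rank)  # placeholder model
    model = DDP(model, device_ids=[local_rank])
    # ...build a DistributedSampler-backed dataloader and train as usual...
```

Unlike DataParallel, this keeps one process per GPU and only synchronizes gradients, which is why it scales across machines.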
I wonder how to train with multiple GPUs.