Open tb2-sy opened 1 year ago
No special reason, I just didn't test it with multiple GPUs. If you look at the code, there is already some logic to handle distributed training (i.e., a DistributedSampler is created), but I haven't really tried it out myself (PRs for any fixes needed are welcome!)
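For anyone landing here: below is a minimal sketch of how the DistributedSampler logic mentioned above is typically wired into a DistributedDataParallel training loop. This is a generic example, not the repo's actual training script; the model, dataset, and hyperparameters are toy placeholders.

```python
# Hedged sketch, not the repo's actual training code: a minimal
# DistributedDataParallel loop showing how a DistributedSampler is
# typically used. Model and dataset here are toy placeholders.
import torch
import torch.distributed as dist
import torch.nn.functional as F
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset


def train(rank: int, world_size: int, epochs: int = 1) -> float:
    """Run a tiny DDP training loop; returns the final batch loss."""
    # torchrun sets MASTER_ADDR/MASTER_PORT/RANK/WORLD_SIZE for us;
    # gloo is the CPU fallback so the sketch also runs without GPUs.
    dist.init_process_group(
        backend="nccl" if torch.cuda.is_available() else "gloo",
        rank=rank,
        world_size=world_size,
    )

    # Toy data; the sampler partitions it across ranks without overlap.
    dataset = TensorDataset(torch.randn(64, 10), torch.randn(64, 1))
    sampler = DistributedSampler(dataset, num_replicas=world_size, rank=rank)
    loader = DataLoader(dataset, batch_size=8, sampler=sampler)

    model = torch.nn.Linear(10, 1)
    if torch.cuda.is_available():
        device = torch.device(f"cuda:{rank}")
        model = model.to(device)
        ddp_model = DDP(model, device_ids=[rank])
    else:
        device = torch.device("cpu")
        ddp_model = DDP(model)

    optimizer = torch.optim.SGD(ddp_model.parameters(), lr=0.01)
    loss = torch.tensor(0.0)
    for epoch in range(epochs):
        sampler.set_epoch(epoch)  # reshuffle differently each epoch
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            loss = F.mse_loss(ddp_model(x), y)
            optimizer.zero_grad()
            loss.backward()  # gradients are all-reduced across ranks here
            optimizer.step()

    dist.destroy_process_group()
    return loss.item()
```

You would launch one process per GPU with something like `torchrun --nproc_per_node=2 train.py`, reading `RANK` and `WORLD_SIZE` from the environment and passing them to `train`. Again, this is just a sketch of the standard pattern, so adapting it to this repo may need more changes.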
Thanks for your reply, but the code is too complicated for me. Could you please provide a multi-GPU training script? It is very important to me, thank you.
Hi, thank you for sharing the code. I noticed that the training code has no support for multi-GPU training, while evaluation can use multiple GPUs. Is there any special reason for this?