athn-nik / teach

Official PyTorch implementation of the paper "TEACH: Temporal Action Compositions for 3D Humans"
https://teach.is.tue.mpg.de

Training on multiple GPUs #11

Closed chinnusai25 closed 1 year ago

chinnusai25 commented 1 year ago

Hey, currently training runs on a single GPU. What changes are needed to train on multiple GPUs (say, 4)? (I'm fairly new to Hydra, so I'm not sure how to adapt this code to multiple GPUs.)

athn-nik commented 1 year ago

Hello, with the current setup not much should be needed: just set these flags in `trainer/base.yaml`: `devices=N`, `accelerator=gpu`, `strategy=ddp`. However, when I tried it, it didn't work out of the box; there were some synchronization issues with the metrics. I am trying to make it work for my next project. You can give it a try by setting those arguments and ping me back here so I can help you resolve any bugs you encounter. I don't plan to release a multi-GPU version for this, but I'm happy to help and to integrate any changes that would make multi-GPU training possible.
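For reference, the three flags mentioned above could look like this in `trainer/base.yaml`. This is a hypothetical sketch: only `devices`, `accelerator`, and `strategy` come from the comment, and the surrounding structure of the file is assumed to follow standard PyTorch Lightning `Trainer` conventions:

```yaml
# trainer/base.yaml (hypothetical fragment; rest of the file unchanged)
devices: 4          # N: number of GPUs to train on
accelerator: gpu    # run on GPU rather than CPU
strategy: ddp       # DistributedDataParallel across the devices
```

Since the repo uses Hydra, the same values can likely also be overridden from the command line without editing the file, e.g. `python train.py trainer.devices=4 trainer.accelerator=gpu trainer.strategy=ddp` (the exact config-group path is an assumption based on the file name above).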

athn-nik commented 1 year ago

I am closing this for now; feel free to re-open if you need any additional help.