Closed delfosseaurelien closed 3 years ago
Allow the use of torch multi-GPU for inference: use `nn.DataParallel` or Ray to run inference on multiple batches in parallel.
I believe `nn.DataParallel` is the recommended solution, given its native support for mixed precision: https://pytorch.org/docs/stable/notes/amp_examples.html#working-with-multiple-gpus
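A minimal sketch of what this could look like, using a hypothetical toy model (`nn.Linear`) as a stand-in for the real one. `nn.DataParallel` splits each batch across the visible GPUs and gathers the outputs; on a machine with one or zero GPUs it transparently runs the wrapped module as-is, so the same code works everywhere:

```python
import torch
import torch.nn as nn

# Hypothetical toy model standing in for the real inference model.
model = nn.Linear(16, 4)

# Wrap for multi-GPU inference; with <2 GPUs DataParallel is a no-op wrapper.
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
model.eval()

# One batch of 32 samples; DataParallel scatters it across devices.
batch = torch.randn(32, 16, device=device)
with torch.no_grad():
    out = model(batch)

print(out.shape)  # torch.Size([32, 4])
```

Note that upstream PyTorch now recommends `DistributedDataParallel` over `DataParallel` even for single-machine multi-GPU use, since `DataParallel` is single-process and bottlenecked by the GIL; Ray would sidestep that by sharding batches across independent worker processes instead.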