MIC-DKFZ / nnUNet

Apache License 2.0

Multi-GPU nnUNet training #2482

Open jhdezr1 opened 1 month ago

jhdezr1 commented 1 month ago

Hi there!

I am trying to train a 3d_fullres model, but the patch size is too small even though it maximizes the memory used on one of my GPUs. I would therefore like to try multi-GPU training, but I cannot find an argument in nnUNetv2_plan_and_preprocess to configure it that way.

I would appreciate some help!!

Thank you!

aymuos15 commented 1 month ago

https://github.com/MIC-DKFZ/nnUNet/blob/master/documentation/how_to_use_nnunet.md#using-multiple-gpus-for-training

You would need DDP, I guess: `nnUNetv2_train DATASET_NAME_OR_ID 2d 0 [--npz] -num_gpus X`

sten2lu commented 1 month ago

Hi @jhdezr1,

The simplest way to do so is to add a custom plans file and use these plans for training.

You can do so by writing a custom planner for this, which is detailed in the general information about plans files (see below).

Adding a custom plan to the plans.json file inside the nnUNet_preprocessed folder and training a model with these plans is demonstrated here: https://github.com/MIC-DKFZ/nnUNet/blob/master/documentation/competitions/AutoPETII.md

For a general explanation of the plans files, see here: https://github.com/MIC-DKFZ/nnUNet/blob/master/documentation/explanation_plans_files.md
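To make the plans-editing approach concrete, here is a minimal sketch of adding a custom configuration with a larger patch size to a plans dict. The surrounding keys, the configuration name `3d_fullres_bigpatch`, and the patch-size values are illustrative assumptions, not values copied from nnUNet; only the `inherits_from` mechanism is what the AutoPETII document describes.

```python
import json

# Toy stand-in for nnUNetPlans.json; real files have many more keys.
# All values here are illustrative assumptions.
plans = {
    "plans_name": "nnUNetPlans",
    "configurations": {
        "3d_fullres": {
            "patch_size": [128, 128, 128],
            "batch_size": 2,
        }
    },
}

# Add a custom configuration that inherits from 3d_fullres and only
# overrides the patch size (must still fit in GPU memory).
plans["configurations"]["3d_fullres_bigpatch"] = {
    "inherits_from": "3d_fullres",
    "patch_size": [160, 192, 192],  # example value, not a recommendation
}

# In practice you would write this back to the plans.json inside
# nnUNet_preprocessed, e.g.:
#   with open(path_to_plans, "w") as f:
#       json.dump(plans, f, indent=2)
print(json.dumps(plans["configurations"]["3d_fullres_bigpatch"]))
```

You would then train with the new configuration name, e.g. `nnUNetv2_train DATASET_ID 3d_fullres_bigpatch FOLD` (the configuration name here is hypothetical).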

Best regards, Carsten