jhdezr1 opened this issue 1 month ago
You would need DDP, I guess: `nnUNetv2_train DATASET_NAME_OR_ID 2d 0 [--npz] -num_gpus X`
Hi @jhdezr1, The simplest way to do so is by adding a custom plans file and using these plans for training.
You can do so by writing a custom planner for this, which is detailed in the general information about plans files (see below).
Adding a custom plan to the plans.json file inside the nnUNet_preprocessed folder and training a model with these plans is also demonstrated here: https://github.com/MIC-DKFZ/nnUNet/blob/master/documentation/competitions/AutoPETII.md
For a general explanation of the plans files, see here: https://github.com/MIC-DKFZ/nnUNet/blob/master/documentation/explanation_plans_files.md
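As a rough illustration of the approach described above, the snippet below adds a new configuration to a plans dictionary that inherits from `3d_fullres` but overrides the patch size. The configuration name `3d_fullres_bp`, the patch size `[192, 192, 192]`, and the stand-in plans dictionary are all illustrative choices, not nnU-Net defaults; in practice you would load and save your actual `nnUNetPlans.json` from the `nnUNet_preprocessed` folder.

```python
import json

def add_large_patch_config(plans: dict, name: str = "3d_fullres_bp",
                           patch_size=(192, 192, 192)) -> dict:
    """Add a configuration inheriting from 3d_fullres with a larger
    patch size. `name` and `patch_size` are hypothetical values --
    adjust them to what your GPUs (with DDP) can actually hold."""
    plans["configurations"][name] = {
        "inherits_from": "3d_fullres",
        "patch_size": list(patch_size),
    }
    return plans

# Minimal stand-in for a real nnUNetPlans.json (only the keys we touch):
plans = {"configurations": {"3d_fullres": {"patch_size": [128, 128, 128]}}}
plans = add_large_patch_config(plans)
print(json.dumps(plans["configurations"]["3d_fullres_bp"]))
```

After writing the modified plans back to `nnUNetPlans.json`, the new configuration name can then be passed to `nnUNetv2_train` in place of `3d_fullres`, combined with `-num_gpus` for DDP as mentioned above.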
Best regards, Carsten
Hi there!
I am trying to train a 3d_fullres model, but the patch size is too small even though the memory on one of the GPUs is fully used. Hence, I would like to try multi-GPU training, but I cannot find an argument in nnUNetv2_plan_and_preprocess to configure it that way.
I would appreciate some help!!
Thank you!