MIC-DKFZ / nnUNet

Apache License 2.0

nnUNet configuration. #2444

Open Malitha123 opened 2 months ago

Malitha123 commented 2 months ago

I've been working on MRI segmentation tasks using nnU-Net, and I've noticed that the standard configurations often utilize SGD as the optimizer. While I understand that the choice of optimizer and normalization technique can depend on the specific dataset, I'm curious if there's a significant difference in performance when using the Adam optimizer instead of SGD.
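For context, the comparison being asked about can be sketched in plain PyTorch. The SGD hyperparameters below mirror the commonly cited nnU-Net v2 defaults (lr=1e-2, momentum=0.99, Nesterov, weight decay 3e-5); the AdamW settings are purely illustrative, and you should verify the defaults against your installed nnU-Net version:

```python
# Sketch, assuming PyTorch is installed. The tiny Conv3d stands in for the
# actual nnU-Net network; only the optimizer construction matters here.
import torch
import torch.nn as nn

model = nn.Conv3d(1, 32, kernel_size=3, padding=1)

# nnU-Net-style default: SGD with Nesterov momentum (values as reported
# for nnU-Net v2; verify against your version).
sgd = torch.optim.SGD(
    model.parameters(), lr=1e-2, momentum=0.99,
    nesterov=True, weight_decay=3e-5,
)

# The alternative being asked about: Adam/AdamW (hyperparameters here are
# illustrative assumptions, not nnU-Net values).
adamw = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)
```

In nnU-Net v2 this swap is typically done by subclassing the trainer and overriding its optimizer setup rather than editing the library, so both runs stay comparable under the same schedule and augmentation.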

Similarly, I'd like to know if there's any observed difference in segmentation performance when using BatchNorm compared to InstanceNorm within nnU-Net.
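The practical difference between the two normalization layers can be shown with a short PyTorch sketch (tensor shapes here are illustrative). InstanceNorm normalizes each sample and channel independently, so it is insensitive to the very small batch sizes typical of 3D MRI patch training, whereas BatchNorm shares statistics across the batch:

```python
# Sketch, assuming PyTorch is installed. Shapes are (batch, channels, D, H, W);
# batch size 2 mimics the small batches common in 3D segmentation.
import torch
import torch.nn as nn

x = torch.randn(2, 8, 4, 4, 4)

inorm = nn.InstanceNorm3d(8, affine=True)  # nnU-Net's usual choice
bnorm = nn.BatchNorm3d(8, affine=True)     # batch-statistics alternative

yi = inorm(x)  # per-sample, per-channel statistics
yb = bnorm(x)  # statistics pooled over the batch dimension
```

With batch sizes of 1 or 2, BatchNorm's batch statistics become noisy, which is one commonly given reason nnU-Net defaults to InstanceNorm; how much this matters in practice is dataset-dependent.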

Any insights or comparative studies on these aspects would be greatly appreciated.

Lars-Kraemer commented 2 months ago

Hey @Malitha123,

As you said, both are heavily dependent on the specific dataset and what works best needs to be experimented with. If in doubt, however, I recommend going with the nnUNet defaults.

Best, Lars

Malitha123 commented 2 months ago

Thank you @Lars-Kraemer for the input. Does the same advice apply to AdamW compared to SGD and Adam?