Hello, thanks for the great work.
I am exploring self-supervised pretraining for nnU-Net. For that, I do encoder-only pretraining and then transfer the learned encoder weights into the full U-Net architecture for finetuning.
With this small adjustment, the workflow is quite simple.
I follow the nnU-Net guide for pretraining and finetuning. For the pretraining configuration, I edit the plans.json file and change the network architecture to the PlainConvEncoder. With the kwargs that I add with this PR, the additional decoder configuration (only used for finetuning) is simply ignored, so no further plans file editing is needed.
When finished, I plan to open a PR on nnU-Net for the self-supervised learning, if you are interested in that :)
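To illustrate the plans file edit mentioned above, the architecture block might look roughly like this afterwards (the class path and the kwargs values shown here are illustrative, not copied from a real plans file; check your own generated plans for the exact names):

```json
"architecture": {
    "network_class_name": "dynamic_network_architectures.building_blocks.plain_conv_encoder.PlainConvEncoder",
    "arch_kwargs": {
        "n_stages": 6,
        "features_per_stage": [32, 64, 128, 256, 320, 320],
        "conv_op": "torch.nn.modules.conv.Conv3d",
        "kernel_sizes": [[3, 3, 3], [3, 3, 3], [3, 3, 3], [3, 3, 3], [3, 3, 3], [3, 3, 3]],
        "strides": [[1, 1, 1], [2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2]],
        "n_conv_per_stage": [2, 2, 2, 2, 2, 2]
    }
}
```

With the kwargs from this PR, any decoder-only entries left in the block would simply be ignored by the encoder.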
With nnU-Net and its plans files, everything is nicely configurable. The additional kwargs allow me to quickly initialize only the encoder in the nnUNetTrainers, which gives me the framework for self-supervised pretraining: training the feature extractor (the encoder) on pseudo-supervised tasks.
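The weight transfer itself is just prefix matching on the state dict keys. Here is a minimal, hypothetical sketch (the `transfer_encoder_weights` helper and the `encoder.` prefix are my assumptions; the prefix matches how PlainConvUNet nests its encoder, but verify against your checkpoint). Plain dicts stand in for tensors so the idea is visible without torch:

```python
def transfer_encoder_weights(pretrained_sd, unet_sd, prefix="encoder."):
    """Copy pretrained encoder weights into a full U-Net state dict.

    pretrained_sd: state dict saved from a bare encoder (keys like 'stages.0....')
    unet_sd:       state dict of the full U-Net, whose encoder keys carry `prefix`
    Returns the merged state dict and the list of keys that were transferred;
    decoder entries are left untouched.
    """
    merged = dict(unet_sd)
    transferred = []
    for key, value in pretrained_sd.items():
        target = prefix + key
        if target in merged:
            merged[target] = value
            transferred.append(target)
    return merged, transferred

# toy example: strings stand in for weight tensors
pretrained = {"stages.0.conv.weight": "w0", "stages.1.conv.weight": "w1"}
unet = {
    "encoder.stages.0.conv.weight": None,
    "encoder.stages.1.conv.weight": None,
    "decoder.stages.0.conv.weight": "d0",
}
merged, moved = transfer_encoder_weights(pretrained, unet)
```

With real checkpoints you would apply the same key mapping and then call `load_state_dict` on the full network.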
Let me know what you think of this!