gaudot / SlicerDentalSegmentator

3D Slicer extension for fully-automatic segmentation of CT and CBCT dental volumes.
Apache License 2.0

Request for Assets to Perform Transfer Learning Using nnU-Net #15

Closed: drbreathe closed this issue 1 week ago

drbreathe commented 1 month ago

Dear Team,

Thank you for making your work open-source. The community, including myself, has greatly benefited from your contributions.

I am currently using nnU-Net and am interested in applying transfer learning to my target dataset. After reviewing the nnU-Net pretraining and fine-tuning procedure, I realized that I would need more than just the model weights to successfully perform this task.

Could you please provide the following assets to facilitate the transfer learning process?

1. Plans of the Pretraining Dataset: The nnU-Net plans file associated with the pretraining dataset. This file should include the network topology, patch size, batch size, and normalization schemes used during training, so that I can check compatibility with my target dataset (a short read-out sketch follows this list).

2. Configuration Details: The nnU-Net configuration (e.g. 3d_fullres) and trainer used to produce the released weights, so that fine-tuning can start from the same setup.

3. Pretraining Dataset Fingerprint (optional): If available, the fingerprint of the pretraining dataset would be helpful in understanding any dataset-specific customizations applied during training.
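To make item 1 concrete, here is the kind of read-out I have in mind. This is only a sketch: the field names follow the nnU-Net v2 `nnUNetPlans.json` layout as I understand it, and the path is a placeholder.

```python
import json

# Placeholder path to the plans file I am requesting.
PLANS_PATH = "release_assets/nnUNetPlans.json"

with open(PLANS_PATH) as f:
    plans = json.load(f)

# Field names assumed from the nnU-Net v2 plans layout; they may differ
# if the weights were produced with another nnU-Net version.
print("dataset:", plans.get("dataset_name"))
for name, cfg in plans.get("configurations", {}).items():
    print(f"configuration: {name}")
    print("  patch size:   ", cfg.get("patch_size"))
    print("  batch size:   ", cfg.get("batch_size"))
    print("  spacing:      ", cfg.get("spacing"))
    print("  normalization:", cfg.get("normalization_schemes"))
```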

I appreciate your time and assistance in this matter. These assets would greatly help me, and the wider community, make effective use of your network weights. If I am wrong in my approach, please correct me; the workflow I have in mind is sketched below.
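For context, this is the fine-tuning route I plan to follow once the plans file is available, based on my reading of the nnU-Net v2 documentation referenced at the end of this message. The dataset ID, plans identifier, configuration, and checkpoint path are placeholders, and I may well be missing a step.

```python
import subprocess

TARGET_DATASET = "222"                       # placeholder ID of my target dataset
PLANS_NAME = "nnUNetPlans_pretrain"          # placeholder identifier for the shared plans
CHECKPOINT = "weights/checkpoint_final.pth"  # placeholder path to the released weights

def run(cmd):
    """Run an nnU-Net v2 CLI command and stop on the first error."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Extract the fingerprint of my target dataset.
run(["nnUNetv2_extract_fingerprint", "-d", TARGET_DATASET])

# 2. Preprocess the target dataset with the shared plans. I assume the shared
#    plans file has been copied into the target dataset's nnUNet_preprocessed
#    folder under the name PLANS_NAME beforehand.
run(["nnUNetv2_preprocess", "-d", TARGET_DATASET, "-plans_name", PLANS_NAME])

# 3. Fine-tune fold 0 of the 3d_fullres configuration, initializing from the
#    released weights instead of training from scratch.
run(["nnUNetv2_train", TARGET_DATASET, "3d_fullres", "0",
     "-p", PLANS_NAME, "-pretrained_weights", CHECKPOINT])
```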

Thank you once again for your contribution to the community.

Best regards, Sankarsan

Reference: https://github.com/MIC-DKFZ/nnUNet/blob/master/documentation/pretraining_and_finetuning.md

GauthierDot commented 1 month ago

Dear @drbreathe

Thank you for your interest. The plans and dataset JSON files can be found in the dataset zip file shared in our repository release. This should be enough to perform fine-tuning, as we did not apply any custom training or augmentation.

Please let us know how it goes and share your results with us!

Best regards, Gauthier