MIC-DKFZ / nnUNet


Generating Scribble annotations #2124

Open GivralNguyen opened 2 months ago

GivralNguyen commented 2 months ago

Hi, I recently read your new work "Embarrassingly Simple Scribble Supervision for 3D Medical Segmentation". I saw that the partial loss is already implemented in nnUNet (https://github.com/MIC-DKFZ/nnUNet/blob/master/documentation/ignore_label.md). Do you also have the code for generating the scribbles from dense reference segmentations? Thanks for your great work!

FabianIsensee commented 2 months ago

Hey, we are planning to release this code upon acceptance of the paper. I am also tagging @Karol-G here so that he is aware of this issue.
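
Until the official code is released, here is a minimal sketch of one possible way to derive scribble-like annotations from a dense reference segmentation. This is only an illustration, not the method from the paper: the per-slice skeletonization and the function name `dense_to_scribbles` are assumptions.

```python
import numpy as np
from skimage.morphology import skeletonize  # scikit-image

def dense_to_scribbles(dense_seg: np.ndarray, ignore_label: int) -> np.ndarray:
    """Turn a dense 3D label map into sparse scribble-like annotations.

    Every voxel not covered by a scribble is set to `ignore_label`, so the
    partial (ignore-label) loss in nnU-Net skips it during training.
    Illustrative approximation only: per-slice skeletons of each class
    (including background) are kept as 'scribbles'.
    """
    scribbles = np.full_like(dense_seg, fill_value=ignore_label)
    for z in range(dense_seg.shape[0]):          # iterate over axial slices
        slice_seg = dense_seg[z]
        for cls in np.unique(slice_seg):         # background (0) and all foreground classes
            mask = slice_seg == cls
            skel = skeletonize(mask)             # thin, connected 'scribble' inside the region
            scribbles[z][skel] = cls
    return scribbles
```

The actual scribble generation in the paper may work differently (e.g. separate interior and border scribbles); this only shows the general idea of keeping a thin subset of each labeled region and marking everything else as ignore.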

GivralNguyen commented 2 months ago

Thank you for your answer. Looking forward to your paper's acceptance! One more question: can I train on data that has been densely labeled together with data that is sparsely labeled? I.e., we already have some densely labeled data, we want to sparsely label some additional data, and then train the model on both types of data. As long as I change dataset.json to include the ignore label, it should be fine, right?

Karol-G commented 2 months ago

Most likely yes, but of course only the final test set performance will show :) Do not forget to also replace the background in the sparsely labeled data with the ignore label, and to additionally provide some sparse labels for the background (in case that is not already the case).
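
As a concrete illustration of that background swap, here is a minimal sketch. The class names and label values are made up for the example; per the ignore_label documentation, the ignore label should be the highest label value in dataset.json.

```python
import numpy as np

# Hypothetical label layout for a sparsely annotated case.
# In dataset.json this could correspond to, e.g.:
#   "labels": {"background": 0, "organ": 1, "tumor": 2, "ignore": 3}
# where "ignore" is the highest label value.
BACKGROUND = 0
IGNORE = 3

def sparsify_background(label_map: np.ndarray, background_scribble: np.ndarray) -> np.ndarray:
    """Replace unlabeled background with the ignore label, but keep some
    explicit background scribbles so the model still sees negative examples.

    `background_scribble` is a boolean mask marking voxels that should stay
    annotated as background (e.g. hand-drawn background scribbles).
    """
    out = label_map.copy()
    unlabeled_background = (label_map == BACKGROUND) & ~background_scribble
    out[unlabeled_background] = IGNORE
    return out
```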

FabianIsensee commented 2 months ago

From a technical perspective this should work. From a practical perspective, one could think about how training data is sampled. Currently we sample cases randomly, regardless of how densely they are labeled. This means that scribbled cases will have the same importance in training as densely labeled cases. Whether that is a good thing or not, I don't know.
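
nnU-Net itself does not expose such a weighting, but conceptually one could oversample densely labeled cases along these lines. This is a standalone sketch, not part of the framework; the case names, flags, and weight values are arbitrary.

```python
import numpy as np

# Hypothetical list of training cases with a flag for dense vs. scribble labels.
cases = ["case_000", "case_001", "case_002", "case_003"]
is_dense = np.array([True, False, False, True])

# Give densely labeled cases more weight when drawing training samples.
weights = np.where(is_dense, 2.0, 1.0)
weights = weights / weights.sum()

rng = np.random.default_rng(0)
sampled = rng.choice(cases, size=8, replace=True, p=weights)
print(sampled)
```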

GivralNguyen commented 1 month ago

We will test it out and let you guys know how it goes.

FabianIsensee commented 1 month ago

How did it go?