Open JackRio opened 3 weeks ago
I think this has to do with the number of voxels available for training per class. Class 4 has the highest number of voxels in my dataset, so the model may simply be learning class 4 best. I also realized the patch size was too big for the model to learn any foreground class, so I reduced it from 384x512 to 128x128. At first the model started learning all classes, but after about 50 epochs it collapsed to predicting only class 4.
Any suggestions on what I should do in this case?
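To quantify the imbalance rather than eyeball it, it can help to count voxels per class across the label masks before training. Below is a minimal sketch, assuming the labels are already loaded as integer NumPy arrays (the function name `class_voxel_counts` and the synthetic masks are illustrative, not part of nnUNet):

```python
import numpy as np

def class_voxel_counts(label_arrays, num_classes):
    """Count voxels per class across a list of integer label masks."""
    counts = np.zeros(num_classes, dtype=np.int64)
    for lab in label_arrays:
        counts += np.bincount(lab.ravel(), minlength=num_classes)
    return counts

# Synthetic example: class 4 dominates the foreground,
# mimicking the imbalance described above.
rng = np.random.default_rng(0)
masks = [rng.choice(6, size=(128, 128), p=[0.7, 0.02, 0.02, 0.02, 0.22, 0.02])
         for _ in range(4)]
counts = class_voxel_counts(masks, num_classes=6)
print(dict(enumerate(counts.tolist())))
```

If one class dwarfs the others by orders of magnitude, that alone can explain the collapse, and it points toward remedies like stronger foreground oversampling or a class-weighted loss.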
I am training a 2D nnUNet on medical imaging data with 6 class labels. The dataset consists of CT scans, and the labels are segmentation masks for calcium in different arteries. I cross-checked my data and labels in the preprocessed directory and ran a debugger to confirm that the
train_step
function receives data with labels from the other classes during training. I cannot figure out what exactly is happening, given what the pseudo-Dice looks like: I trained one fold for the full 1000 epochs and the model is still only learning class 4. I am aware the pseudo-Dice is computed on patches rather than the entire dataset, but the other classes stay at 0 throughout. Let me know if more information is needed. I will debug a bit more and see if there is anything wrong somewhere.

Note: I am using out-of-the-box nnUNet for now and haven't changed anything.