perone / medicaltorch

A medical imaging framework for Pytorch
http://medicaltorch.readthedocs.io
Apache License 2.0
844 stars 126 forks

SCGM example in tutorial #12

Open · udion opened 5 years ago

udion commented 5 years ago

Hi,

I think there might be a flaw in the uploaded example; please correct me if I'm wrong.

According to the SCGM challenge, we have to segment the GREY MATTER. Here is a snapshot of what the training data and labels look like (this is ITK-SNAP, a viewer for medical images; I have loaded site1-sc01-image.nii.gz from the training set with the corresponding mask site1-sc01-mask-r1.nii.gz):

[image: ITK-SNAP screenshot of the scan with the rater mask overlaid]

My understanding is that the grey matter is the red-labelled structure in the image above, so my trained model should take a slice (or group of slices) as input and output a mask that highlights only the red region. However, after a few epochs (25 epochs) of running the code given in the example, we get something like:

[images: input1, gt1, pred1]

My doubt is: why is the ground-truth label a white blob? Shouldn't it be very narrow and fine like the red structure in the ITK-SNAP image above (and hence shouldn't the prediction also be something very fine, not a blob-like structure)?

What is being considered as ground truth in the code?

Please clarify.

Thank you so much for this amazing effort!!

perone commented 5 years ago

Hi @udion, thanks for the feedback. I pre-processed the dataset beforehand to remove the WM mask from the labels and completely forgot to mention it. I'll soon add a filter to medicaltorch to do that and avoid the pre-processing step, but meanwhile you can just remove the WM mask from the label files and leave only the GM mask.
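If it helps, something like this can strip the WM label with nibabel. This is my own sketch, not medicaltorch code, and it assumes GM voxels are encoded as value 1 in the rater masks; verify the actual label values in your own files first.

```python
import numpy as np

# Assumption: GM voxels are encoded as value 1 in the rater masks;
# check the actual label values in your own files before using this.
GM_VALUE = 1

def extract_gm(label_array, gm_value=GM_VALUE):
    """Return a mask keeping only the grey-matter voxels."""
    return (label_array == gm_value).astype(label_array.dtype)

def strip_wm(in_path, out_path, gm_value=GM_VALUE):
    """Load a NIfTI label file, drop everything but GM, and save the result."""
    import nibabel as nib  # pip install nibabel
    img = nib.load(in_path)
    gm = extract_gm(np.asanyarray(img.dataobj), gm_value)
    nib.save(nib.Nifti1Image(gm, img.affine, img.header), out_path)
```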

udion commented 5 years ago

@perone

I see that all the scans consist of 3 SLICES (am I right?)

And if that is the case, does slice_axis=2 mean that it will only train and test using axis 2?

And if so, does that mean the example given in the documentation ignores the other 2 slices, and hence we may have to train other models with different slice axes (or a different model that processes the 3 slices together)?

[image]

perone commented 5 years ago

Hi @udion, slice_axis=2 means that it will slice the volume along dimension 2, in this case the axial dimension. It doesn't mean it ignores the other axes, just that it uses one axis for training. There are many training approaches, including the use of 3D kernels; however, the model I made available is a 2D slice-wise model.
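To illustrate what slicing along axis 2 means, here is a plain NumPy sketch (just an illustration, not the medicaltorch internals):

```python
import numpy as np

# A toy volume shaped (H, W, n_axial_slices), like one SCGM scan.
vol = np.random.rand(200, 200, 12)

# slice_axis=2: iterate over dimension 2, yielding one 2D slice per index.
slices = [np.take(vol, i, axis=2) for i in range(vol.shape[2])]
assert len(slices) == 12
assert slices[0].shape == (200, 200)
```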

udion commented 5 years ago

@perone

A few questions about the example code:

1) The original NIfTI scans and labels have the shape 512 x 512 x 12, and you are using mt_transforms.CenterCrop2D((200, 200)). This means the tensor will only consider a 200 x 200 region of the frame, right? If so, this assumes that in every frame the required region lies inside the central 200 x 200 window, right?

2) Since there are 4 labels (one per rater) available for every scan, I am assuming you are using the mean label to train the network?

PS: I was able to get reasonable GM output once I separated the WM and GM labels.
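For reference, the effect of a center crop like CenterCrop2D((200, 200)) can be sketched in plain NumPy (my own sketch of the idea, not the medicaltorch implementation):

```python
import numpy as np

def center_crop_2d(img, size=(200, 200)):
    """Crop the central size[0] x size[1] window from a 2D array."""
    h, w = img.shape
    th, tw = size
    top = (h - th) // 2
    left = (w - tw) // 2
    return img[top:top + th, left:left + tw]

slice_2d = np.arange(512 * 512).reshape(512, 512)
cropped = center_crop_2d(slice_2d)
assert cropped.shape == (200, 200)
# Anything outside the central window is discarded, hence the assumption
# that the structure of interest lies within it.
```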

udion commented 5 years ago

@perone

I wanted to customize this for my own problem (segmentation of a medical RGB image dataset; it's not MRI/CT). I was able to change the dataloader and model appropriately, and my model is training. The task is still the same in that the model should output a binary mask, so I am using the same Dice-loss setup that you have in the example, but my training and validation Dice losses are negative.

Any clues/pointers what might be wrong?

perone commented 5 years ago

Hi @udion, regarding the questions about the cropping and the 4 labels per rater, take a look at the Nature Scientific Reports article "Spinal cord gray matter segmentation using deep dilated convolutions", where we describe our steps and the details about it.

Regarding your validation Dice being negative: it is supposed to be like that. Take a look at this line here; we minimize the negative Dice. Always plot some samples of the predictions to check as well, since it's easier to see what is going on in the network.
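For context, here is a minimal negative soft-Dice loss in PyTorch (a sketch of the idea, not necessarily the exact code in the example):

```python
import torch

def dice_loss(pred, target, eps=1e-8):
    """Negative soft Dice: minimizing this maximizes overlap, so the
    reported loss is negative whenever prediction and target overlap."""
    pred = pred.reshape(pred.size(0), -1)
    target = target.reshape(target.size(0), -1)
    inter = (pred * target).sum(dim=1)
    dice = (2 * inter + eps) / (pred.sum(dim=1) + target.sum(dim=1) + eps)
    return -dice.mean()

# Perfect overlap drives the loss toward -1, which is the minimum.
pred = torch.tensor([[1.0, 0.0, 1.0, 0.0]])
target = torch.tensor([[1.0, 0.0, 1.0, 0.0]])
loss = dice_loss(pred, target)
assert loss.item() < -0.99
```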