NifTK / NiftyNet

[unmaintained] An open-source convolutional neural networks platform for research in medical image analysis and image-guided therapy
http://niftynet.io
Apache License 2.0

Brats demo not training successfully #78

Closed: ba1441 closed this issue 6 years ago

ba1441 commented 6 years ago

I'm trying to train the BRATS network from scratch by following the demo here: https://cmiclab.cs.ucl.ac.uk/CMIC/NiftyNet/tree/dev/demos/BRATS17

I have the BRATS 2017 data, obtained from the official source. The only changes I made to the default files are the paths (path_to_search, model_dir and histogram_ref_file) in the train_whole_tumor_sagittal.ini file; the rest of the configuration is unchanged.
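Concretely, the three edits amount to entries like the following. The paths are placeholders for my setup, and the section names follow NiftyNet's usual segmentation config layout (the demo's .ini is the authoritative reference):

```ini
# Placeholder paths; section layout as in NiftyNet segmentation configs
[modality1]
path_to_search = /data/BRATS17_preprocessed

[SYSTEM]
model_dir = /models/whole_tumor_sagittal

[NETWORK]
histogram_ref_file = /models/whole_tumor_sagittal/standardisation_models.txt
```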

I preprocessed the data using the rename_crop_BRATS.py script that is included in the demo. Again, the only modification was pointing the path variable at my data.
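(For anyone unfamiliar with that step: judging by its name, the script renames the modality files and crops each volume, presumably trimming the empty background around the brain. A minimal sketch of the cropping idea, not the demo script itself, using nibabel and numpy with hypothetical filenames:)

```python
import numpy as np
import nibabel as nib

def crop_to_nonzero(in_path, out_path):
    """Crop a volume to the bounding box of its nonzero voxels.

    Illustration only -- the demo's rename_crop_BRATS.py is the
    authoritative implementation; paths here are hypothetical.
    """
    img = nib.load(in_path)
    data = img.get_fdata()
    nz = np.argwhere(data != 0)                 # coordinates of nonzero voxels
    lo, hi = nz.min(axis=0), nz.max(axis=0) + 1  # bounding box corners
    cropped = data[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]]
    # Affine kept as-is for simplicity; a faithful crop would shift the origin.
    nib.save(nib.Nifti1Image(cropped, img.affine), out_path)

crop_to_nonzero('Brats17_xxx_t1.nii.gz', 'cropped_t1.nii.gz')
```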

I trained the model for 20,000 iterations using the train command specified on the demo page. Then I graphed the Dice loss for the training set, and this is what I got.

[Figure: training Dice loss curve over 20,000 iterations]
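For reference, the train command on the demo page is of this form (the config path is whatever your local checkout uses):

```bash
python net_segment.py train -c demos/BRATS17/train_whole_tumor_sagittal.ini
```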

I've made sure that I haven't modified any code from the demo, so I'm unsure what is causing this issue. Thanks

wyli commented 6 years ago

Hi @ba1441, could you please follow the model zoo version instead? https://cmiclab.cs.ucl.ac.uk/CMIC/NiftyNetExampleServer/blob/master/anisotropic_nets_brats_challenge_model_zoo.md (You can still use the dataset you've prepared; just change the config and command to the model zoo ones, so it should be straightforward.)

Also, the training loss curve can be noisy, since it is computed from stochastic image-window samples without smoothing. Have you checked the volume predictions?
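(One quick way to see the trend through that sampling noise is a running mean over the per-iteration losses; a sketch, assuming the values have been exported from the TensorBoard logs into a list:)

```python
import numpy as np

def running_mean(losses, window=200):
    """Smooth a noisy per-iteration loss curve with a moving average.

    Each iteration's loss comes from a randomly sampled image window,
    so the raw curve jumps around; the smoothed curve shows whether
    training is actually making progress.
    """
    kernel = np.ones(window) / window
    return np.convolve(losses, kernel, mode='valid')

# losses = [...]  # per-iteration Dice loss, e.g. from the event files
# smoothed = running_mean(losses)
```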

ba1441 commented 6 years ago

After making use of the average_volume script in the model zoo, I can see that the model is actually training successfully! I was simply misinterpreting the training loss, since I wasn't accounting for the window-based sampling. Thanks a lot for your time and your help.
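For anyone landing here with the same confusion: the decisive check is on whole predicted volumes rather than sampled windows. A minimal sketch of a whole-tumor Dice score between a prediction and the ground truth (filenames hypothetical, labels treated as binary):

```python
import numpy as np
import nibabel as nib

def dice(pred_path, truth_path):
    """Whole-tumor Dice between a binarised prediction and ground truth."""
    pred = nib.load(pred_path).get_fdata() > 0
    truth = nib.load(truth_path).get_fdata() > 0
    overlap = np.logical_and(pred, truth).sum()
    return 2.0 * overlap / (pred.sum() + truth.sum())

print(dice('seg_output.nii.gz', 'Brats17_xxx_seg.nii.gz'))
```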