Project-MONAI / tutorials

MONAI Tutorials
https://monai.io/started.html
Apache License 2.0

Explicit for-loop optimisation or SupervisedTrainer #16

Open rijobro opened 4 years ago

rijobro commented 4 years ago

It seems that in the majority of tutorials, the optimisation for-loop is written out explicitly. The SupervisedTrainer, which exists precisely to encapsulate this loop, is used in relatively few places.
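
For reference, a minimal sketch of the explicit-loop pattern being discussed (the `train_loader` and the `UNet` hyperparameters here are placeholders, and exact argument names may differ slightly between MONAI versions):

```python
import torch
from monai.losses import DiceLoss
from monai.networks.nets import UNet

# placeholder setup: `train_loader` is assumed to yield dict batches with "image" and "label" keys
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = UNet(
    spatial_dims=3, in_channels=1, out_channels=2,
    channels=(16, 32, 64, 128), strides=(2, 2, 2),
).to(device)
loss_function = DiceLoss(to_onehot_y=True, softmax=True)
optimizer = torch.optim.Adam(model.parameters(), 1e-4)
max_epochs = 5

# the explicit optimisation loop repeated across many tutorials
for epoch in range(max_epochs):
    model.train()
    for batch_data in train_loader:
        inputs = batch_data["image"].to(device)
        labels = batch_data["label"].to(device)
        optimizer.zero_grad()
        outputs = model(inputs)
        loss = loss_function(outputs, labels)
        loss.backward()
        optimizer.step()
```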

I can see why the explicit for-loop is beneficial for tutorials: it makes readers more aware of the inner workings. However, for the sake of conciseness, I would be in favour of having just one suitably named notebook in which the explicit for-loop is given, and from then on using the SupervisedTrainer. Notebooks using the SupervisedTrainer could then refer back to the explicit notebook.
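
The same training run expressed with the SupervisedTrainer would be roughly as follows (reusing the placeholder objects from the sketch above; keyword arguments may vary by MONAI version):

```python
from monai.engines import SupervisedTrainer

# the default prepare_batch expects dict batches with "image" and "label" keys
trainer = SupervisedTrainer(
    device=device,
    max_epochs=max_epochs,
    train_data_loader=train_loader,
    network=model,
    optimizer=optimizer,
    loss_function=loss_function,
)
trainer.run()
```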

I think @ericspod is in favour of leaving the notebooks as they are, so as not to hide anything (which I understand). Anyone else have an opinion?

rijobro commented 4 years ago

Example of a notebook with an explicit for-loop: https://github.com/Project-MONAI/Tutorials/blob/master/brats_segmentation_3d.ipynb
Example of a notebook with the SupervisedTrainer: https://github.com/Project-MONAI/Tutorials/blob/master/models_ensemble.ipynb

wyli commented 4 years ago

Thanks for bringing this up! Earlier we had feedback from users saying that they didn't want to spend time learning yet another framework (referring to the monai.engines and monai.handlers mechanism), so we created those explicit for-loop examples to show that many MONAI components work well in a 'vanilla' PyTorch script. Now we might have too many such examples. Perhaps we could address this after properly rearranging the examples (https://github.com/Project-MONAI/Tutorials/issues/2). What do you think @Nic-Ma?
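
For illustration, a possible sketch of the handler mechanism in question, attaching logging and checkpointing to the trainer from the sketch above (the handler arguments shown are assumptions and may differ between MONAI versions):

```python
from monai.handlers import CheckpointSaver, StatsHandler, from_engine

# print per-iteration/per-epoch loss statistics to stdout
StatsHandler(
    tag_name="train_loss",
    output_transform=from_engine(["loss"], first=True),
).attach(trainer)
# save the final network weights under ./runs
CheckpointSaver(
    save_dir="./runs", save_dict={"net": model}, save_final=True
).attach(trainer)

trainer.run()
```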

Nic-Ma commented 4 years ago

Sounds reasonable! We should rearrange the examples first, then discuss the tutorials. Thanks.