neuropoly / idea-projects

Ideas for cool projects

Consider adding ensembling to existing models #13

Closed: joshuacwnewton closed this issue 1 year ago

joshuacwnewton commented 1 year ago

NB: This issue has been migrated from the spinalcordtoolbox repository: https://github.com/spinalcordtoolbox/spinalcordtoolbox/issues/1893

As per original author @perone:

(I'm adding this idea for the future, just so we don't forget about it, since it involves almost all of the machine learning models we use.)

There are several ways to improve Machine Learning/Deep Learning results by trading off performance (resource usage), and ensembling is one of them. I think we should consider adding an option to use ensembling at inference time as an optional parameter, so that users without a performance requirement (which is common, for instance, when you only want to segment a few volumes) can get better results by waiting a little longer for inference. This requires training more models and shipping them as well, so it would in turn increase the download time of the models during installation.
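To make the proposal concrete, here is a minimal sketch of what inference-time ensembling could look like in PyTorch, assuming each ensemble member is a separately trained segmentation model saved as a full checkpoint. The `model_seed*.pt` filenames, tensor shapes, and helper function are illustrative placeholders, not actual SCT code.

```python
import torch

def ensemble_segment(volume: torch.Tensor, model_paths: list[str]) -> torch.Tensor:
    """Average sigmoid probabilities from several independently trained models.

    volume:      input image tensor, e.g. shape (1, 1, D, H, W) (assumed)
    model_paths: checkpoints of the ensemble members (hypothetical filenames)
    """
    prob_sum = None
    with torch.no_grad():
        for path in model_paths:
            # Assumes each member was saved whole with torch.save(model, path)
            model = torch.load(path)
            model.eval()
            probs = torch.sigmoid(model(volume))
            prob_sum = probs if prob_sum is None else prob_sum + probs
    mean_prob = prob_sum / len(model_paths)   # ensemble = mean of member outputs
    return (mean_prob > 0.5).float()          # binarize to a segmentation mask

# Trades N forward passes (more inference time) for a more robust segmentation:
# seg = ensemble_segment(img, ["model_seed1.pt", "model_seed2.pt", "model_seed3.pt"])
```

The extra cost is one forward pass per ensemble member plus the additional checkpoints to download, which is exactly the performance/accuracy trade-off described above.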

joshuacwnewton commented 1 year ago

I've created this issue from a (very old) issue in SCT's repo, simply because the idea involves retraining models, which feels like a good candidate for "internship project" ideas.

But, it's possible we may just want to shelve the idea entirely? It was listed as a low-priority issue anyway. :thinking:

jcohenadad commented 1 year ago

But, it's possible we may just want to shelve the idea entirely? It was listed as a low-priority issue anyway.

agreed-- let's shelve it