BritishGeologicalSurvey / CoreScore

Assessing core fragmentation from BGS Core Store photography
GNU Lesser General Public License v3.0
6 stars 1 fork

Add parameter optimization #6

Open mobiuscreek opened 3 years ago

mobiuscreek commented 3 years ago

Libraries like talos add an automated hyperparameter optimization layer. We should implement this (or a similar library) and add it to the training workflow.
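To make the scope concrete, here is a minimal sketch of the search loop such a library automates (with smarter sampling, pruning, and result tracking on top). The parameter names and ranges are illustrative placeholders, not CoreScore's actual training options, and `evaluate` stands in for a real training run:

```python
import random

# Hypothetical parameter space in the dict-of-lists style talos uses;
# names and values are illustrative, not taken from CoreScore.
param_space = {
    "batch_size": [16, 32, 64],
    "learning_rate": [1e-2, 1e-3, 1e-4],
    "dropout": [0.1, 0.25, 0.5],
}

def sample(space):
    """Draw one random configuration from the space."""
    return {name: random.choice(values) for name, values in space.items()}

def evaluate(params):
    # Stand-in for a training run that returns a validation loss;
    # a real implementation would train the model with these params.
    return params["learning_rate"] * 10 + params["dropout"]

# The loop a hyperparameter library automates: sample, evaluate, keep the best.
best = min((sample(param_space) for _ in range(20)), key=evaluate)
print("best configuration:", best)
```

A library replaces the random `sample` with an informed strategy (e.g. Bayesian optimization) and logs every trial, which is the main value over hand-rolling this loop.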

metazool commented 3 years ago

It would be very good to make time to explore this properly. I recall we were looking into https://github.com/hyperopt/hyperopt for parameter tuning during the Building Stones project. Talos looks quite Keras-specific? This [fastai forums thread](https://forums.fast.ai/t/what-is-the-de-facto-library-to-use-as-black-box-for-hyperparameter-tuning/44338/7) about recommended libraries mentions hyperopt and links to some complementary approaches, including https://github.com/dragonfly/dragonfly, which could potentially reduce compute waste.

And we must remember to cover as many angles of [model testing](https://www.jeremyjordan.me/testing-ml/) as we can before moving on to this. I suppose how we do this will affect how we do #5 as well - it may be better to supply a source-controlled config file rather than keep expanding the command-line arguments?
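A rough sketch of the config-file idea, assuming a JSON file checked into the repo and merged over defaults; the keys and the `train.json` filename are hypothetical, not CoreScore's actual options:

```python
import json
from pathlib import Path

# Illustrative defaults; a real set would mirror the current CLI arguments.
DEFAULTS = {"epochs": 10, "batch_size": 32, "learning_rate": 1e-3}

def load_config(path):
    """Merge a source-controlled JSON config file over the defaults."""
    config = dict(DEFAULTS)
    config.update(json.loads(Path(path).read_text()))
    return config

# Simulate a checked-in config that overrides one default.
Path("train.json").write_text(json.dumps({"epochs": 25}))
config = load_config("train.json")
print(config)
```

Keeping defaults in code and overrides in the file means a diff on `train.json` documents exactly what changed between experiments, which is harder to reconstruct from shell history of CLI flags.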

mobiuscreek commented 3 years ago

Yes, parsing a config file will be better for source control. One other potential optimization library is optuna: https://github.com/optuna/optuna - but we should have better test coverage before we move on to this.
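For reference, optuna is built around an `objective(trial)` callback where the trial object suggests each parameter. The sketch below uses a hypothetical stdlib stand-in for the trial so it runs without the dependency; with optuna installed, an objective written this way could be passed to `study.optimize()`:

```python
import random

class Trial:
    """Hypothetical stand-in mimicking optuna's Trial suggestion API."""

    def suggest_float(self, name, low, high):
        return random.uniform(low, high)

    def suggest_int(self, name, low, high):
        return random.randint(low, high)

def objective(trial):
    # In the real workflow these would parameterise model training
    # and the return value would be a validation metric to minimise.
    lr = trial.suggest_float("learning_rate", 1e-4, 1e-1)
    depth = trial.suggest_int("depth", 2, 8)
    return (lr - 0.01) ** 2 + (depth - 4) ** 2  # toy loss surface

# Crude random search over 50 trials; optuna would sample adaptively.
best = min(objective(Trial()) for _ in range(50))
print(f"best loss over 50 trials: {best:.4f}")
```

One attraction of this interface is that the search space lives inside the objective, so adding a parameter does not change the surrounding workflow code.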