mobiuscreek opened this issue 3 years ago
It would be very good to make time to explore this properly. I recall we were looking into https://github.com/hyperopt/hyperopt for parameter tuning during the Building Stones project. Talos looks quite Keras-specific? This thread on the [fastai forums](https://forums.fast.ai/t/what-is-the-de-facto-library-to-use-as-black-box-for-hyperparameter-tuning/44338/7) about recommended libraries mentions hyperopt and links to some complementary approaches, including https://github.com/dragonfly/dragonfly, which could potentially reduce compute waste.
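For reference, hyperopt would plug into our training code roughly like this. A minimal sketch: `train_and_evaluate` is a hypothetical stand-in for our real training entry point (here a dummy quadratic so the snippet runs end to end), and the search space is invented for illustration:

```python
from hyperopt import fmin, tpe, hp, Trials

def train_and_evaluate(learning_rate, batch_size):
    # Placeholder for our real training run; returns a validation loss.
    # A dummy quadratic keeps the sketch self-contained.
    return (learning_rate - 0.01) ** 2 + (batch_size - 32) ** 2 * 1e-4

# Hypothetical search space; the real one would come from our model's knobs.
space = {
    "learning_rate": hp.loguniform("learning_rate", -10, -2),
    "batch_size": hp.choice("batch_size", [16, 32, 64]),
}

def objective(params):
    # hyperopt minimises the returned value, so we return validation loss.
    return train_and_evaluate(**params)

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=50, trials=trials)
print(best)
```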
And we must remember to cover as many angles of [model testing](https://www.jeremyjordan.me/testing-ml/) as we can before moving on to this. I suppose how we do this will affect how we do #5 as well - it might start to be better to supply a source-controlled config file rather than expand the command-line arguments?
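To illustrate the config-file idea, a minimal sketch (the YAML keys and flags below are made up; it assumes a `config.yml` sits next to the script): YAML values act as source-controlled defaults, and explicit command-line flags still win:

```python
import argparse
import yaml  # pyyaml

# Example config.yml, tracked in git alongside the code:
#   learning_rate: 0.01
#   batch_size: 32

def parse_args():
    parser = argparse.ArgumentParser()
    parser.add_argument("--config", default="config.yml",
                        help="path to a source-controlled YAML config")
    parser.add_argument("--learning-rate", type=float, dest="learning_rate")
    parser.add_argument("--batch-size", type=int, dest="batch_size")
    args = parser.parse_args()

    # YAML values fill in anything the CLI left unset.
    with open(args.config) as f:
        for key, value in (yaml.safe_load(f) or {}).items():
            if getattr(args, key, None) is None:
                setattr(args, key, value)
    return args

if __name__ == "__main__":
    print(vars(parse_args()))
```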
Yes, parsing a config file will be better for source control. Another potential optimization library is optuna: https://github.com/optuna/optuna, but we should have better test coverage before we move on to this.
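For comparison, optuna uses a define-by-run style where the search space is declared inside the objective. A rough sketch (again, `train_and_evaluate` is a hypothetical stand-in for our real training entry point):

```python
import optuna

def train_and_evaluate(learning_rate, batch_size):
    # Stand-in for the real training run; returns a validation loss.
    return (learning_rate - 0.01) ** 2 + (batch_size - 32) ** 2 * 1e-4

def objective(trial):
    # The search space is declared inline as the trial runs.
    learning_rate = trial.suggest_float("learning_rate", 1e-5, 1e-1, log=True)
    batch_size = trial.suggest_categorical("batch_size", [16, 32, 64])
    return train_and_evaluate(learning_rate, batch_size)

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
print(study.best_params)
```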
Libraries like talos add an automated hyperparameter optimization layer. We should integrate this (or a similar library) into the training workflow.
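For the record, talos drives a Keras model-building function over a parameter grid along these lines. A rough sketch, not a drop-in implementation: the toy data, tiny model, and parameter grid are invented for illustration.

```python
import numpy as np
import talos
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

# Toy data so the sketch is self-contained.
x = np.random.rand(200, 8)
y = (x.sum(axis=1) > 4).astype(int)

def build_model(x_train, y_train, x_val, y_val, params):
    # talos calls this once per parameter combination and expects
    # the fit history and the model back.
    model = Sequential([
        Dense(params["first_neuron"], input_dim=8,
              activation=params["activation"]),
        Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    history = model.fit(x_train, y_train,
                        validation_data=(x_val, y_val),
                        batch_size=params["batch_size"],
                        epochs=params["epochs"], verbose=0)
    return history, model

# Invented grid; talos tries every combination.
params = {
    "first_neuron": [8, 16, 32],
    "activation": ["relu", "elu"],
    "batch_size": [16, 32],
    "epochs": [10],
}

scan = talos.Scan(x=x, y=y, params=params, model=build_model,
                  experiment_name="demo")
```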