autonomio / talos

Hyperparameter Experiments with TensorFlow and Keras
https://autonom.io
MIT License

Abstract interface to optimisation (suggestion) #45

Closed awhillas closed 5 years ago

awhillas commented 6 years ago

Have you guys looked at Bayesian Optimisation? There is already a library you could integrate with https://github.com/shibuiwilliam/keras_gpyopt

I assume you're just doing grid search, but if you abstracted the optimiser out into an interface, so one could choose genetic algorithms or Bayesian optimisation or X, Talos could become the Keras of hyperparameter tuning :)
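To make the suggestion concrete, here is a minimal sketch of what such an abstraction could look like. This is purely illustrative and not Talos's actual API; the `Suggester` and `RandomSuggester` names are hypothetical.

```python
# Hypothetical sketch of an abstract optimiser interface; not Talos's real API.
import random
from abc import ABC, abstractmethod

class Suggester(ABC):
    """Given the results observed so far, suggest the next configuration."""

    @abstractmethod
    def suggest(self, history):
        """history: list of (params_dict, objective_value) tuples.
        Returns a params_dict for the next trial."""

class RandomSuggester(Suggester):
    """Baseline strategy: sample each parameter uniformly from its choices.
    A Bayesian or genetic strategy would plug in behind the same interface."""

    def __init__(self, space, seed=0):
        self.space = space            # e.g. {'batch_size': [32, 64, 128]}
        self.rng = random.Random(seed)

    def suggest(self, history):
        return {k: self.rng.choice(v) for k, v in self.space.items()}
```

With this shape, the search loop only ever calls `suggest()`, so swapping in GPyOpt or an evolutionary strategy would not touch the rest of the workflow.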

mikkokotila commented 6 years ago

Thanks a lot @awhillas. Your thinking sits very well with the roadmap. I will look into this and see how it could best be integrated. Implementing GPyOpt into a Keras workflow looks straightforward.

matthewcarbone commented 6 years ago

@awhillas That looks really promising. Thank you for sharing this! By all means if you have any other insights or suggestions we'd be happy to hear. 👍

mikkokotila commented 6 years ago

I think it would generally be very useful to create a list of options that take as input the experiment log format, where we have an objective metric together with a hyperparameter configuration of n parameters. The current architecture makes it very easy to (within set intervals) analyze what is already known as fact, use that to optimize the remainder of the experiment, and keep doing so as the experiment progresses.
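The experiment log described here can be pictured as rows pairing an objective metric with a parameter configuration. A minimal sketch (the column names and values below are made up for illustration):

```python
# Illustrative only: the real Talos log is a CSV with one row per permutation.
import csv
import io

log_csv = """val_acc,batch_size,lr
0.81,32,0.001
0.84,64,0.001
0.78,32,0.01
0.88,64,0.01
"""

rows = list(csv.DictReader(io.StringIO(log_csv)))

# "What is already known as fact": the best configuration observed so far,
# which an optimizer can exploit to steer the rest of the experiment.
best = max(rows, key=lambda r: float(r["val_acc"]))
print(best["batch_size"], best["lr"])  # prints: 64 0.01
```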

This is connected with #40

matthewcarbone commented 6 years ago

@mikkokotila I think I'm going to look into this soon. This might be a reasonable intermediate step between what we have now and evolutionary algorithms. From what I know, those usually require some heavy firepower hardware-wise. @awhillas, I am not so familiar with Bayesian optimization. Is it less expensive? Any input is welcome!

mikkokotila commented 5 years ago

To add some context here: we have several ways to "reduce" the permutation space.

When the mechanism for optimization kicks in, some arbitrary process (i.e. some optimization function) identifies a set of ids, and those ids are then either kept or dropped in the permutation space.

That is the design pattern of Talos optimization. So anything that can feed into that process can be plugged in seamlessly.
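The keep/drop pattern described above can be sketched as follows. This is a simplified illustration of the idea, not Talos source code; the function and variable names are hypothetical.

```python
# Hedged sketch of the "keep or drop ids" reduction pattern described above.
import itertools

space = {'batch_size': [32, 64, 128], 'lr': [0.001, 0.01, 0.1]}
keys = list(space)

# Enumerate the permutation space, one id per parameter configuration.
permutations = {i: dict(zip(keys, vals))
                for i, vals in enumerate(itertools.product(*space.values()))}

def drop_ids(permutations, should_drop):
    """Any optimization function can plug in here: it only has to name
    the ids to drop; the surrounding machinery stays the same."""
    return {i: p for i, p in permutations.items() if not should_drop(p)}

# e.g. an optimizer decides large learning rates look unpromising:
remaining = drop_ids(permutations, lambda p: p['lr'] == 0.1)
print(len(permutations), '->', len(remaining))  # prints: 9 -> 6
```

Because the reduction step only consumes and emits ids, a grid heuristic, a Bayesian model, or a genetic algorithm can all feed into it interchangeably.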

reinert commented 5 years ago

Hi, I'm interested in Bayesian Optimisation. Any plans to integrate it here?

mikkokotila commented 5 years ago

v0.6 will introduce a new architecture for optimization strategies, where a new optimizer can be introduced in a single file. The criterion is that it must accept as input the results (the csv that is updated after each permutation) and as output it must return a parameter label (e.g. 'batch_size') and a parameter value (e.g. '128').
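A sketch in the spirit of that contract might look like the following. This is not the actual v0.6 implementation; `simple_reducer`, the column names, and the scoring rule are all made up to illustrate the input/output shape.

```python
# Hypothetical single-file reducer matching the described contract:
# input = results csv, output = (parameter_label, parameter_value).
import csv
import io
from collections import defaultdict

def simple_reducer(results_csv, metric='val_acc'):
    """Score each (parameter, value) pair by its mean objective so far and
    return the worst-performing one as a candidate to drop from the space."""
    rows = list(csv.DictReader(io.StringIO(results_csv)))
    params = [c for c in rows[0] if c != metric]
    scores = defaultdict(list)
    for r in rows:
        for p in params:
            scores[(p, r[p])].append(float(r[metric]))
    label, value = min(scores, key=lambda k: sum(scores[k]) / len(scores[k]))
    return label, value

results = """val_acc,batch_size,lr
0.81,32,0.001
0.84,64,0.001
0.78,32,0.01
0.88,64,0.01
"""
print(simple_reducer(results))  # prints: ('batch_size', '32')
```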

reinert commented 5 years ago

Can you post a concrete example? I'm already using the not-yet-documented 0.6 branch.