snpc94 closed this issue 7 years ago
Hi, MTBO is actually implemented in RoBO. Check out https://github.com/automl/RoBO/blob/master/robo/fmin/mtbo.py
cheers Aaron
Thank you for your reply!
I have another question: does this repository implement stochastic gradient descent?
No, this repository implements Bayesian optimization for black-box optimization, where no gradient information is available. For implementations of stochastic gradient methods, have a look at the Lasagne, Keras, or TensorFlow packages.
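For reference, the distinction matters because stochastic gradient descent needs per-sample gradients of the objective, which black-box optimizers never see. A minimal SGD loop can be sketched in plain NumPy without any of those libraries (the toy linear-regression data and learning rate below are made up purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2*x + 1 plus a little noise (illustrative only)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = 2.0 * X[:, 0] + 1.0 + 0.01 * rng.normal(size=200)

w, b = 0.0, 0.0   # parameters of the linear model y ~ w*x + b
lr = 0.1          # learning rate (step size)

for epoch in range(100):
    for i in rng.permutation(len(X)):      # one random sample per step
        err = (w * X[i, 0] + b) - y[i]     # residual of this single sample
        # Gradient of the squared error for this one sample
        w -= lr * err * X[i, 0]
        b -= lr * err

# After training, (w, b) should be close to the true (2, 1)
```

The key point is the inner loop: each update uses the gradient from a single (randomly chosen) data point, which is exactly the information a black-box optimizer like RoBO does not have access to.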
I see. I'll try those packages.
I want to track "accuracy" to judge the optimization, but I don't think accuracy is implemented anywhere. Where is it implemented? If it isn't implemented yet, could you give me an outline of how to implement it?
I am not sure what you mean by accuracy. The validation accuracy of the machine learning algorithm that you want to optimize? RoBO returns the best found configuration / function value after each iteration, and you can use that to compare different optimizers. Maybe have a look at the following overview paper to get a better understanding of how Bayesian optimization works: https://www.cs.ox.ac.uk/people/nando.defreitas/publications/BayesOptLoop.pdf
cheers Aaron
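To make the comparison idea concrete: the standard plot records the incumbent, i.e. the best function value found so far, after every iteration, and compares those trajectories across optimizers. A self-contained sketch using random search on a toy objective (the objective, bounds, and budget are made up for illustration, not RoBO's API):

```python
import numpy as np

def objective(x):
    """Toy black-box function to minimize (illustrative only)."""
    return (x - 0.3) ** 2

def random_search(n_iters, seed):
    """Return the incumbent (best value seen so far) after each iteration."""
    rng = np.random.default_rng(seed)
    best = np.inf
    incumbents = []
    for _ in range(n_iters):
        best = min(best, objective(rng.uniform(0.0, 1.0)))
        incumbents.append(best)
    return incumbents

traj = random_search(50, seed=0)

# The incumbent trajectory never increases; plotting such trajectories
# (e.g. incumbent vs. iteration) is how optimizers are usually compared.
assert all(a >= b for a, b in zip(traj, traj[1:]))
```

With RoBO you would use the best configuration / function value it reports per iteration in place of the `incumbents` list above.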
Nice to meet you. I'm a student in the Department of Computer Science and Systems Engineering at Kobe University. I can't write English very well, so please forgive any grammatical mistakes, wrong word choices, and so on.
I'm trying to use this repository for research on multi-task Bayesian optimization. I want to reproduce the logistic-regression experiment from "Multi-Task Bayesian Optimization" (Swersky et al., NIPS 2013). But I don't think this repository implements a multi-task Gaussian process, so I don't know which model to use for this experiment. Could you tell me where a multi-task Gaussian process is implemented, and how to use it?
I also don't know which acquisition function is correct, "information_gain.py" or "information_gain_per_unit_cost.py", for the single-task and multi-task cases described in the paper. Could you explain that as well?
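For background on the multi-task GP asked about above: Swersky et al. use a product kernel in the style of the intrinsic model of coregularization, where the covariance between observations (x, t) and (x', t') factorizes into an input kernel times a task covariance, K((x, t), (x', t')) = K_task[t, t'] * k_x(x, x'). A minimal NumPy sketch of that construction (the function names, toy data, and task covariance below are illustrative assumptions, not RoBO's API):

```python
import numpy as np

def rbf(X1, X2, lengthscale=0.5):
    """Squared-exponential kernel over the inputs."""
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def multitask_kernel(X, tasks, K_task, lengthscale=0.5):
    """ICM-style kernel: K((x,t),(x',t')) = K_task[t,t'] * k_x(x,x')."""
    Kx = rbf(X, X, lengthscale)
    Kt = K_task[np.ix_(tasks, tasks)]  # expand task covariance per data point
    return Kt * Kx                     # elementwise (Hadamard) product

# Toy setup: 4 points split across 2 tasks (illustrative only)
X = np.array([[0.1], [0.4], [0.2], [0.9]])
tasks = np.array([0, 0, 1, 1])             # task index of each point
L = np.array([[1.0, 0.0], [0.8, 0.6]])     # Cholesky factor of task covariance
K_task = L @ L.T                           # parametrization keeps it PSD

K = multitask_kernel(X, tasks, K_task)
# A valid GP covariance must be symmetric positive semi-definite
assert np.allclose(K, K.T)
assert np.all(np.linalg.eigvalsh(K) > -1e-8)
```

Parametrizing the task covariance through its Cholesky factor, as in the paper, guarantees it stays positive semi-definite while its entries are learned; the off-diagonal entries then encode how strongly the tasks are correlated.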