HIPS / Spearmint

Spearmint Bayesian optimization codebase

How is optimization of acquisition function over integer/categorical parameters done? #63

Open iaroslav-ai opened 8 years ago

iaroslav-ai commented 8 years ago

Do you restrict the search over the acquisition function to feasible (discrete) values only? Since gradient descent is not obvious to apply in this case, I would imagine one could use a combination of gradient descent and some smart enumeration of the discrete parameters. If not, do you do some sort of relaxation over the discrete variables? I would greatly appreciate a short message on how specifically you optimize, so that I can understand what to expect in terms of scalability when I have many discrete variables.
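For what it's worth, here is a minimal sketch of the two strategies the question contrasts, using a toy acquisition function in place of a real one (e.g. expected improvement under a fitted GP). This is an assumed illustration, not Spearmint's actual code: strategy (a) enumerates the discrete values and runs a gradient-based optimizer over the continuous dimensions for each, while strategy (b) relaxes the integer dimension to a continuous one and rounds the result.

```python
import numpy as np
from scipy.optimize import minimize

def acquisition(x_cont, k):
    # Toy acquisition over one continuous variable x in [0, 1] and one
    # integer k in {0, 1, 2}; stands in for a real acquisition function.
    return (x_cont[0] - 0.3 * k) ** 2 + 0.1 * k

def optimize_by_enumeration(bounds, k_values):
    # Strategy (a): one continuous gradient-based optimization per
    # discrete value, then keep the best (x, k) pair found.
    best = None
    for k in k_values:
        res = minimize(acquisition, x0=np.array([0.5]), args=(k,), bounds=bounds)
        if best is None or res.fun < best[2]:
            best = (res.x, k, res.fun)
    return best

def optimize_by_relaxation(bounds, k_lo, k_hi):
    # Strategy (b): treat k as continuous, optimize jointly, then round
    # to the nearest integer. Cheap, but rounding can miss the optimum.
    def relaxed(z):
        return acquisition(z[:1], z[1])
    res = minimize(relaxed, x0=np.array([0.5, 1.0]),
                   bounds=list(bounds) + [(k_lo, k_hi)])
    x, k = res.x[:1], int(round(res.x[1]))
    return x, k, acquisition(x, k)

x, k, val = optimize_by_enumeration([(0.0, 1.0)], [0, 1, 2])
print(x, k, val)  # best point is near x = 0.0 with k = 0
```

Note that enumeration answers the scalability concern directly: its cost grows with the product of the discrete domains' sizes, i.e. exponentially in the number of discrete variables, which is why relaxation-style approaches become attractive when there are many of them.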

heejincs commented 8 years ago

I was wondering about the same problem. Is there an example of how to do BO with discrete features? Did you figure it out? It would be great if anyone could help. Thank you.

ghost commented 7 years ago

@heejincs Any luck figuring this out?

jferstad commented 6 years ago

I'd also like to know!

danielhernandezlobato commented 6 years ago

All variables are mapped to the [0,1] box. Then, these parameters are:
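Mapping every parameter into the unit box lets the acquisition optimizer work over a single continuous space. One plausible encoding of integer and categorical parameters into [0,1] and back (a hypothetical illustration consistent with that description, not Spearmint's actual source) is:

```python
def encode_int(value, lo, hi):
    # Map an integer in [lo, hi] to a coordinate in the [0, 1] box.
    return (value - lo) / float(hi - lo)

def decode_int(u, lo, hi):
    # Map a unit-box coordinate back to the nearest feasible integer.
    return int(round(lo + u * (hi - lo)))

def decode_categorical(u, categories):
    # Partition [0, 1] into equal-width bins, one per category, and
    # return the category whose bin contains u.
    idx = min(int(u * len(categories)), len(categories) - 1)
    return categories[idx]

print(decode_int(encode_int(7, 0, 10), 0, 10))              # 7
print(decode_categorical(0.8, ["adam", "sgd", "rmsprop"]))  # rmsprop
```

With such a mapping the acquisition function is effectively piecewise constant along the discrete dimensions, so a continuous optimizer still works but its gradients carry no information in those directions; this is one reason enumeration or rounding heuristics are layered on top.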