The bayesmark package is another wrapper library for hyperparameter tuning. We can add it to our benchmarking suite. Per their documentation, the builtin optimizers are wrappers on the following projects:
HyperOpt
Nevergrad
OpenTuner
PySOT
Scikit-optimize
See https://github.com/uber/bayesmark/ for details.
We already benchmark against HyperOpt. Note that OpenTuner is an older package, developed at MIT in 2014.
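To make the comparison concrete: per my reading of the bayesmark README, it benchmarks optimizers through a suggest/observe interface (its `AbstractOptimizer` base class). The sketch below is a toy, self-contained random-search optimizer and driver loop illustrating that pattern only; the class and function names here are illustrative and are not bayesmark's actual API, and the code does not import bayesmark.

```python
import random

# Toy sketch of the suggest/observe pattern bayesmark uses to benchmark
# optimizers. RandomSearchOptimizer and run_loop are illustrative names,
# not bayesmark APIs.

class RandomSearchOptimizer:
    """Minimal random-search optimizer with suggest/observe methods."""

    def __init__(self, bounds, seed=0):
        self.bounds = bounds          # dict: param name -> (low, high)
        self.rng = random.Random(seed)
        self.history = []             # list of (params, objective) pairs

    def suggest(self, n_suggestions=1):
        # Propose n_suggestions random points inside the bounds.
        return [
            {k: self.rng.uniform(lo, hi) for k, (lo, hi) in self.bounds.items()}
            for _ in range(n_suggestions)
        ]

    def observe(self, X, y):
        # Record evaluated points; a model-based optimizer would also
        # update its surrogate model here.
        self.history.extend(zip(X, y))


def run_loop(opt, objective, n_iters=20):
    # The benchmark driver alternates suggest and observe, then reports
    # the best observed point.
    for _ in range(n_iters):
        X = opt.suggest(n_suggestions=2)
        y = [objective(x) for x in X]
        opt.observe(X, y)
    return min(opt.history, key=lambda pair: pair[1])


opt = RandomSearchOptimizer({"x": (-5.0, 5.0)}, seed=42)
best_params, best_val = run_loop(opt, lambda p: p["x"] ** 2)
print(best_params, best_val)
```

Any of the wrapped libraries above (HyperOpt, Nevergrad, PySOT, etc.) can be plugged into a loop like this by adapting its native API to the suggest/observe shape, which is what bayesmark's builtin wrappers do.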