mdabros / SharpLearning

Machine learning for C# .Net

0.31.7.0: Refactor BayesianOptimizer and add parallel computation #128

Closed. mdabros closed this 5 years ago

mdabros commented 5 years ago

This pull request refactors the BayesianOptimizer implementation to use the same principles as the SMACOptimizer. The two optimizers are both model-based optimizers and should therefore be very similar in implementation. The BayesianOptimizer can be viewed as a basic implementation of model-based optimization, on top of which the SMACOptimizer adds a few tricks. A base class for model-based optimizers seems to be the next logical step, but that will follow in a later pull request.

The refactoring enables use of the BayesianOptimizer in an "open loop" style just like the SMACOptimizer. See the unit tests for an example.
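For readers who have not used the open-loop style before, here is a minimal sketch. It assumes the refactored BayesianOptimizer exposes the same `ProposeParameterSets(parameterSetCount, previousResults)` method as the SmacOptimizer; the objective and all parameter choices below are illustrative, so treat the unit tests in this PR as the authoritative usage.

```csharp
using System;
using System.Collections.Generic;
using SharpLearning.Optimization;

class OpenLoopExample
{
    static void Main()
    {
        // One continuous parameter to optimize over [0, 100].
        var parameters = new IParameterSpec[]
        {
            new MinMaxParameterSpec(min: 0.0, max: 100.0),
        };

        var optimizer = new BayesianOptimizer(parameters, iterations: 30);

        // Open loop: ask the optimizer for candidate parameter sets,
        // evaluate them ourselves, and feed the results back in.
        // ProposeParameterSets is assumed to mirror SmacOptimizer's API.
        var previousResults = new List<OptimizerResult>();
        for (int i = 0; i < 10; i++)
        {
            var candidates = optimizer.ProposeParameterSets(
                parameterSetCount: 3, previousResults: previousResults);

            foreach (var parameterSet in candidates)
            {
                // Toy objective: minimize (x - 42)^2.
                var error = Math.Pow(parameterSet[0] - 42.0, 2);
                previousResults.Add(new OptimizerResult(parameterSet, error));
            }
        }
    }
}
```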

This pull request also adds the option of parallel computation to the BayesianOptimizer. This work was originally added in #119.

Note that when running in parallel and using the Optimize(Func<double[], OptimizerResult> functionToMinimize) method, the order of the results will not be reproducible. The individual results will remain the same, but their order will vary between runs.

I recommend only using the parallel version when the provided functionToMinimize runs its computation serially and is slow to compute.
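Since result ordering is not stable across parallel runs, a simple workaround is to sort the returned results by error before consuming them. The sketch below does that; the runParallel and maxDegreeOfParallelism parameter names are assumptions based on the work in #119, so check the released constructor signature before relying on them.

```csharp
using System;
using System.Linq;
using SharpLearning.Optimization;

class ParallelExample
{
    static void Main()
    {
        var parameters = new IParameterSpec[]
        {
            new MinMaxParameterSpec(min: 0.0, max: 100.0),
        };

        var optimizer = new BayesianOptimizer(parameters,
            iterations: 30,
            runParallel: true,           // assumed parameter name
            maxDegreeOfParallelism: 4);  // assumed parameter name

        // A slow, serially computed objective is where parallel
        // evaluation pays off. Toy objective: minimize (x - 42)^2.
        Func<double[], OptimizerResult> minimize = p =>
        {
            System.Threading.Thread.Sleep(100); // simulate slow work
            return new OptimizerResult(p, Math.Pow(p[0] - 42.0, 2));
        };

        // The order of results varies between parallel runs, so sort
        // by error instead of relying on positional order.
        var results = optimizer.Optimize(minimize)
            .OrderBy(r => r.Error)
            .ToArray();

        Console.WriteLine($"Best error: {results.First().Error}");
    }
}
```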

mdabros commented 5 years ago

@jameschch This releases the parallel version of the BayesianOptimizer. I have refactored the code quite a lot to make it more similar to the SMACOptimizer, which uses the same principles and has a cleaner implementation.

I ended up cutting the allowMultipleEvaluations feature since I still cannot see a good use case for it. In any case, I need some more convincing before I will consider adding it to a release :-).