mdabros / SharpLearning

Machine learning for C# .Net
MIT License

Adds parallelism to Bayesian optimizer by default and adds support for non-deterministic algorithms (release postponed, so no version increment) #119

Closed · jameschch closed this 5 years ago

jameschch commented 5 years ago

I've had a second look at the Bayesian optimizer and have introduced parallelism by default, as with the other optimizers. I've also introduced support for non-deterministic algorithms that may return different results for identical parameters. This required a change to the standard serial behaviour: instead of skipping an evaluation only when the parameters did not change from the previous run, the optimizer now stores the results of all evaluations and skips any parameter set that has been evaluated before. This should improve performance in the serial case, at the cost of some additional memory.
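
For illustration, here is a minimal C# sketch of the general idea described above: cache the result of every evaluated parameter set and skip repeats, while evaluating new candidates in parallel. This is not SharpLearning's actual implementation; the `ObjectiveCache` class and its members are hypothetical names used only for this sketch.

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading.Tasks;

public static class ObjectiveCache
{
    // Results of previously evaluated parameter sets, keyed by the parameter values.
    static readonly ConcurrentDictionary<string, double> s_cache =
        new ConcurrentDictionary<string, double>();

    // Builds a stable key from a candidate parameter set.
    static string Key(double[] parameters) =>
        string.Join(";", parameters.Select(p => p.ToString("R")));

    // Evaluates all candidates in parallel. A parameter set that has already
    // been evaluated is looked up in the cache instead of being re-run,
    // trading some memory for fewer objective-function evaluations.
    public static double[] Evaluate(double[][] candidates, Func<double[], double> objective)
    {
        var results = new double[candidates.Length];
        Parallel.For(0, candidates.Length, i =>
        {
            results[i] = s_cache.GetOrAdd(Key(candidates[i]), _ => objective(candidates[i]));
        });
        return results;
    }
}
```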

mdabros commented 5 years ago

@jameschch Thanks for the contribution. I will do some local testing of the added parallelism and release the new version during the weekend.