malihamashkoor123 opened this issue 1 year ago
Hi @malihamashkoor123,
Can you tell us more about your use case where you think Bayesian Optimization (BO) would help over grid search?
This is something that @jreps and I have been discussing, but mostly for deep learning models, where we can't afford to search through all combinations because it is so computationally expensive.
This is something I'm definitely interested in, but adding it could be nontrivial. Currently the enumeration of the hyperparameter search space is done entirely up front in each modelSettings function, using either grid search or random search. That wouldn't work for Bayesian optimization, where each new candidate depends on the results of the previous evaluations, so the main training loop would need to be refactored to accommodate it (see the sketch below).
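To make that refactoring concrete, here is a minimal sketch, not actual PLP code: `evaluateSettings()` and `proposeNext()` are hypothetical stand-ins for cross-validated model fitting and a surrogate-model proposal step.

```r
# Hypothetical stand-ins for illustration only
evaluateSettings <- function(params) {
  runif(1, 0.6, 0.8)  # stand-in for the CV AUC of a model fit with `params`
}
proposeNext <- function(history) {
  # a real BO step would fit a surrogate (e.g. a Gaussian process) to
  # `history` and maximize an acquisition function; here we just sample
  data.frame(ntrees = sample(c(100, 500), 1), maxDepth = sample(4:17, 1))
}

# Current approach: the whole search space is enumerated up front
paramGrid <- expand.grid(ntrees = c(100, 500), maxDepth = c(4, 10, 17))
gridScores <- vapply(
  seq_len(nrow(paramGrid)),
  function(i) evaluateSettings(paramGrid[i, ]),
  numeric(1)
)

# BO instead interleaves proposing and evaluating, so the training loop
# has to feed each result back into the search before the next candidate
history <- list()
for (i in seq_len(20)) {
  candidate <- proposeNext(history)
  history[[i]] <- list(params = candidate, score = evaluateSettings(candidate))
}
```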
There is also the question of which specific algorithm to use: is there a clear state-of-the-art one? Are there accessible implementations available through R packages? Implementing such an algorithm from scratch is also non-trivial.
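For what it's worth, a few R packages do implement Gaussian-process-based BO, e.g. rBayesianOptimization, ParBayesianOptimization, and mlrMBO. Below is a hedged sketch using rBayesianOptimization to tune a plain random forest on iris; it is illustrative only and not wired into PLP (in PLP the objective would instead fit the candidate model on the train folds and return the cross-validation AUC, which is where the training-loop refactoring comes in).

```r
library(rBayesianOptimization)
library(randomForest)

data(iris)
x <- iris[, 1:4]
y <- iris$Species

# Objective: out-of-bag accuracy of a random forest for given hyperparameters.
# rBayesianOptimization expects a list with Score (to maximize) and Pred.
rfObjective <- function(mtry, nodesize) {
  fit <- randomForest(x, y, mtry = mtry, nodesize = nodesize, ntree = 200)
  oobAcc <- 1 - tail(fit$err.rate[, "OOB"], 1)
  list(Score = oobAcc, Pred = 0)
}

set.seed(42)
result <- BayesianOptimization(
  rfObjective,
  # "L" suffix marks integer-valued hyperparameters
  bounds = list(mtry = c(1L, 4L), nodesize = c(1L, 10L)),
  init_points = 5,  # random evaluations to seed the Gaussian process
  n_iter = 10,      # BO iterations after the initial design
  acq = "ucb"
)
result$Best_Par
```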
If you have any experience using BO, I'd be interested in hearing about it.
Hi @egillax,
Thank you for your reply.
I don't have any experience with BO. However, since BO is an informed search method and is generally preferred over grid search, I was mainly curious how it could be implemented alongside the existing models (e.g. Random Forest) in the PLP pipeline, and whether it would make any difference to model performance.
Additionally, considering the PLP pipeline and the complexity of BO (compared to grid and random search), I also believe that implementing BO from scratch in the PLP pipeline would be a better option than using the available R packages.
Is there a way to implement Bayesian Optimization for hyperparameter tuning in the (R-based) PLP pipeline?