experimental-design / bofire

Experimental design and (multi-objective) Bayesian optimization.
https://experimental-design.github.io/bofire/
BSD 3-Clause "New" or "Revised" License

general uncertainty/optimization wrapper for any model #69

Open bertiqwerty opened 1 year ago

bertiqwerty commented 1 year ago

Uncertainty based on distance to the training data, combined with random sampling for the optimization, is used in basf/mbo's Random Forest implementation. This can be applied to any model that provides no uncertainty predictions and that might be difficult to optimize with gradient-based methods.
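For illustration, the idea in a minimal sketch (names are placeholders, not basf/mbo's or bofire's API): treat a candidate's minimum distance to the already-evaluated inputs as a stand-in for predictive uncertainty, so points far from the data count as more uncertain.

```python
import numpy as np
from scipy.spatial.distance import cdist

def distance_uncertainty(X_candidates: np.ndarray, X_train: np.ndarray) -> np.ndarray:
    # Candidates far from every observed input get a large "uncertainty".
    return cdist(X_candidates, X_train).min(axis=1)
```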

jduerholt commented 1 year ago

When doing so, please consider this branch: https://github.com/experimental-design/bofire/tree/feature/random_forest and this PR in botorch: https://github.com/pytorch/botorch/pull/1636. I hope the botorch PR is finished before the hackathon, but no guarantee. That PR will then also enable deep NN ensembles.

R-M-Lee commented 1 year ago

Will first implement a random forest model and then a strategy that uses it together with random sampling for the optimization, then generalize to arbitrary models in the next step.
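For the random forest part, one common way to get an uncertainty estimate is the spread of the individual trees' predictions; a hedged sklearn sketch (not necessarily what the feature branch does):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def rf_mean_std(model: RandomForestRegressor, X: np.ndarray):
    # Stack the per-tree predictions; their mean is the forest prediction
    # and their standard deviation serves as the uncertainty estimate.
    per_tree = np.stack([tree.predict(X) for tree in model.estimators_])
    return per_tree.mean(axis=0), per_tree.std(axis=0)
```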

R-M-Lee commented 1 year ago

@jduerholt this has been lying around too long and is getting stale, so I want to get it done. What we did at the hackathon needs heavy modification due to changes in main. I want to do this, but a few pointers from you would be great. What do you think of the following:

So the main points would be implementing `min_distance`, `uncertainty`, and `ask`, for which the code is essentially already there.
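Roughly, the pieces could fit together like this (a self-contained sketch with placeholder names and a plain `model.predict`, not the actual strategy code):

```python
import numpy as np
from scipy.spatial.distance import cdist

def ask(model, X_train, lower, upper, n_raw=1024, n_return=1, beta=2.0, seed=None):
    rng = np.random.default_rng(seed)
    # Optimize the acquisition by random sampling instead of gradients.
    X_cand = rng.uniform(lower, upper, size=(n_raw, len(lower)))
    mean = model.predict(X_cand)
    std = cdist(X_cand, X_train).min(axis=1)   # distance-based uncertainty
    scores = mean + np.sqrt(beta) * std        # UCB acquisition, fixed for now
    return X_cand[np.argsort(scores)[-n_return:]]
```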

How do you see this? Am I missing something, and which steps would you take otherwise?

jduerholt commented 1 year ago

Sounds fine to me! How would the uncertainty enter the acqf?

R-M-Lee commented 1 year ago

To start with, we will just fix UCB for this strategy. But I guess there's no reason we have to limit it to that (any callable that takes in a mean and a stdev and outputs an acqf value - no gradients required - would be fine for this).
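I.e. something with this shape (a sketch; the name and the `beta` parametrization are illustrative):

```python
import numpy as np

def ucb(mean: np.ndarray, std: np.ndarray, beta: float = 2.0) -> np.ndarray:
    # Upper confidence bound: reward both high predicted mean and high
    # uncertainty; no gradients are needed for random-sampling search.
    return mean + np.sqrt(beta) * std
```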