bertiqwerty opened 1 year ago
When doing so, please consider this branch: https://github.com/experimental-design/bofire/tree/feature/random_forest and this PR in botorch: https://github.com/pytorch/botorch/pull/1636. I hope the botorch PR is finished before the hackathon, but there is no guarantee. This PR will then also enable deep NN ensembles.
First implement a random forest model and then a strategy that uses it along with random sampling for the optimization. In the next step, generalize to arbitrary models.
@jduerholt this has been lying around too long and is getting stale, so I want to get it done. What we did at the hackathon needs heavy modification due to changes in main. I want to do this, but a few pointers from you would be great. What do you think of the following:
So the main points would be implementing min_distance, uncertainty, and ask, for which the code is essentially already there.
How do you see this? Am I missing something, and what steps would you otherwise take?
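For concreteness, here is a minimal sketch of how those three pieces could fit together, assuming a plain scikit-learn random forest, box bounds, and uncertainty taken as the minimum distance to the training data; the names (`min_distance`, `ask_random_search`) and signatures are illustrative, not BoFire's actual API:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor


def min_distance(samples: np.ndarray, X_train: np.ndarray) -> np.ndarray:
    """Distance-based uncertainty: minimum Euclidean distance of each sample to the training data."""
    dists = np.linalg.norm(samples[:, None, :] - X_train[None, :, :], axis=-1)
    return dists.min(axis=1)


def ask_random_search(
    model: RandomForestRegressor,
    X_train: np.ndarray,
    lower: np.ndarray,
    upper: np.ndarray,
    acqf,  # callable(mean, std) -> scores, no gradients needed
    n_samples: int = 4096,
    n_candidates: int = 1,
) -> np.ndarray:
    """Score uniform random samples with an acquisition function and return the best ones."""
    rng = np.random.default_rng(0)
    # Draw random candidates inside the box bounds.
    samples = rng.uniform(lower, upper, size=(n_samples, len(lower)))
    # Model mean plus distance-based uncertainty feed the acquisition function.
    mean = model.predict(samples)
    std = min_distance(samples, X_train)
    scores = acqf(mean, std)
    # Keep the highest-scoring candidates (maximization).
    best = np.argsort(scores)[-n_candidates:]
    return samples[best]
```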
Sounds fine to me! How would the uncertainty enter the acqf?
To start with, we will just have UCB fixed for this strategy. But I guess there's no reason we have to limit it to UCB (any callable that takes a mean and a stdev and outputs an acqf value, no gradients required, would be fine for this).
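A hedged sketch of what such a callable could look like for maximization; the name `ucb` and the fixed `beta` are just illustrative choices:

```python
import numpy as np


def ucb(mean: np.ndarray, std: np.ndarray, beta: float = 2.0) -> np.ndarray:
    """Upper confidence bound: trade off predicted mean against uncertainty, no gradients needed."""
    return mean + beta * std
```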
Uncertainty based on distance, together with random sampling for the optimization, is used in basf/mbo's Random Forest implementation. This can be applied to any model that does not provide uncertainty predictions and that might otherwise be difficult to optimize.
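Putting the hypothetical pieces above together, usage could look roughly like this; it is a sketch under the assumptions stated earlier, not the actual bofire or basf/mbo interface:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, size=(20, 2))
y_train = np.sin(3.0 * X_train[:, 0]) + X_train[:, 1]

# Fit the random forest on the training data, then ask for new candidates
# by scoring random samples with the UCB callable defined above.
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
candidates = ask_random_search(
    model, X_train, lower=np.zeros(2), upper=np.ones(2), acqf=ucb, n_candidates=2
)
print(candidates)
```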