Closed mrcslws closed 4 years ago
Hi, thanks for the great repro.
(Am I doing something silly?)
No.
Is the limit of 1112 trials known? Is it By Design?
It's a limitation of the quasirandom (Sobol) generator in PyTorch that we use under the hood. It's not by design - it's an upstream limitation: the sampler only works up to dimension 1111. While it would be possible to change that, it would be a lot of work, and so far it hasn't been a priority.
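To make the upstream constraint concrete, here is a minimal sketch of the kind of dimensionality guard the Sobol engine enforces. The class name, constant name, and structure below are illustrative only - this is not PyTorch's actual implementation, just the shape of the check that produces the hard limit discussed in this thread.

```python
# Illustrative sketch of a Sobol-style dimensionality guard.
# MAXDIM reflects the historical PyTorch limit discussed in this thread;
# SobolLikeEngine is a hypothetical stand-in, not torch.quasirandom.SobolEngine.

MAXDIM = 1111  # highest supported Sobol dimension at the time of this issue


class SobolLikeEngine:
    def __init__(self, dimension: int):
        # Reject any dimension outside the supported range up front.
        if not 1 <= dimension <= MAXDIM:
            raise ValueError(
                f"Supported dimensionality is [1, {MAXDIM}], got {dimension}"
            )
        self.dimension = dimension


engine = SobolLikeEngine(1111)  # at the limit: fine
try:
    SobolLikeEngine(1112)       # one past the limit: rejected
except ValueError as err:
    print(err)
```

Any code path that needs a qMC sample of dimension 1112 or higher hits this guard, which is why the error surfaces at a fixed trial count rather than degrading gradually.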
Taking a step back, the sampler is used for drawing qMC samples from a high-dimensional posterior distribution of a Gaussian Process model, an operation performed in the qNoisyExpectedImprovement acquisition function. The prune_baseline=True that you mentioned is designed to speed things up by considering only those trials that are (or could be) relevant to optimize further. Looking at the code, it seems we don't handle the case where that function is given a large number of trials. I can put up a PR in botorch to fix this.
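A minimal sketch of the idea behind this kind of pruning, assuming we estimate which observed points could still be best by sampling from the model's posterior over them. The function name and all details below are hypothetical illustrations of the concept - this is not BoTorch's actual prune_inferior_points implementation.

```python
import numpy as np


def prune_baseline_points(posterior_samples: np.ndarray) -> np.ndarray:
    """Keep only baseline points that could still be the best.

    posterior_samples: (num_samples, num_points) array of sampled objective
    values at the observed points (higher is better). Returns the indices of
    points that come out best in at least one posterior sample.
    """
    best_per_sample = posterior_samples.argmax(axis=1)
    return np.unique(best_per_sample)


# Toy posterior: four observed points, two of which are clearly inferior.
rng = np.random.default_rng(0)
means = np.array([0.0, 5.0, 4.9, -3.0])
samples = rng.normal(means, 0.5, size=(256, 4))

keep = prune_baseline_points(samples)
print(keep)  # only the competitive points (indices 1 and 2) survive
```

Pruning clearly dominated points shrinks the set that qNoisyExpectedImprovement must consider jointly, which is exactly why disabling it slows things down on experiments with many trials.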
Is there a good reason that the multi-objective case has a lower limit? In hyperparameter search you generally need more trials for multi-objective experiments, not fewer.
What's important here is the dimensionality of the distribution from which you sample. Since this distribution is in the outcome space, its dimension is q * m, where q is the number of points considered jointly and m is the dimensionality of the output. So for the sampler to work you need n * m <= 1111, since qNoisyExpectedImprovement includes the n observed trials in the joint computation.
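The arithmetic above can be checked directly: if the qMC sample dimension grows as n * m and the sampler caps out at 1111 dimensions, the maximum trial count is roughly 1111 // m. A small sketch (the helper name is mine, not an Ax/BoTorch API):

```python
# Back-of-the-envelope check of the n * m constraint described above.
# max_trials is a hypothetical helper, not part of Ax or BoTorch.

MAXDIM = 1111  # upstream Sobol dimension limit


def max_trials(num_objectives: int) -> int:
    """Largest trial count n such that n * num_objectives stays <= MAXDIM."""
    return MAXDIM // num_objectives


print(max_trials(1))  # 1111 -> errors start at trial 1112
print(max_trials(2))  # 555  -> errors start at trial 556
print(max_trials(3))  # 370
```

This matches the limits reported in the issue: 1112 trials for a single objective and 556 for a two-metric MultiObjective are exactly the first counts that push n * m past 1111.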
I tried setting prune_baseline=False in "acquisition_function_kwargs", but the error still eventually occurs on another codepath. So I haven't found a workaround.
Interesting, this shouldn't happen; I'll also take a look there. In this case you should definitely not need to set prune_baseline to False - that will slow things down significantly.
I'm hitting a hard limit on the number of trials in an experiment. With a single-metric objective this limit is 1112 trials. With a two-metric MultiObjective, the limit is 556. In each case, I get an error:
Presumably this MultiObjective trend continues: 3-objective models will have a limit of 1112/3 trials, and so on.
There are two separate lines of questions here:
Single-objective demo
Result
The BoTorch model stops working once there are more than 1111 results.
I tried setting prune_baseline=False in "acquisition_function_kwargs", but the error still eventually occurs on another codepath. So I haven't found a workaround.