facebook / Ax

Adaptive Experimentation Platform
https://ax.dev
MIT License

[GENERAL SUPPORT]: Getting extremely slow in multi-objective optimization #2859

Closed · howardwang1997 closed this 1 month ago

howardwang1997 commented 1 month ago

Question

Hi, I am trying to do a multi-objective optimization with AxClient. I am optimizing 2 objectives over 513 parameters. I basically adopted the default settings and first generated ~1000 samples with SOBOL; when it switches to BOTORCH, it gets extremely slow. I tried to specify ApproximateGPyTorchModel via GenerationStrategy(steps=[GenerationStep(surrogate=Surrogate(ApproximateGPyTorchModel))]) when defining the AxClient, but it raises: RuntimeError: All MarginalLogLikelihood objects must be given a GP object as a model. If you are using a more complicated model involving a GP, pass the underlying GP object as the model, not a full PyTorch module.

Is there any advice to accelerate this generation with BoTorch? Thank you very much.

Please provide any relevant code snippet if applicable.

No response


Balandat commented 1 month ago

It's not surprising that things are slow: we use hypervolume computations to perform multi-objective optimization, and these can get very expensive when the Pareto frontier is complex, which may well be the case if you already have 1000 initial data points.
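To make the cost concrete, here is a pure-Python sketch of the two ingredients involved: extracting the Pareto front from observed points and computing the 2-objective hypervolume against a reference point (assuming minimization of both objectives). This is an illustration only, not Ax's or BoTorch's implementation, which handle more objectives and are far more optimized.

```python
import random

def pareto_front(points):
    """Return the non-dominated subset (minimization of both objectives)."""
    return [
        p for p in points
        if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)
    ]

def hypervolume_2d(front, ref):
    """Area dominated by `front` w.r.t. reference point `ref` (minimization).

    Sort the front ascending in the first objective; the second objective is
    then descending, so the dominated region decomposes into rectangles.
    """
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in sorted(front):
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

# With many observed points, the front (and hence the hypervolume
# bookkeeping the acquisition function must do) keeps growing.
random.seed(0)
pts = [(random.random(), random.random()) for _ in range(1000)]
front = pareto_front(pts)
print(len(front), round(hypervolume_2d(front, ref=(1.0, 1.0)), 3))
```

The toy `pareto_front` above is already O(n²) in the number of observations, and hypervolume-based acquisition functions must reason about changes to this quantity for every candidate, which is where the slowdown after ~1000 SOBOL points comes from.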

How expensive is it for you to generate a data point? If you can easily evaluate 1000 points, then multi-objective BO may not be the best approach for your problem.

howardwang1997 commented 1 month ago

Thank you for the reply; that makes sense. It is very cheap to generate and evaluate data points, so maybe I should try single-objective BO for each objective? Is that expected to be faster?

Balandat commented 1 month ago

Single-objective BO for different objectives will be faster, but if you do need to solve an actual multi-objective problem it may not be the right problem to solve.

Bayesian Optimization has a lot of compute overhead relative to other methods, so if evaluating the function(s) you need to optimize is very cheap, it may not be the best fit. Have you looked into alternatives? Evolutionary algorithms such as NSGA-II are a common choice for multi-objective optimization when evaluations are cheap, and a number of Python packages provide them, e.g. pymoo.

howardwang1997 commented 1 month ago

Thank you for the advice! I am also looking into evolutionary algorithms.