facebook / Ax

Adaptive Experimentation Platform
https://ax.dev
MIT License

MultiObjective optimization with missing values #1579

Closed Kh-im closed 1 year ago

Kh-im commented 1 year ago

Hello,

I'm trying to do a multi-objective (2 objectives) optimization, but some values for objective 2 are missing in my data. I don't want to delete the data points where only objective 1 is available. Is there something I can do about this?

Thanks for your help

Balandat commented 1 year ago

Which API are you using? In general, Ax supports multi-objective optimization even when the objectives are observed at different points, certainly via the Developer API. There may be some limitation with the Service API right now; if there is, we need to fix that.
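
For example, with the Developer API you can attach data whose rows cover different metrics for different arms. A minimal sketch, assuming an already-configured Experiment `exp` with two metrics named "objective_1" and "objective_2" (the experiment, metric names, and values here are illustrative, not from your setup):

```python
import pandas as pd

from ax.core.data import Data

# Rows are (arm, metric) pairs, so an arm can report only a subset of metrics.
df = pd.DataFrame.from_records([
    {"trial_index": 0, "arm_name": "0_0", "metric_name": "objective_1", "mean": 1.2, "sem": 0.0},
    {"trial_index": 0, "arm_name": "0_0", "metric_name": "objective_2", "mean": 0.7, "sem": 0.0},
    # This arm only has an observation for objective_1.
    {"trial_index": 1, "arm_name": "1_0", "metric_name": "objective_1", "mean": 0.9, "sem": 0.0},
])
exp.attach_data(Data(df=df))
```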

Kh-im commented 1 year ago

I'm using the Service API, but even with the Developer API, sorry, it's not clear to me how you can "write" different data points for each objective in the same trial.

Kh-im commented 1 year ago

I tried to do it step by step between BoTorch and Ax. I can't get past fit_gpytorch_mll at the first step. If you have two objectives evaluated on the same dataset (with 9 points, for example) and the last three values of objective 2 are missing, fit_gpytorch_mll fails because in torch\distributions\multivariate_normal it raises:

```
    209 if self._validate_args:
    210     self._validate_sample(value)
--> 211 diff = value - self.loc
    212 M = _batch_mahalanobis(self._unbroadcasted_scale_tril, diff)
    213 half_log_det = self._unbroadcasted_scale_tril.diagonal(dim1=-2, dim2=-1).log().sum(-1)

RuntimeError: The size of tensor a (6) must match the size of tensor b (9) at non-singleton dimension 0
```

sdaulton commented 1 year ago

I just tested this using the MOO tutorial that uses the service API: https://github.com/facebook/Ax/blob/main/tutorials/multiobjective_optimization.ipynb

This works fine if some designs only have evaluations for 1 of the objectives.
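
For reference, a minimal sketch of what that looks like with the Service API (the parameter space, the metric names "a" and "b", and the toy evaluation functions below are illustrative, not taken from the tutorial):

```python
from ax.service.ax_client import AxClient
from ax.service.utils.instantiation import ObjectiveProperties


def eval_a(p):  # toy objective, stands in for a real evaluation
    return p["x1"] + p["x2"]


def eval_b(p):  # toy objective that is sometimes not measured
    return p["x1"] * p["x2"]


ax_client = AxClient()
ax_client.create_experiment(
    name="moo_with_missing_values",
    parameters=[
        {"name": "x1", "type": "range", "bounds": [0.0, 1.0]},
        {"name": "x2", "type": "range", "bounds": [0.0, 1.0]},
    ],
    objectives={
        "a": ObjectiveProperties(minimize=False),
        "b": ObjectiveProperties(minimize=False),
    },
)

for i in range(9):
    params, trial_index = ax_client.get_next_trial()
    if i < 6:
        raw_data = {"a": (eval_a(params), 0.0), "b": (eval_b(params), 0.0)}
    else:
        # Objective "b" is missing for the last three trials; report only "a".
        raw_data = {"a": (eval_a(params), 0.0)}
    ax_client.complete_trial(trial_index=trial_index, raw_data=raw_data)
```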

Balandat commented 1 year ago

> I can't get past fit_gpytorch_mll at the first step because in torch\distributions\multivariate_normal

Do you have a full example for this? I'm not entirely sure what code you're running so it's hard to diagnose what's going on. One thing to note is that a model such as SingleTaskGP does allow multiple outputs, but requires the same number of training observations for all outputs. If that is not the case (e.g. in your setting) then you'd have to use multiple individual models; you can combine those together in a ModelListGP wrapper to have the same multi-output API. Ax does this under the hood (which is why Sam's setting above works - @sdaulton could you share the modified example so @Kh-im sees exactly what code you ran?).
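
For the raw BoTorch route, here is a minimal sketch of that ModelListGP pattern on toy data mirroring the 9-vs-6 observation setting described above (the data itself is made up):

```python
import torch

from botorch.fit import fit_gpytorch_mll
from botorch.models import ModelListGP, SingleTaskGP
from gpytorch.mlls import SumMarginalLogLikelihood

# 9 designs evaluated on objective 1, but only the first 6 on objective 2.
train_X = torch.rand(9, 2, dtype=torch.double)
Y1 = train_X.sum(dim=-1, keepdim=True)
Y2 = (train_X[:6] ** 2).sum(dim=-1, keepdim=True)

# One single-output GP per objective, each with its own training set.
model = ModelListGP(
    SingleTaskGP(train_X, Y1),
    SingleTaskGP(train_X[:6], Y2),
)

# fit_gpytorch_mll dispatches on SumMarginalLogLikelihood and fits each
# sub-model separately, so the differing numbers of observations are fine.
mll = SumMarginalLogLikelihood(model.likelihood, model)
fit_gpytorch_mll(mll)

# The fitted model list still exposes a joint multi-output posterior.
posterior = model.posterior(torch.rand(4, 2, dtype=torch.double))
print(posterior.mean.shape)  # torch.Size([4, 2])
```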

sdaulton commented 1 year ago

Yeah here it is: multiobjective_optimization.ipynb.txt

Kh-im commented 1 year ago

Thanks a lot, it's working perfectly with the Service API. The example is very clear!