facebook / Ax

Adaptive Experimentation Platform
https://ax.dev

[Question] Multiobjective optimization where one target has no optimization direction (but required range) #2404

Closed Abrikosoff closed 3 months ago

Abrikosoff commented 5 months ago

I have a multiobjective optimization scenario where, for one of the targets, I do not have a preferred optimization direction, but I do have a range into which the output values from the optimizer should fall. Previously this situation has been dealt with in an ad hoc manner by simply dropping that particular target, but this does not seem correct and is not really in the spirit of MOBO; it also leaves the range constraint unenforced.

In general this could be handled by defining a function that maps all values outside of the required range back into it, and then setting the optimization direction to minimization (for example, the map could be zero inside the range and grow quadratically outside it; a sketch follows the list below). However, I am unsure where this map function should live. My idea is to define a custom metric that outputs the evaluation result of this map function, but this would:

  1. require use of the Developer API;
  2. use Metrics (in my understanding) in a way they are not meant to be used, i.e., after this mapping the output of the Metric is no longer the direct evaluation result.
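
For concreteness, here is a minimal sketch of the map I have in mind (the name range_penalty and the quadratic form are just placeholder assumptions, not anything from Ax):

# Hypothetical range penalty: zero inside [lower, upper], growing
# quadratically outside, so that minimizing it pushes values into the range.
def range_penalty(y: float, lower: float, upper: float) -> float:
    if y < lower:
        return (lower - y) ** 2
    if y > upper:
        return (y - upper) ** 2
    return 0.0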

Any ideas or suggestions would be really appreciated!

mgarrard commented 3 months ago

@Abrikosoff sorry for the delayed reply here. I would actually add this metric as an outcome constraint instead of an objective -- you can set a pair of constraints (a lower and an upper bound) on the same metric to enforce the range. In the example below this is represented by "metric_b".

from ax.core.objective import MultiObjective, Objective
from ax.core.optimization_config import MultiObjectiveOptimizationConfig
from ax.core.outcome_constraint import ObjectiveThreshold, OutcomeConstraint
from ax.core.types import ComparisonOp

optimization_config = MultiObjectiveOptimizationConfig(
    objective=MultiObjective(
        objectives=[
            Objective(
                metric=experiment.metrics["metric_a"],
                minimize=True,
            ),
        ]
    ),
    objective_thresholds=[
        ObjectiveThreshold(
            metric=experiment.metrics["metric_a"],
            op=ComparisonOp.LEQ,
            bound=0.0,
            relative=True,
        ),
    ],
    # Two constraints on the same metric bracket the required range.
    # Note: relative=True interprets the bound as a percent change from
    # the status quo arm; use relative=False for absolute bounds.
    outcome_constraints=[
        OutcomeConstraint(
            metric=experiment.metrics["metric_b"],
            op=ComparisonOp.GEQ,
            bound=0.0,
            relative=True,
        ),
        OutcomeConstraint(
            metric=experiment.metrics["metric_b"],
            op=ComparisonOp.LEQ,
            bound=5.0,
            relative=True,
        ),
    ],
)
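
Assuming experiment here is a Developer API Experiment object, you can then attach this config directly:

experiment.optimization_config = optimization_config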

Closing this out due to lack of activity, but please reopen or raise a new issue for further assistance.

Abrikosoff commented 3 months ago

@mgarrard Hi Mia, thanks for getting back with this solution! It is still very much needed.