**Closed** — Abrikosoff closed this issue 3 months ago
@Abrikosoff sorry for the delayed reply here. I would actually add this metric as an outcome constraint instead of an objective: you can set a pair of constraints to bound the range. In the example below this is represented by "metric_b".
```python
from ax.core.objective import MultiObjective, Objective
from ax.core.optimization_config import MultiObjectiveOptimizationConfig
from ax.core.outcome_constraint import ObjectiveThreshold, OutcomeConstraint
from ax.core.types import ComparisonOp

optimization_config = MultiObjectiveOptimizationConfig(
    objective=MultiObjective(
        objectives=[
            Objective(
                metric=experiment.metrics["metric_a"],
                minimize=True,
            ),
        ]
    ),
    objective_thresholds=[
        ObjectiveThreshold(
            metric=experiment.metrics["metric_a"],
            op=ComparisonOp.LEQ,
            bound=0.0,
            relative=True,
        ),
    ],
    outcome_constraints=[
        # The two constraints on metric_b together enforce the range [0.0, 5.0].
        OutcomeConstraint(
            metric=experiment.metrics["metric_b"],
            op=ComparisonOp.GEQ,
            bound=0.0,
            relative=True,
        ),
        OutcomeConstraint(
            metric=experiment.metrics["metric_b"],
            op=ComparisonOp.LEQ,
            bound=5.0,
            relative=True,
        ),
    ],
)
```
Closing this out due to lack of activity, but please reopen or raise a new issue for further assistance.
@mgarrard Hi Mia, thanks for getting back with this solution! It is still very much needed.
I have a multi-objective optimization scenario where, for one of the targets, I do not have a preferred optimization direction, but I do have a range into which the optimizer's output values should fall. Previously this situation has been dealt with in an ad hoc manner by simply dropping that target, but this doesn't seem correct and isn't really in the spirit of performing MOBO; it also leaves the range constraint unenforced.
In general this could be handled by defining a function that maps all values outside the required range back toward the support of that target, and then setting the optimization direction to minimization (for example, the map function could be defined to be quadratic within the range), but I am unsure where this map function should live. My idea is to define a custom metric which outputs the evaluation result of this map function, but this would:
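For concreteness, here is a minimal sketch of one possible form of such a map function, assuming (hypothetically) that the desired range is [0.0, 5.0] as in the constraint example above. The exact shape is a design choice; this version is zero inside the range and grows as the squared distance outside it, so minimizing it pulls the metric back into the range. The name `range_penalty` is mine, not from any library:

```python
def range_penalty(y: float, lo: float = 0.0, hi: float = 5.0) -> float:
    """Hypothetical map: zero inside [lo, hi], squared distance to the range outside.

    Minimizing this value drives the underlying metric back into the range.
    """
    if y < lo:
        return (lo - y) ** 2
    if y > hi:
        return (y - hi) ** 2
    return 0.0

print(range_penalty(2.5))   # inside the range  -> 0.0
print(range_penalty(-1.0))  # below the range   -> 1.0
print(range_penalty(7.0))   # above the range   -> 4.0
```

A custom metric could then report `range_penalty(raw_value)` and be minimized, though as the reply above notes, expressing the range directly as a pair of outcome constraints avoids needing any such transformation.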
Any ideas or suggestions would be really appreciated!