facebook / Ax

Adaptive Experimentation Platform
https://ax.dev

Applying complex constraints #2299

Closed StanleyYoo closed 1 month ago

StanleyYoo commented 5 months ago

Hi, I am trying to apply complex constraints to the SearchSpace. The SearchSpace consists of six RangeParameters: x00, x01, x02, x03, x04, and x05. The constraint I need to apply is that the discriminant computed by the following procedure must be non-negative, i.e. discriminant >= 0:

from math import cos, radians, sin

x05_rad = radians(x05)
x = x04 * sin(x05_rad)
y = x04 * cos(x05_rad)
x01_rad = radians(90 - x01)
# vector_from_polar is a user-defined helper that converts a
# (magnitude, angle) pair into (x, y) velocity components.
own_x_vel, own_y_vel = vector_from_polar(x00, x01_rad)
y_over_x = -(x / y)
# Coefficients of the quadratic whose real-root condition we want to enforce.
a = 1**2 + y_over_x**2
b = (2 * own_x_vel) + (2 * y_over_x * own_y_vel)
c = own_x_vel**2 + own_y_vel**2 - x02**2
discriminant = b**2 - (4 * a * c)

However, ParameterConstraint only supports very simple (affine) constraints. How could this be resolved?

search_space = SearchSpace(
    parameters=parameters,
    parameter_constraints=parameter_constraints,
)
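
For reference, here is a minimal runnable sketch of what this API does accept: ParameterConstraint encodes only affine inequalities of the form w1*x1 + ... + wn*xn <= bound, so the discriminant above cannot be expressed this way. The bounds and coefficients below are illustrative, not the real problem's values.

from ax.core.parameter import ParameterType, RangeParameter
from ax.core.parameter_constraint import ParameterConstraint
from ax.core.search_space import SearchSpace

# Six illustrative range parameters x00..x05, each on [0, 1].
parameters = [
    RangeParameter(
        name=f"x{i:02d}", parameter_type=ParameterType.FLOAT, lower=0.0, upper=1.0
    )
    for i in range(6)
]

# The only supported form: an affine inequality, here x00 + 2*x01 <= 1.5.
parameter_constraints = [
    ParameterConstraint(constraint_dict={"x00": 1.0, "x01": 2.0}, bound=1.5)
]

search_space = SearchSpace(
    parameters=parameters,
    parameter_constraints=parameter_constraints,
)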

Thank you in advance. Stanley

Balandat commented 5 months ago

As you said, the standard parameter constraints do not support complex nonlinear constraints. This is for a few reasons, not least that this makes the acquisition function optimization a lot more challenging. It also makes it a lot harder to deal with the parameter transformations that Ax applies under the hood. See https://github.com/facebook/Ax/discussions/1797#discussioncomment-6827496 for a more detailed discussion.

What is the mathematical problem you're trying to solve? Is there a way to express the constraints in other coordinates?

StanleyYoo commented 5 months ago

Hi Balandat, thanks for your response; the discussion at #1797 is well understood. Do you have any suggestions for addressing my issue? The mathematical problem I need to solve is confining the SearchSpace to points that satisfy discriminant = b**2 - (4 * a * c) >= 0. Otherwise, samples drawn from the SearchSpace are passed to the next step and fail there, and the entire BO process stops because that step yields an empty result; the next step only works when a real solution exists, i.e. when the discriminant is non-negative. I have thought about including 'discriminant' in the SearchSpace alongside the other parameters x00 through x05, but that hasn't worked, since Ax then treats 'discriminant' as an independent parameter with no tie to the others. If you have any idea how to resolve this, or even how to discard/abandon samples that violate the criterion discriminant >= 0, please let me know!
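
For concreteness, the discard/abandon idea could look roughly like the following loop against the AxClient service API; compute_discriminant and evaluate are hypothetical stand-ins for the user's own code.

# Given an already-configured AxClient instance `ax_client`;
# compute_discriminant and evaluate are hypothetical user-supplied functions.
for _ in range(30):
    params, trial_index = ax_client.get_next_trial()
    if compute_discriminant(params) < 0:
        # Infeasible candidate: tell Ax not to expect data for this trial.
        ax_client.abandon_trial(trial_index=trial_index, reason="discriminant < 0")
        continue
    ax_client.complete_trial(trial_index=trial_index, raw_data=evaluate(params))

Note that this only throws candidates away: the model never learns where the infeasible region is, and the loop can stall if most of the space is infeasible. Option 2 in the reply further down addresses that more directly.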

bernardbeckerman commented 2 months ago

We've recently deprecated discussions, which unfortunately killed @Balandat's link above. For posterity I'm copying the question and answer this refers to here.

Question from Stefan2016 on Aug 25, 2023

Dear all,

I have read in the BoTorch repo that nonlinear constraints are now possible. From my understanding, this should now also be possible with Ax; is this correct?

E.g., could x^2 + y^2 ≤ 25 be passed as a parameter constraint in the following code?

ax_client.create_experiment(
    name="hartmann_test_experiment",
    parameters=[
        {
            "name": "x1",
            "type": "range",
            "bounds": [0.0, 1.0],
            "value_type": "float",  # Optional, defaults to inference from type of "bounds".
            "log_scale": False,  # Optional, defaults to False.
        },
        {
            "name": "x2",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x3",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x4",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x5",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x6",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
    ],
    objectives={"hartmann6": ObjectiveProperties(minimize=True)},
    parameter_constraints=["x1 + x2 <= 2.0"],  # Optional.
    outcome_constraints=["l2norm <= 1.25"],  # Optional.
)

Answer from Balandat on Aug 25, 2023

Hi @Stefan2016, unfortunately this is currently not easily possible. The string representations passed to the parameter_constraints arg are parsed internally into affine constraint objects, whose coefficients and right-hand sides are passed down to the BoTorch models. This currently does not support nonlinear expressions. The main difficulties here are:

  1. To use this in the optimization, we need to construct a callable that we can pass to the optimizer (with the default settings, these would be the optimizers in scipy.optimize, which are expected to map numpy arrays to numpy floats). It's hard to do that without allowing something like Python's eval() (which we want to avoid) or doing some reasonably bespoke string parsing. Another option would be to directly pass such a callable, but that has issues as well (see below).
  2. Serialization: Ax allows serializing the experiment to JSON / a database. While this works with a string representation, it wouldn't work with an arbitrary Python callable defining the nonlinear constraints.
  3. Transforms: Ax applies some reasonably complicated transformations to the parameters and outcomes in its modeling layer (in order to be able to use consistent priors on model hyperparameters and avoid numerical issues). While applying these transformations to a linear constraint mapping is reasonably straightforward (and we can use the transformed constraint in the transformed space), it is not for arbitrary nonlinear constraints.

In short, there are a bunch of challenges here. We'd like to enable this feature but it's not straightforward and we don't have any concrete plans to work on it in the near future.
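
To make point 1 above concrete: the optimizer-facing object that a nonlinear constraint string would have to be turned into looks roughly like the sketch below, using plain scipy.optimize (not an Ax API) and the x^2 + y^2 <= 25 example from the question.

import numpy as np
from scipy.optimize import NonlinearConstraint, minimize

# The constraint x1**2 + x2**2 <= 25 as a callable mapping a numpy array to a
# float; this is the object a string like "x1**2 + x2**2 <= 25" would need to become.
circle = NonlinearConstraint(lambda x: x[0] ** 2 + x[1] ** 2, -np.inf, 25.0)

# Toy usage: minimize the squared distance to (4, 4) subject to the constraint;
# the solution lands on the circle boundary near (3.54, 3.54).
result = minimize(
    lambda x: (x[0] - 4.0) ** 2 + (x[1] - 4.0) ** 2,
    x0=np.array([0.0, 0.0]),
    constraints=[circle],
)
print(result.x)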

Balandat commented 2 months ago

@StanleyYoo since these are highly nonlinear constraints, it's not straightforward to support them easily via the AxClient API. You essentially have two options:

  1. Use our low-level API and our EXPERIMENTAL support for nonlinear inequality constraints on the parameters there. See https://github.com/facebook/Ax/issues/769 for a long discussion of the topic and some examples. Note that there are lots of gotchas here (e.g., this really only works if your search space is already the unit cube, and potentially others mentioned in the discussion).
  2. Simply define a new metric: for each parameterization, compute the discriminant and return it as the metric value, then impose an outcome constraint requiring that metric to be greater than zero (see the sketch below). This means the surrogate model will have to learn the discriminant as a function of the parameters, so it is not going to be super efficient, but it is the easiest to hook up.
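
A minimal sketch of option 2 via the AxClient service API; objective_fn and compute_discriminant are hypothetical stand-ins for the user's own objective and the discriminant computation from the original post, and the parameter bounds are illustrative.

from ax.service.ax_client import AxClient, ObjectiveProperties

ax_client = AxClient()
ax_client.create_experiment(
    name="discriminant_constrained",
    parameters=[
        {"name": f"x{i:02d}", "type": "range", "bounds": [0.0, 1.0]}
        for i in range(6)
    ],
    objectives={"objective": ObjectiveProperties(minimize=True)},
    # Outcome constraint on the new metric, as described in option 2.
    outcome_constraints=["discriminant >= 0.0"],
)

for _ in range(30):
    params, trial_index = ax_client.get_next_trial()
    ax_client.complete_trial(
        trial_index=trial_index,
        raw_data={
            "objective": objective_fn(params),
            # Report the discriminant as a metric so the surrogate can model it
            # and steer candidates toward the feasible region.
            "discriminant": compute_discriminant(params),
        },
    )

This trades efficiency for simplicity: the surrogate has to learn the (known, deterministic) discriminant function from samples, but no changes to the optimizer are needed.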
mgarrard commented 1 month ago

@StanleyYoo closing due to lack of activity. If you need additional support please feel free to re-open or start a new issue :)