tBuLi / symfit

Symbolic Fitting; fitting as it should be.
http://symfit.readthedocs.org
MIT License

Lagrange Multipliers for constraints with any minimizer #238

Open tBuLi opened 5 years ago

tBuLi commented 5 years ago

Goal of this PR is to add the option for Lagrange multipliers to Fit. The basic API is as follows:

from symfit import parameters, variables, Model, Fit, Eq

x, y, l = parameters('x, y, l')
f, = variables('f')

model = Model({f: x + y})
constraints = {
    l: Eq(x ** 2 + y ** 2, 1),
}

fit = Fit(model, constraints=constraints)

When the constraints are provided as a dict, the keys are interpreted as Lagrange multipliers (parameters), and the values as the constraints in the usual way.

In the background, Fit will then determine the correct objective for f, or use the one provided by the user, and build the gradient of the Lagrangian function L (see the wiki page). For the example above the objective will be MinimizeModel, but in scenarios with data it will be LeastSquares instead.
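As a minimal sketch of the idea (not symfit's actual implementation), the Lagrangian for the example above and its gradient can be formed directly with sympy; the multiplier term uses the constraint rewritten as g = 0:

```python
# Sketch only: build L = objective + l * g and take its gradient with sympy.
import sympy as sp

x, y, l = sp.symbols('x y l')
objective = x + y                     # model expression from the example
g = x**2 + y**2 - 1                   # Eq(x**2 + y**2, 1) rewritten as g = 0
L = objective + l * g                 # Lagrangian
grad = [sp.diff(L, s) for s in (x, y, l)]
# grad is [1 + 2*l*x, 1 + 2*l*y, x**2 + y**2 - 1];
# the last component recovers the constraint itself.
```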

We then fit by solving for the point where the gradient of the Lagrangian equals zero, instead of minimizing L directly. This is because in general L is not well behaved: it may have stationary points but then drop off to -inf, so minimization does not find the desired constrained optimum but runs off to -inf instead.

This means we can introduce a new keyword to Fit: stationary_point. By default this is False, meaning we minimize the objective as usual. For any model, setting stationary_point=True will instead find where the gradient of the objective is zero. When the constraints are provided as a dict, stationary_point is forced to True. This keyword seems to be beneficial in general, which is why I chose to expose it; see the Mexican hat test in this PR.
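To make the stationary_point idea concrete (a conceptual sketch, not the symfit test from this PR): for a 1D "Mexican hat" potential f(x) = x⁴ - x², the stationary points are where f'(x) = 0, namely x = 0 and x = ±1/√2. Root-finding on the gradient locates these directly:

```python
# Sketch: stationary_point=True conceptually means solving f'(x) = 0
# instead of minimizing f. For f(x) = x**4 - x**2:
from scipy.optimize import root_scalar

def fprime(x):
    return 4 * x**3 - 2 * x   # gradient of the 1D Mexican hat

# f' changes sign on [0.1, 2.0], so a bracketed root-finder applies:
sol = root_scalar(fprime, bracket=[0.1, 2.0])
# sol.root is approximately 1/sqrt(2), the minimum on the positive branch
```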

To Do:

Jhsmit commented 5 years ago

Are you getting the stew or the classic vol-au-vent?

tBuLi commented 5 years ago

You guys had stew or vol-au-vent today? I had a hamburger with spinach; Popeye would approve.