Open odow opened 8 months ago
What about
@constraint(model, [x, F(x), b - A * x] in MOI.VariationalInequality(2length(x), MOI.Nonnegatives(size(A, 1))))
It generalizes
@constraint(model, [x, F(x)] in MOI.Complements(2length(x)))
as MOI.Complements(n) would then be equivalent to MOI.VariationalInequality(n, MOI.Nonnegatives(0)).
The advantage of this approach is that you could write a bridge that gets the polyhedron description, computes the double description, and then does some fancy reformulation.
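For the record, a minimal sketch of what such a set could look like (the name VariationalInequality, its fields, and the dimension convention below are assumptions based on the syntax above, not existing MOI API):
import MathOptInterface as MOI
# Hypothetical set: the first `primal_dimension` entries of the constraint function
# hold [x; F(x)], and the remaining entries (b - A * x above) must lie in `set`,
# which encodes the polyhedron C = {x : A * x <= b}.
struct VariationalInequality{S<:MOI.AbstractVectorSet} <: MOI.AbstractVectorSet
    primal_dimension::Int  # 2 * length(x), i.e. the length of [x; F(x)]
    set::S                 # e.g. MOI.Nonnegatives(size(A, 1))
end
MOI.dimension(s::VariationalInequality) = s.primal_dimension + MOI.dimension(s.set)
A bridge could then read s.set and the trailing MOI.dimension(s.set) rows of the function to recover the polyhedron.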
Another approach is using constraint attributes, but that's a bit orthogonal to the current design of the other constraints, where we prefer a nice self-contained function-in-set representation (which also plays better with MathOptSetDistances, Dualization, and the rest of our abstraction) rather than having the information spread out.
@constraint(model, [x, F(x), b - A * x] in MOI.VariationalInequality(2length(x), MOI.Nonnegatives(size(A, 1))))
I was thinking about something like this, but it's pretty clunky to build. I think people would want to build up the polyhedron with @constraint, instead of providing b and A directly.
@xhub do you have some simple pedagogical examples of polyhedral VIs that people would want to solve? (In the vein of https://jump.dev/JuMP.jl/stable/tutorials/nonlinear/complementarity/)
Pick whatever syntax you like for now.
Many interesting AVIs come from solving Nash equilibria, so those are not the simplest examples.
I do have one example of solving a friction contact problem. The next 2 weeks are quite busy; I'll try to provide something afterwards.
Okay, so we could make some nice syntax:
model = Model()
@variable(model, x[1:2] >= 0)
F(x) = [1 - x[1], 1 - x[2]]
C = [@build_constraint(sum(x) == 1)]
@constraint(model, F(x) ⟂ x, C)
which lowers to:
model = MOI.Utilities.Model{Float64}()
x = MOI.add_variables(model, 2)
MOI.add_constraint.(model, x, MOI.GreaterThan(0.0))
F(x) = [1 - x[1], 1 - x[2]]
C_f = [sum(x) - 1]
C_s = MOI.Zeros(1)
f = MOI.Utilities.operate(vcat, Float64, F(x), x, C_f)
s = MOI.VariationalInequality(2 * length(x), C_s)
MOI.add_constraint(model, f, s)
The downside is that you'd have to provide the VI in a single call. It wouldn't necessarily make sense to do something like:
model = Model()
@variable(model, x[1:2] >= 0)
C = [@build_constraint(sum(x) == 1)]
@constraint(model, 1 - x[1] ⟂ x[1], C)
@constraint(model, 1 - x[2] ⟂ x[2])
I must say I always found it a bit weird that the complementarity constraint implicitly uses the variable bounds in:
model = Model(PATHSolver.Optimizer)
set_silent(model)
@variable(model, 0 <= x[1:4] <= 10, start = 0)
@constraint(model, M * x + q ⟂ x)
maybe this is a good opportunity to improve the syntax (of course we'd need to keep the old one to not break code).
The standard complementarity could be something like:
model = Model(PATHSolver.Optimizer)
set_silent(model)
@variable(model, x[1:4], start = 0)
@constraint(model, M * x + q ⟂ x, begin
    [i = 1:4], 0 <= x[i] <= 10
end)
And the new VI could be:
model = Model(PATHSolver.Optimizer)
set_silent(model)
@variable(model, x[1:2])
@constraint(model, [1 - x[1], 1 - x[2]] ⟂ x, begin
    sum(x) == 1
end)
@build_constraint feels unnatural for the base use case.
I could understand it being used as secondary syntax:
model = Model()
@variable(model, x)
@variable(model, y)
V = [x, y]
F = [@expression(model, 1 - x), @expression(model, 1 - y)]
C = [@build_constraint(x + y == 1)]
@constraint(model, F ⟂ V, C)
The latter would be equivalent to @blegat's:
@constraint(model, [V, F, x + y] in MOI.VariationalInequality(2length(V), [MOI.EqualTo(1)]))
I must say I always found it a bit weird that the complementarity constraint implicitly uses the variable bounds
Okaaaay. So. The backstory for this is:
1. The PATH C API expects a function F(x), the Jacobian J(x), a vector of lower bounds lb, a vector of upper bounds ub, and a starting point z: https://github.com/chkwon/PATHSolver.jl/blob/7d6de732efd4db2f9b9f70e6d315d95f8e200f30/src/C_API.jl#L171-L175
2. GAMS lets you write the complementarity conditions as 0 <= F(x) ⟂ x >= 0, or F(x) ⟂ a <= x <= b, or in some cases, 0 <= F(x) ⟂ a <= x <= b. This can easily lead to situations which are not valid MCPs.
For (2), see, for example:
This design decision in GAMS is VeryBad™ because it is a silent cause of bugs.
Michael and I had some very long discussions, and decided that the one true way to write an MCP was what we currently have. Users cannot give bounds on F(x) or x in the complementarity condition, because experience has shown that they will get them wrong.
If we have AVIs, then I do see some reason to support:
model = Model(PATHSolver.Optimizer)
@variable(model, x[1:4], start = 0)
@constraint(model, F(x) ⟂ {a <= x <= b})
which would lower to [F(x), x] in MOI.VariationalInequality(2length(x), Interval.(a, b)).
But if we have MPECs, then it's a bit weird, because you could have both variable bounds and a rectangular VI???
What would I do in this situation?
model = Model()
@variable(model, -1 <= x <= 1, start = 0)
@objective(model, Max, x)
@constraint(model, 1 - x ⟂ {x >= 0})
Are the bounds of x [-1, 1], or [0, 1]?
But if we have MPECs, then it's a bit weird, because you could have both variable bounds and a rectangular VI???
What would I do in this situation?
Hard to follow your train of thought on that one without an example. I agree that VI constraints in an optimization problem usually require further clarification of the semantics.
Are the bounds of x [-1, 1], or [0, 1]?
Throw an error and enlighten the modeler about their mistake.
Why not support both, and throw an error if bounds on the variables are given twice (in both @variable and @constraint)?
Hard to follow your train of thought on that one without an example
I meant the example immediately following.
Why not support both and throw an error if bounds on the variables are given twice
We could do this.
How about:
model = Model()
@variable(model, x, start = 0)
@objective(model, Max, x)
@constraint(model, 1 - x ⟂ {x >= 0})
lower_bound(x) # 0? or an error?
set_lower_bound(x, 1) # Modifies the VI? Or adds a bound (and therefore throws an error)?
Interestingly, for me it is very clear what should happen. If we have:
model = Model()
@variable(model, x, start = 0)
@objective(model, Max, x)
@constraint(model, 1 - x ⟂ {x >= 0})
Then has_lower_bound(x) == false, and lower_bound(x) errors! Because x >= 0 is PART of the constraint, and there is no bound on the variable. In the same way that @constraint(model, x >= 0) would not affect the value returned by lower_bound(x).
And set_lower_bound(x, 1) modifies the variable bounds and NOT the VI, in the same way that set_lower_bound(x, 1) would not affect a @constraint(model, x >= 0).
There would be no way to modify that part of the complementarity constraint as of today.
However, we can add a new specific function for that or simply force users to delete and re-add.
Complementarity solvers probably won't have in-place modifications and nice warm starts, so the new function could be created if it is needed in the future.
My understanding probably comes from the fact that I am more of an MPEC than an MCP person.
About:
model = Model()
@variable(model, -1 <= x <= 1, start = 0)
@objective(model, Max, x)
@constraint(model, 1 - x ⟂ {x >= 0})
MCP solvers should throw an error, like @xhub said. They should probably even check for @constraint(model, x >= 0)-style errors as well.
But it is a perfectly valid MPEC.
In particular, I suffer from this in BilevelJuMP, as the implicit bounds on the variables mess things up. BilevelJuMP would greatly appreciate it if complementarity/VI-like constraints were defined in a single self-sufficient piece. This is because BilevelJuMP accepts MPECs. Hence, something like the above invalid MCP would be valid in BilevelJuMP.
Also, KNITRO only accepts 0 <= x ⟂ y >= 0; no other bounds, only variables:
https://www.artelys.com/app/docs/knitro/2_userGuide/complementarity.html
So KNITRO would be VectorOfVariables-in-VariationalInequality{Nonnegatives}.
An upcoming release of PATH will support polyhedral variational inequalities. I don't have anything actionable to do yet, but I had a chat with Michael Ferris yesterday, so this issue is to record my notes.
The new release solves the problem of finding $x$ such that $0 \in F(x) + N_C(x)$, where $F: R^n \rightarrow R^n$ is a nonlinear function and $N_C(x)$ is the normal cone of $C$ at $x \in C$.
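For reference, with the standard definition of the normal cone, this is the usual variational inequality:
$$N_C(x) = \{d \in R^n : d^\top (y - x) \le 0 \;\; \forall y \in C\},$$
so $0 \in F(x) + N_C(x)$ is equivalent to finding $x \in C$ such that $F(x)^\top (y - x) \ge 0$ for all $y \in C$.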
Currently, we use the Complements set, $F(x) \perp x$, which is interpreted as a rectangular variational inequality, where the set $C$ is given by the variable bounds on x. The new version will support $C = \{x : Ax \le b\}$.
Note that this is not the same as a rectangular variational inequality plus some inequality constraints, so this is not sufficient:
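That is, something like the following (the data here is just a placeholder for illustration) gives a rectangular VI plus an ordinary linear constraint, not the VI over C = {x : sum(x) <= 1}:
using JuMP
model = Model()
@variable(model, x[1:2] >= 0, start = 0)
# Rectangular VI: the set C is implicitly the box given by the variable bounds (x >= 0).
@constraint(model, [1 - x[1], 1 - x[2]] ⟂ x)
# An ordinary linear constraint; it does NOT become part of the VI's set C.
@constraint(model, sum(x) <= 1)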
We either need some way of saying that the linear constraints are actually part of the VI, or we need some entirely new syntax.
Still TBD, but I think PATH + GAMS will use this "tagging" approach of annotating the constraints that are really part of the VI, rather than ordinary constraints. We could do this with constraint attributes, or by lowering the constraint to A * x - b in VariationalInequality(Nonnegatives()).
The key thing is that we need three pieces of information: (1) the function F(x); (2) the variables x; and (3) the set C. MOI.Complements achieves 1 and 2, and 3 is left implicit. We chose the current design because it matches exactly the information that PATH expected at the time.
This has been discussed before: https://github.com/jump-dev/MathOptInterface.jl/issues/771#issuecomment-536349391
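For reference, a minimal MOI-level sketch of the current situation (this uses the existing API; F here is placeholder data): pieces 1 and 2 go into the function-in-Complements constraint, while piece 3 only exists as variable bounds:
import MathOptInterface as MOI
model = MOI.Utilities.Model{Float64}()
x = MOI.add_variables(model, 2)
# Piece 3 (the set C) is implicit: it is the box given by the variable bounds.
MOI.add_constraint.(model, x, MOI.Interval(0.0, 10.0))
# Pieces 1 and 2: the function F(x) and the variables x, stacked as [F(x); x].
F = [
    MOI.ScalarAffineFunction([MOI.ScalarAffineTerm(-1.0, x[1])], 1.0),  # 1 - x[1]
    MOI.ScalarAffineFunction([MOI.ScalarAffineTerm(-1.0, x[2])], 1.0),  # 1 - x[2]
]
f = MOI.Utilities.operate(vcat, Float64, F[1], F[2], x[1], x[2])
MOI.add_constraint(model, f, MOI.Complements(4))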
cc @xhub