jump-dev / MatrixOptInterface.jl

An interface to pass matrix form problems
MIT License

Add basic presolve into MatOI #5

Open frapac opened 4 years ago

frapac commented 4 years ago

I open this issue to discuss a longer-term subject: I wonder if we could include basic presolve operations in MatOI? I think it would be easier to develop that inside this package rather than in MOI, but I may be wrong. The idea would be to share the effort and get good presolve capabilities for LP solvers written in pure Julia, like Tulip.jl or Simplex.jl.

An idea would be to adapt already-existing code, like

I think "Presolving in linear programming" by E. Andersen and K. Andersen is also a good paper to draw inspiration from.
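To make the idea concrete, here is a minimal sketch of one classic presolve reduction from that literature: eliminating a variable that is fixed by its bounds (l[j] == u[j]) from an LP stored in matrix form, min c'x s.t. L ≤ Ax ≤ U, l ≤ x ≤ u. The `LPData` struct and function name are illustrative assumptions, not MatOI's actual API:

```julia
# Hypothetical matrix-form LP container (not MatOI's real types).
struct LPData
    c::Vector{Float64}   # objective coefficients
    A::Matrix{Float64}   # constraint matrix
    L::Vector{Float64}   # row lower bounds
    U::Vector{Float64}   # row upper bounds
    l::Vector{Float64}   # variable lower bounds
    u::Vector{Float64}   # variable upper bounds
end

# Remove column j when x[j] is fixed at l[j] == u[j]: shift its
# contribution into the row bounds, and return the constant it
# contributes to the objective so the solution can be postsolved.
function eliminate_fixed_variable(lp::LPData, j::Int)
    @assert lp.l[j] == lp.u[j]
    xj = lp.l[j]
    keep = setdiff(1:length(lp.c), j)
    obj_offset = lp.c[j] * xj
    aj = lp.A[:, j]
    reduced = LPData(
        lp.c[keep], lp.A[:, keep],
        lp.L .- aj .* xj, lp.U .- aj .* xj,
        lp.l[keep], lp.u[keep],
    )
    return reduced, obj_offset
end
```

A full presolve would chain many such reductions (empty rows, singleton rows, forcing constraints, …) and keep a postsolve stack to recover the original solution, but each reduction is as mechanical as this one.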

mtanneau commented 4 years ago

This is something I wanted to mention in my JuMP-dev talk (which will take place during the August jump-dev call). :)

TBH, I am not 100% sure that the benefits of shared efforts would outweigh the hurdles of additional dependencies and tight integration.

As far as LP is concerned, my experience with presolve --and the reference you shared is a good example-- is that 99% of the work dates back to the 90s. The one notable exception might be https://github.com/lgottwald/PaPILO, which is mostly about making the whole thing parallel. I found that the main complexity in implementing a presolve procedure was to match the paper's notation to whatever form & data structures I was using.

odow commented 4 years ago

I have been wanting something like a "model bridge" for a while: an optimizer which wraps another solver and applies transformations to the model before passing it to the inner solver. We already have some examples

This presolve can live by itself as a separate package; we don't need to fold it into MatrixOptInterface. I think the plan is for MatrixOptInterface to join MOI as a subpackage anyway.

frapac commented 4 years ago

Thanks for your feedback, Mathieu! I am looking forward to your presentation then; that would be the occasion to push this discussion further :) In my opinion, Oscar's idea is very nice. Good presolve accounts for part of AMPL's performance, and it would be great to have an equivalent in JuMP's ecosystem. In the long term, we could even plan on integrating automatic model relaxations for non-convex and MINLP problems. I think there is a lot of potential for MOI's bridges here (à la Alpine, but at the MOI level).

mtanneau commented 4 years ago

If we're thinking larger than LP, I think the class of problems that should be targeted is linear/quadratic conic optimization problems, which can be written as

min   x'Qx + c'x
s.t.  L ⪯ A x ⪯ U
      l ⪯   x ⪯ u
      xj integer, ∀j ∈ J

where each ⪯ is a conic inequality.

AFAIK conic problems are the largest class of problems that can be written in a matrix-oriented form as above (and for which there is a nice duality theory). It is also something that could be used by, e.g., Convex, Dualization, Pajarito.
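A sketch of what a container for that matrix form might look like, assuming scalar row/variable bounds for simplicity (the type name and fields are hypothetical, not MatOI's actual API; a real implementation would also attach a cone to each ⪯):

```julia
using SparseArrays

# Illustrative container for
#   min  x'Qx + c'x
#   s.t. L ⪯ A x ⪯ U,  l ⪯ x ⪯ u,  x[j] integer ∀ j ∈ J
struct ConicQPForm{T}
    Q::SparseMatrixCSC{T,Int}    # quadratic objective term (zero for an LP)
    c::Vector{T}                 # linear objective term
    A::SparseMatrixCSC{T,Int}    # constraint matrix
    L::Vector{T}                 # row lower bounds
    U::Vector{T}                 # row upper bounds
    l::Vector{T}                 # variable lower bounds
    u::Vector{T}                 # variable upper bounds
    integer_indices::Vector{Int} # the index set J
end

# An LP is the special case where Q is zero and J is empty:
n = 2
lp = ConicQPForm(
    spzeros(n, n), [1.0, 1.0],
    sparse([1.0 2.0; 3.0 4.0]),
    [-Inf, 0.0], [4.0, 6.0],
    zeros(n), fill(Inf, n), Int[],
)
```

One form like this could then feed Convex, Dualization, or Pajarito without each package reinventing its own matrix representation.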

matbesancon commented 3 years ago

it seems presolve efforts for LPs have gone into https://github.com/mtanneau/MathOptPresolve.jl/; we should avoid redundancy unless presolve here has a different scope

frapac commented 3 years ago

I agree. And implementing a thin wrapper to MatOI on top of MathOptPresolve should be doable in a few lines of code.

matbesancon commented 3 years ago

Closing this then?