JuliaOpt / MathProgBase.jl

DEPRECATED: Solver-independent functions (i.e. linprog and mixintprog) and low-level interface for Mathematical Programming

Getting coefficients or constraint matrices for linear parts of nonlinear problems #50

Closed: tkelman closed this issue 8 years ago

tkelman commented 9 years ago

I can ask whether the objective function is linear, but is there any better way of easily getting the objective coefficients (ideally as a sparse vector, or equivalently the index and value vectors) than evaluating the gradient?
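For reference, the gradient-based workaround in question looks roughly like the sketch below. It assumes the solver already holds the AbstractNLPEvaluator `d` and the variable count `n` (both are passed to a nonlinear solver's loadproblem!); the helper name is hypothetical.

```julia
using MathProgBase

# Hypothetical helper: recover sparse coefficients of a *linear* objective
# by evaluating its (constant) gradient once at an arbitrary point.
function linear_objective_coefficients(d::MathProgBase.AbstractNLPEvaluator, n::Int)
    MathProgBase.initialize(d, [:Grad])
    @assert MathProgBase.isobjlinear(d)   # "I can ask whether the objective is linear"
    g = zeros(n)                          # dense scratch vector
    MathProgBase.eval_grad_f(d, g, zeros(n))
    idx, val = Int[], Float64[]
    for i in 1:n                          # zero-checking pass to sparsify
        if g[i] != 0.0
            push!(idx, i)
            push!(val, g[i])
        end
    end
    return idx, val
end
```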

And JuMP.prepConstrMatrix is a useful feature for providing the constraint matrix of the linear set of constraints, but it doesn't appear to be exposed through MathProgBase for nonlinear problems. Maybe this should be a JuMP issue: provide an implementation of buildInternalModel for nonlinear models that does what prepConstrMatrix does, so that MathProgBase.getconstrmatrix could be used for this?

mlubin commented 9 years ago

Given that it's a pretty straightforward transformation of the gradient/Jacobian to extract the linear parts, I'm not sure it's worth adding any extra complication to the MPB interface. This is also pretty standard; e.g., you can tell Ipopt and KNITRO that the objective/constraints are linear, and they will internally reconstruct the sparse linear form.
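The transformation being referred to, sketched under the same assumptions (`d` is the evaluator, `n` the variable count; the helper name is made up): evaluate the Jacobian once and keep only the entries belonging to rows flagged linear.

```julia
using MathProgBase

# Hypothetical helper: triplet (row, col, val) form of the linear rows of the
# constraint Jacobian. Note that eval_jac_g still computes the nonlinear
# entries too, which is exactly the cost objected to below.
function linear_constraint_triplets(d::MathProgBase.AbstractNLPEvaluator, n::Int)
    MathProgBase.initialize(d, [:Jac])
    rows, cols = MathProgBase.jac_structure(d)
    vals = zeros(length(rows))
    MathProgBase.eval_jac_g(d, vals, zeros(n))  # linear rows are constant in x
    keep = [MathProgBase.isconstrlinear(d, rows[k]) for k in eachindex(rows)]
    return rows[keep], cols[keep], vals[keep]
end
```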

tkelman commented 9 years ago

It's a great deal of extra allocation and zero-checking to write into a dense gradient vector just to get out what might only be a handful of nonzeros. And similarly for the constraint Jacobian: the nonlinear parts may be much more expensive to calculate than simply copying out the already-constructed components that are known to be linear.

At the moment I have little interest in connecting to any modeling frameworks aside from JuMP, but there don't seem to be any great ways of making a JuMP-specific solver. The architecture is forcing me to go through the limited API of MathProgBase and lose a lot of valuable information in the process.

mlubin commented 9 years ago

You could argue that if an AbstractNLPEvaluator says that the objective or a constraint is linear, then the corresponding expression it returns needs to be in a simplified form from which it's easy to build a sparse vector. Would that work?

tkelman commented 9 years ago

Sure, in the worst case I can walk the expression and hope/assume it's in a specific form, but code to do exactly this is clearly already written in JuMP. Is there any way of exposing what's already there to a solver instead of having to rewrite it myself?
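Such a walk might look like the following sketch. The assumed canonical form, +(*(coef, x[i]), ...), is exactly the "hope/assume" part; nothing in the interface guarantees it.

```julia
# Hypothetical walker: accumulate (index, coefficient) pairs from a linear
# expression of the assumed form :(2.0 * x[1] + -1.0 * x[3]).
function walk_linear!(idx::Vector{Int}, val::Vector{Float64}, ex)
    if isa(ex, Expr) && ex.head == :call && ex.args[1] == :+
        for term in ex.args[2:end]
            walk_linear!(idx, val, term)
        end
    elseif isa(ex, Expr) && ex.head == :call && ex.args[1] == :* &&
           length(ex.args) == 3 && isa(ex.args[2], Real) &&
           isa(ex.args[3], Expr) && ex.args[3].head == :ref
        push!(idx, ex.args[3].args[2])    # the i in x[i]
        push!(val, Float64(ex.args[2]))   # the coefficient
    elseif isa(ex, Real)
        # constant offset: irrelevant to the coefficient vector
    else
        error("expression not in the assumed linear form: $ex")
    end
end

idx, val = Int[], Float64[]
walk_linear!(idx, val, MathProgBase.obj_expr(d))
```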

mlubin commented 9 years ago

The MPB interface was designed for this; we don't really want JuMP-specific solvers any more than AMPL- or GAMS-specific solvers. Usually our litmus test for extending the MPB interface is whether the new function can be implemented by more than one solver/modeling interface. Right now the most likely next implementation of the expression graph interface will be with AMPL.jl. If it's easy to extract the linear constraint Jacobian and the sparse objective coefficient vector from AMPL, then it's probably reasonable to add these to the MPB interface. CC @dpo

mlubin commented 9 years ago

Also, JuMP can't natively provide a sparse objective vector. It needs to allocate a full dense vector in order to merge possible duplicate terms (though it's true that we can avoid most zero checking). I really can't see a pass through an n-element vector having a significant effect on performance in the context of solving a nonlinear optimization problem, especially when the whole problem is already being written out to XML.
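The dense merge pass being described, in sketch form (the helper name is hypothetical): duplicate terms such as x[1] + 2 * x[1] can only be combined reliably by accumulating into an n-element buffer.

```julia
# Hypothetical helper: merge duplicate (index, coefficient) terms through a
# dense n-vector, then read the nonzeros back out.
function merge_terms(idx::Vector{Int}, val::Vector{Float64}, n::Int)
    dense = zeros(n)
    for (i, v) in zip(idx, val)
        dense[i] += v                     # duplicates collapse here
    end
    nzidx, nzval = Int[], Float64[]
    for i in 1:n
        if dense[i] != 0.0
            push!(nzidx, i)
            push!(nzval, dense[i])
        end
    end
    return nzidx, nzval
end
```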

tkelman commented 9 years ago

I don't expect the objective by itself to be a major cost either, but converting from an expression tree to a sparse vector is more or less the same transformation for a linear objective as for every linear constraint (of which there might be many). In order to interact with LP solvers, I imagine most modeling frameworks will have fast paths for representing linear terms this way. Asking for a round trip from the internal representation to an expression tree and back to a sparse form seems redundant and unnecessary.

mlubin commented 9 years ago

In order to interact with LP solvers, I imagine most modeling frameworks will have fast paths for representing linear terms this way.

MPB does have a fast path for interacting with LP solvers, through the loadproblem! interface. For MINLP solvers, however, it's not really clear to me that sending everything through the expression interface (or making a single call to evaluate the Jacobian) will have an impact on performance. Unless I'm completely off in my intuition, the work required for this should be bounded by the work needed to solve a couple of branch-and-bound nodes.
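That fast path, for concreteness (a sketch; the solver choice is illustrative, and the exact constructor names varied across MPB versions):

```julia
using MathProgBase
using Clp   # illustrative solver choice

# min  x1 + x2   s.t.  x1 + 2 x2 <= 4,  3 x1 + x2 <= 6,  x >= 0
m = MathProgBase.LinearQuadraticModel(ClpSolver())
A = [1.0 2.0; 3.0 1.0]
MathProgBase.loadproblem!(m, A,
    [0.0, 0.0], [Inf, Inf],     # variable bounds l, u
    [1.0, 1.0],                 # objective coefficients c
    [-Inf, -Inf], [4.0, 6.0],   # row bounds lb, ub
    :Min)
MathProgBase.optimize!(m)
sol = MathProgBase.getsolution(m)
```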

tkelman commented 9 years ago

JuMP's current implementation of MathProgBase.initialize(d, [:ExprGraph]) does have a noticeable compilation delay (some or all of which might go away if JuMP were precompiled; it's hard to tell at this point). Unless it's required by the current implementation of the *_expr functions, an expression-graph-only interface should hypothetically be able to avoid evaluating and compiling all the Julia-side autodiff and callback functions, which I know I won't need. So I'd like to avoid asking ReverseDiffSparse to do any work creating the nonlinear parts of the Jacobian that I plan on throwing away.

If we were doing more work in CSR, it might be a bit more natural to ask for only a subset of rows of the Jacobian. As things stand, I'll go forward with writing the inverse of affToExpr and see whether the intermediate data structures are actually worth worrying about.
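The expression-graph-only path referred to above, in outline (assuming `d` and the constraint count `numCon` from loadproblem! are in scope):

```julia
using MathProgBase

MathProgBase.initialize(d, [:ExprGraph])    # request only the expression graphs
fexpr  = MathProgBase.obj_expr(d)           # e.g. :(x[1] + 2.0 * x[2])
gexprs = [MathProgBase.constr_expr(d, i) for i in 1:numCon]
# each constraint expression carries its own comparison, e.g. :(x[1] ^ 2 <= 1.0)
```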

mlubin commented 9 years ago

You're right; right now we're still preparing the Jacobian and Hessian even when only :ExprGraph is requested. No reason for this other than laziness (https://github.com/mlubin/ReverseDiffSparse.jl/issues/9). We do need some compilation/evaluation step even when just building the expression graphs, because we have to expand out sum{}, prod{}, and constants depending on the input data.

mlubin commented 8 years ago

You can now access the ExprGraph from JuMP without extra overhead. Closing this issue for the reasons already discussed.