JuliaSmoothOptimizers / QuadraticModels.jl

Data structures for linear and quadratic optimization problems based on NLPModels.jl

Using a structured `Q` matrix? #76

Closed mipals closed 5 months ago

mipals commented 2 years ago

Hello there,

I'm currently trying to solve a QP where the Q matrix is block-structured and the blocks can be inverted in linear complexity, meaning that I can invert Q easily (although it's a little cumbersome). Is there a way to exploit this when solving the QP with this package? If not, do you know of any other package where this can be achieved?

Sorry for opening an issue, but given the missing documentation it is hard to tell whether this is possible.

Cheers, Mikkel

dpo commented 2 years ago

Unfortunately, a factorization of Q alone doesn't help much to compute a step in an interior-point method. Do you mean that Q is block diagonal? Does the constraint matrix also have a specific structure? If they both do, there may be a specialized factorization that we could use. It's the case, e.g., in dynamic programming where the constraints have a "staircase" structure.

dpo commented 2 years ago

On further thought: if you can provide an efficient operator to solve (Q + ρI) x = b, you could use an iterative method with the "K1" formulation (the normal equations).

@geoffroyleconte Is it currently possible for the user to pass in the operator above?

geoffroyleconte commented 2 years ago

You can create something like this, with `Q` as a `LinearOperator`; note that it should be the `Q` matrix of the QP itself, not its inverse.

```julia
using QuadraticModels, LinearOperators

Q = opOnes(3, 3)   # LinearOperator standing in for the Hessian
c = rand(3)
A = opOnes(2, 3)   # LinearOperator standing in for the constraint matrix
b = rand(2)
lvar = zeros(3)
QuadraticModel(c, Q; A=A, lcon=b, ucon=b, lvar=lvar)
```

However, I think you need to be able to solve (Q + D + ρI) z = b for z (where D = SX⁻¹) to use K1, no?

dpo commented 2 years ago

> However, I think you need to be able to solve (Q + D + ρI) z = b for z (where D = SX⁻¹) to use K1, no?

Yes, sorry, I should have said a procedure to solve (Q + D) x = b, where D is diagonal (and changes at each iteration). It means re-factorizing Q + D at each iteration.
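For context, the K1 (normal equations) system being discussed has the standard primal-dual interior-point form (standard notation, not specific to this package), with $\Delta y$ the dual step and $r$ the corresponding residual:

$$ A\,(Q + D + \rho I)^{-1} A^\top \Delta y = r, \qquad D = S X^{-1} \text{ diagonal}, $$

so each iteration requires solves with $Q + D + \rho I$, where $D$ changes at every iteration.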

mipals commented 2 years ago

Thanks for the inputs!

My Q has the following form

$$ Q = \begin{bmatrix} K(K + n\lambda I) & K^\top H \\ H^\top K & H^\top H \end{bmatrix}, \quad K\in\mathbb{R}^{n\times n},\ H\in\mathbb{R}^{n\times p},\ p \ll n $$

Here $K$ and $K + n\lambda I$ are structured such that they can be factored in $O(p^2n)$. My idea was to compute the inverse of $Q$ using a Schur-complement. Unfortunately I do not think things are as easily done when a diagonal is added.

Regarding the LinearOperators suggestion: do I understand correctly that I would gain something if I made a LinearOperator version of $Q$ with a multiplication that scales as $O(p^2n)$? (This would probably be very easy to do.)

As for the constraints, they look something like this:

$$ \alpha D_r\begin{bmatrix}K & H\end{bmatrix}x \geq 0, \quad \text{where } \alpha \in \{-1,1\},\ r\in\{0,1,2\} $$

where $D_r$ are tri-/bi-/diagonal matrices representing finite differences (the constraint represents bounds on derivatives).

dpo commented 2 years ago

Yes, if you can implement an efficient matrix-vector product and wrap it as a LinearOperator, then you can try solving the problem via an iterative method.
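A minimal sketch of what such a wrapper could look like for the $Q$ above. The sizes and the dense stand-ins for $K$ and $H$ are hypothetical (in practice they would be your structured matrices with fast multiplies), and LinearOperators.jl's in-place product interface `prod!(res, v, α, β)` is assumed:

```julia
using LinearAlgebra, LinearOperators

# Hypothetical sizes; K and H are dense stand-ins for structured matrices.
n, p, λ = 5, 2, 0.1
K = let M = randn(n, n)
    Symmetric(M * M')            # symmetric PSD stand-in for K
end
H = randn(n, p)

# In-place product with Q = [K(K + nλI)  K'H; H'K  H'H],
# computed using only products with K and H, never forming Q.
function Qprod!(res, v, α, β)
    x1 = view(v, 1:n)
    x2 = view(v, n+1:n+p)
    Kx1 = K * x1
    Hx2 = H * x2
    top = K * (Kx1 .+ n .* λ .* x1) .+ K * Hx2
    bot = H' * Kx1 .+ H' * Hx2
    if β == 0
        res[1:n] .= α .* top
        res[n+1:n+p] .= α .* bot
    else
        res[1:n] .= α .* top .+ β .* view(res, 1:n)
        res[n+1:n+p] .= α .* bot .+ β .* view(res, n+1:n+p)
    end
    return res
end

# Q is symmetric, so a single product function suffices.
Qop = LinearOperator(Float64, n + p, n + p, true, true, Qprod!)
```

`Qop * v` then behaves like the dense $Q$ applied to `v`; the same pattern applies to wrapping the constraint operator $\alpha D_r [K \ H]$.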

Looking at your matrix, I'm not sure I understand the (1,1) block. Is $K$ symmetric and positive semi-definite?

mipals commented 2 years ago

Perfect, I will try that out then. Is it also possible to wrap the constraints in a LinearOperator?

Ahh yes, $K$ is both symmetric and positive semidefinite.

dpo commented 2 years ago

> Perfect, I will try that out then. Is it also possible to wrap the constraints in a LinearOperator?

It's possible indeed. But note that this package only provides modeling facilities, not solvers. You can try RipQP to solve your problem (you won't be able to use commercial libraries like CPLEX or Gurobi; they don't support linear operators).
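A minimal sketch of that route (the tiny stand-in problem is hypothetical; with purely operator-based inputs, RipQP may additionally require selecting one of its iterative/matrix-free solver configurations via its solver parameters):

```julia
using QuadraticModels, LinearOperators, RipQP

# Tiny stand-in data; replace Q and A with your structured operators.
Q = opOnes(3, 3)
c = rand(3)
A = opOnes(2, 3)
b = rand(2)
qm = QuadraticModel(c, Q; A=A, lcon=b, ucon=b, lvar=zeros(3))

stats = ripqp(qm)       # returns a GenericExecutionStats
println(stats.status)
```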