Open andreasnoack opened 6 years ago
As mentioned in #26410, there is some overlap there. In the interest of a uniform API across the sparse and dense cases, it would be good to adapt the dense case to match the sparse one.
In the SPQR algorithm, `tol` is needed during the factorization and cannot be avoided. In the LAPACK algorithm with column re-ordering, it is possible to postpone the evaluation of the rank and singularity until after the factorization. I don't think, though, that `ldiv!` is the correct place to do that, because `qrf \ rhs` does not allow specifying a third parameter.
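For context, the asymmetry looks roughly like this in current spellings (a sketch: `qr` has since replaced `qrfact`, and `ColumnNorm()` is the Julia ≥ 1.7 way to request column pivoting):

```julia
using LinearAlgebra, SparseArrays

A = [1.0 2.0; 2.0 4.0; 3.0 6.0]   # rank-deficient example
b = [1.0, 2.0, 3.0]

# Sparse: the tolerance is consumed at factorization time by SPQR.
Fs = qr(sparse(A); tol = 1e-8)

# Dense: the factorization takes no tolerance; the rank cutoff is applied
# later, inside the least-squares solve, with no way to pass one through `\`.
Fd = qr(A, ColumnNorm())
x  = Fd \ b
```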
My approach would be to change the meaning of `tol` to be a relative tolerance (which would internally be multiplied by an appropriate measure of the size of the matrix, so that it scales well). In several places (`pinv`, `rank`) it is a relative tolerance; for `cholfact` it is absolute and passed down to LAPACK. In the latter case, moreover, the meaning of `tol` is not yet fully documented.
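A minimal sketch of what such a relative tolerance could look like on the dense side (the helper name `qr_rank` and the default scaling by `maximum(size(A))` are illustrative assumptions, not a proposed API):

```julia
using LinearAlgebra

# Sketch of a relative-tolerance rank cutoff for dense pivoted QR.
# With column pivoting, |diag(R)| is non-increasing, so comparing each
# diagonal entry against rtol * |R[1,1]| gives a scale-invariant rank.
function qr_rank(A::AbstractMatrix;
                 rtol::Real = eps(float(real(eltype(A)))) * maximum(size(A)))
    F = qr(A, ColumnNorm())          # dense pivoted QR (Julia >= 1.7 spelling)
    d = abs.(diag(F.R))
    isempty(d) && return 0
    return count(>(rtol * first(d)), d)
end

A = [1.0 2.0; 2.0 4.0; 3.0 6.0]      # second column is 2x the first
qr_rank(A)                           # numerical rank is 1, and it is
qr_rank(1e6 .* A)                    # unchanged after rescaling the matrix
```

Because the cutoff is relative to the largest diagonal entry of `R`, rescaling the whole matrix does not change the computed rank, which is the scaling behavior an absolute `tol` lacks.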
I'm with @KlausC on this. In my code I need a rank-revealing QR of a matrix that could be sparse or dense (not for solving a linear system; I need information from Q and R).
Continues the discussion in https://github.com/JuliaLang/julia/pull/26392. Currently, there is a difference between when the rank is determined in sparse and dense pivoted QR. For our LAPACK-based dense pivoted QR, the rank is not determined as part of the factorization but as part of the least-squares solver. In contrast, the SuiteSparse-based sparse pivoted QR determines the rank as part of the factorization. As a consequence, `qrfact(SparseMatrixCSC)` takes a (currently undocumented, see https://github.com/JuliaLang/julia/issues/26592) `tol` argument, whereas the tolerance in the dense case is specified in `ldiv!` (also undocumented). Hence, the size of `F.R` will depend on the rank in the sparse case but not in the dense case, where `ldiv!` does the truncation of `R`.

The question is whether we should move the rank determination to the factorization step in the dense case as well. This would also be consistent with pivoted Cholesky (i.e., LAPACK is inconsistent here). In practice, it will require a minor, non-disruptive modification of the `QRPivoted` structure, but it will probably simplify `ldiv!` a bit as well.
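A sketch of what moving the rank determination into the factorization could look like on the dense side. The wrapper type, field names, and truncation rule below are illustrative assumptions, not the proposed `QRPivoted` change itself:

```julia
using LinearAlgebra

# Hypothetical wrapper: compute the numerical rank once, at factorization
# time, so later solves can reuse it instead of re-deciding inside ldiv!.
struct RankedQR{F}
    qr::F      # the underlying QRPivoted factorization
    rank::Int  # numerical rank decided at factorization time
end

function ranked_qr(A::AbstractMatrix; rtol::Real = 1e-12)
    F = qr(A, ColumnNorm())
    d = abs.(diag(F.R))
    r = isempty(d) ? 0 : count(>(rtol * first(d)), d)
    return RankedQR(F, r)
end

# Basic (truncated) least-squares solve reusing the stored rank.
function lssolve(RQ::RankedQR, b::AbstractVector)
    F, r = RQ.qr, RQ.rank
    y = (F.Q' * b)[1:r]                        # project b onto the range
    z = zeros(eltype(y), size(F.R, 2))
    z[1:r] = UpperTriangular(F.R[1:r, 1:r]) \ y  # solve the truncated system
    x = similar(z)
    x[F.p] = z                                 # undo the column permutation
    return x
end

A = [1.0 2.0; 2.0 4.0; 3.0 6.0]
b = [1.0, 2.0, 3.0]                  # b lies in the column space of A
x = lssolve(ranked_qr(A), b)         # so A * x should reproduce b
```

The point of the design is visible in `lssolve`: it never touches a tolerance, because the truncation decision was already made when the factorization was built.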