JuliaLang / julia

The Julia Programming Language
https://julialang.org/
MIT License

SymTridiagonal backslash with integer rhs #8857

Closed: alanedelman closed this issue 9 years ago

alanedelman commented 9 years ago

@andreasnoack As we discussed, T \ (1:n), for example, is currently a bug.
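For reference, a minimal sketch of the call in question (the matrix and sizes here are illustrative). At the time of this report the integer element types were not promoted and the solve failed; the intended behavior, which later Julia versions implement, is a floating-point solution:

```julia
using LinearAlgebra

# Integer-valued symmetric tridiagonal matrix and integer right-hand side.
n = 5
T = SymTridiagonal(fill(2, n), fill(-1, n - 1))
b = 1:n

# The call reported as broken: the element types should promote so that
# a Float64 solution is returned.
x = T \ b
```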

andreasnoack commented 9 years ago

@alanedelman As mentioned, LAPACK doesn't have a symmetric indefinite solver for tridiagonal matrices. Hence our solver for SymTridiagonal actually promotes to Tridiagonal. We have our own ldltfact, in which we could easily disable the check for positive definiteness and thereby save the extra storage and flops, at the cost of some loss of precision because there is no longer any pivoting.
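As an illustration (my own sketch, not the Base ldltfact code), an unpivoted LDLt solve for a symmetric tridiagonal system needs only one extra length-n vector, and it divides through a tiny pivot rather than avoiding it, which is the source of the precision loss mentioned above:

```julia
# Unpivoted LDLt solve of a symmetric tridiagonal system A x = b,
# where dv is the diagonal and ev the off-diagonal of A.
function ldlt_tridiag_solve(dv::Vector{Float64}, ev::Vector{Float64}, b::Vector{Float64})
    n = length(dv)
    d = similar(dv)          # diagonal of D
    l = similar(ev)          # subdiagonal of the unit lower bidiagonal L
    d[1] = dv[1]
    for i in 1:n-1
        l[i] = ev[i] / d[i]              # no pivoting: a zero d[i] breaks down
        d[i+1] = dv[i+1] - l[i] * ev[i]  # = dv[i+1] - l[i]^2 * d[i]
    end
    x = copy(b)
    for i in 2:n                         # forward solve: L y = b
        x[i] -= l[i-1] * x[i-1]
    end
    x ./= d                              # diagonal solve: D z = y
    for i in n-1:-1:1                    # back solve: L' x = z
        x[i] -= l[i] * x[i+1]
    end
    return x
end
```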

Below, I have plotted the mean 2-norm error of the solutions for each of the two methods, with 95% error bands. The elements of A and x are drawn independently from N(0,1), and then b is computed.

[Plot symtrierror: mean 2-norm solution error for the two methods, with 95% error bands]

The user has the option of promoting to Tridiagonal if she wants pivoting, so I think I'd go for no automatic promotion here to save the storage and flops.

jiahao commented 9 years ago

Some context would be nice

andreasnoack commented 9 years ago

The issue is that element type promotion doesn't work correctly for this method. It was never updated to follow the promotion pattern of the other \ methods, and we have two options for which factorization to use.

Right now, \(SymTridiagonal,VecOrMat) promotes the SymTridiagonal to a Tridiagonal and calls the LAPACK function directly if possible. The linear solver for Tridiagonal uses a pivoted LU factorization. This strategy therefore doubles the off-diagonal storage and also adds flops, which I haven't counted.
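In modern names (lu here; lufact at the time), the promotion strategy amounts to something like this sketch, which stores the off-diagonal twice and then does a pivoted LU solve:

```julia
using LinearAlgebra

dv = [2.0, 2.0, 2.0]     # diagonal of the SymTridiagonal
ev = [-1.0, -1.0]        # off-diagonal

# Promotion: build a Tridiagonal with the off-diagonal duplicated,
# doubling the off-diagonal storage relative to SymTridiagonal.
T = Tridiagonal(copy(ev), dv, copy(ev))

# Pivoted LU factorization of the Tridiagonal, then solve.
x = lu(T) \ [0.0, 0.0, 4.0]
```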

An alternative is to use an LDLt factorization on the SymTridiagonal, but there is no pivoted version of that factorization, at least not in LAPACK. The question is how much larger the error gets without pivoting. Hence the graph above. Is the larger error acceptable?
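The experiment behind the graph can be reproduced along these lines (modern names: ldlt rather than ldltfact; the seed and n are my own arbitrary choices):

```julia
using LinearAlgebra, Random

Random.seed!(0)                  # reproducibility; the seed is arbitrary
n = 1000
dv = randn(n)                    # N(0,1) elements, as in the experiment
ev = randn(n - 1)
S = SymTridiagonal(dv, ev)
x = randn(n)
b = S * x                        # b computed from a known solution x

x_ldlt = ldlt(S) \ b                                   # unpivoted LDLt
x_lu   = lu(Tridiagonal(copy(ev), dv, copy(ev))) \ b   # promote + pivoted LU

# Forward errors of the two methods; without pivoting the
# LDLt error is typically the larger of the two.
err_ldlt = norm(x_ldlt - x)
err_lu   = norm(x_lu - x)
```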