MikaelSlevinsky opened this issue 8 years ago
On semi-infinite domains where the eigenfunctions decay algebraically at 0 and exponentially at infinity, the DESCM produces badly conditioned matrices: the condition number of the matrix D2 exceeds 10^16 quite rapidly. Since Julia's LAPACK wrappers only work in double precision, it makes sense that not many eigenvalues would converge. What is strange, though, is that the method is now doing worse than before.
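I don't have the DESCM matrices handy here, but the effect is easy to reproduce with any matrix whose condition number crosses 1/eps. A sketch in Python/NumPy for illustration (the Hilbert matrix is just a stand-in for a badly conditioned SPD matrix; it is not the DESCM D2):

```python
import numpy as np

# Hilbert matrix: a classic badly conditioned SPD matrix,
# standing in for D2 purely for illustration.
n = 13
H = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)

kappa = np.linalg.cond(H)
eigs = np.linalg.eigvalsh(H)

# Once kappa * eps > 1, the small eigenvalues carry no correct
# digits in double precision; for an SPD matrix they can even
# come out negative.
eps = np.finfo(float).eps
print(f"cond = {kappa:.2e}, cond * eps = {kappa * eps:.2e}")
print(f"smallest computed eigenvalue: {eigs[0]:.2e}")
```

So once cond(D2) passes 10^16, only the eigenvalues well above the noise floor cond * eps * ||D2|| are trustworthy, which is consistent with fewer of them converging.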
Can we use our own Arnoldi or Lanczos? It should be straightforward to use SincFun's SincMatrix to get a fast, data-sparse matrix-vector multiply.
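To show what I mean, here's a minimal Lanczos sketch (in Python/NumPy just for illustration; the function names and the matvec closure are mine, not SincFun's API). The point is that the iteration only touches the operator through a matrix-vector product, so a data-sparse fast multiply would plug in directly:

```python
import numpy as np

def lanczos(matvec, n, m, seed=None):
    """Lanczos with full reorthogonalization.

    matvec: function v -> A @ v (operator access only, so a
    data-sparse fast multiply plugs in directly).
    Returns (alpha, beta): the diagonal and off-diagonal of the
    symmetric tridiagonal matrix T = Q' A Q.
    """
    rng = np.random.default_rng(seed)
    Q = np.zeros((n, m + 1))
    alpha = np.zeros(m)
    beta = np.zeros(m)
    q = rng.standard_normal(n)
    Q[:, 0] = q / np.linalg.norm(q)
    for j in range(m):
        w = matvec(Q[:, j])
        alpha[j] = Q[:, j] @ w
        w -= alpha[j] * Q[:, j]
        if j > 0:
            w -= beta[j - 1] * Q[:, j - 1]
        # full reorthogonalization against all previous vectors
        w -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)
        beta[j] = np.linalg.norm(w)
        if beta[j] < 1e-14:          # breakdown: invariant subspace found
            return alpha[:j + 1], beta[:j]
        Q[:, j + 1] = w / beta[j]
    return alpha, beta[:m - 1]

# usage: extreme eigenvalues of a diagonal operator, via matvec only
d = np.linspace(1.0, 100.0, 100)
alpha, beta = lanczos(lambda v: d * v, 100, 60, seed=0)
T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
ritz = np.linalg.eigvalsh(T)     # extreme Ritz values converge first
```

The tridiagonal T at the end is exactly what a symmetric tridiagonal eigensolver (like SincFun's steig!) would then chew on.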
I tried coding my own generalized eigenvalue solver about a year ago and found it incredibly challenging. Eigenvalue solvers are very finicky; it's no easy task. But if you're up to it, that would certainly be useful!
SincFun has Lanczos and steig! (symmetric tridiagonal) methods. I could help by improving their performance, but you can only get so much out of eigvals.
I agree. If you can get your Lanczos algorithm to be as performant or better, we can definitely make the switch.
Prior to today, the infinity-norm of the error of the second-order differentiation matrix at the sinc points was:
Now I've updated the code to be essentially allocation-free, and the infinity-norm is also smaller:
The tests fail because it's not clear how many eigenvalues will be returned: only the ones below the threshold tolerance are kept. And somehow the results are worse, in that fewer eigenvalues are returned than before?
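One way to make the tests robust to a varying count: compare against trusted reference values, truncate at the first eigenvalue that misses the tolerance, and assert on the values rather than on how many came back. A hypothetical sketch in Python (converged_eigenvalues and the toy numbers are mine, not the package's API):

```python
import numpy as np

def converged_eigenvalues(eigs, reference, tol):
    """Keep the leading eigenvalues whose error against a trusted
    reference is below tol; how many survive may vary run to run."""
    eigs = np.sort(np.asarray(eigs, dtype=float))
    k = min(len(eigs), len(reference))
    err = np.abs(eigs[:k] - reference[:k])
    bad = np.nonzero(err >= tol)[0]   # indices of failures, if any
    cut = bad[0] if bad.size else k   # truncate at the first failure
    return eigs[:cut]

# toy data: the last two "computed" eigenvalues are polluted
reference = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
computed = reference + np.array([1e-12, 1e-12, 1e-12, 0.1, 0.1])
good = converged_eigenvalues(computed, reference, tol=1e-6)
# a test would now assert np.allclose(good, reference[:len(good)])
```

That way the test passes whether three or thirty eigenvalues converge, and the shrinking count shows up as a separate, explicit check rather than a hard failure.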