Closed by @stevengj 1 year ago
Hi Prof. @stevengj, sorry for the late response. I have one concern on which I would like your opinion. The eigensolver I have implemented has one caveat: when zero is an eigenvalue of high multiplicity, the "bulge" of the bulge-chasing algorithm may itself become zero. In that case, convergence to the real Schur form is no longer guaranteed, because the iterations stop updating the tail of the subdiagonal.
There are two main options:

- Create two solvers: the classical `eigen` function falls back to the LAPACK solver, which is more robust but slower, and a new function `fasteigen` calls the fast solver. This is the safe solution (see the sketch after this list).
- I keep things as they are for now and try to find a solution to this problem over the summer. (Unfortunately, my master's thesis and my university courses will prevent me from solving it sooner.) Meanwhile, I would recommend in the documentation not to use `SkewHermitian` matrices when the rank of the matrix is much lower than its dimension.
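For illustration, a rough sketch of what the first option could look like; the name `robust_eigvals` and the Hermitian transformation used for the LAPACK fallback are assumptions of this sketch, not the package's actual API:

```julia
# Sketch of option 1 (illustrative only; the names are not the package's API).
using LinearAlgebra

# Robust path: for a skew-Hermitian A, the matrix i*A is Hermitian, so LAPACK's
# Hermitian eigensolver applies and the eigenvalues map back through -i.
robust_eigvals(A::AbstractMatrix) = -im .* eigvals(Hermitian(im .* Matrix(A)))

# Fast path: `fasteigen` would simply forward to the package's bulge-chasing solver.
```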
I wondered what your opinion on this was.
Best regards
@dkarrasch and @andreasnoack, did you deal with a similar case in GenericLinearAlgebra.jl? It sounds like the sort of case that should be straightforward; ideally the bulge-chasing algorithm should be invariant to shifts A + µI, after all.
I'm not sure I follow how a repeated zero eigenvalue could cause a zero bulge. Is that specific to how you do QR iterations for skew-symmetric matrices? The bulge is usually created from the off-diagonal element. If there is an isolated zero towards the end of the diagonal, it will cause a zero shift, but there should still be a bulge to be chased. The shift is just for acceleration.

Is it because you are hitting one of the cases where the QR iteration leaves the matrix unchanged? See e.g. https://link.springer.com/article/10.1007/BF01385629. If that is the case, then you might need to introduce exceptional shifts. You can see my way of dealing with it in https://github.com/JuliaLinearAlgebra/GenericLinearAlgebra.jl/blob/10b5265b75e123f236ee50b03c845c642462c30e/src/eigenGeneral.jl#L141-L145.
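To make the exceptional-shift idea concrete, here is a minimal sketch of the technique (an illustration only, not the GenericLinearAlgebra.jl code linked above; the function name and the threshold are made up): when the iteration has gone too long without deflating anything, the usual shift is perturbed to break out of a fixed point.

```julia
# Sketch of an exceptional-shift strategy (illustrative only). H is the
# Hessenberg/tridiagonal matrix being iterated on; `stalled` counts the
# iterations performed since the last deflation.
function choose_shift(H::AbstractMatrix, stalled::Integer; every::Integer = 10)
    n = size(H, 1)
    σ = H[n, n]                          # ordinary (Rayleigh-quotient) shift
    if stalled > 0 && stalled % every == 0
        σ += abs(H[n, n - 1]) * rand()   # exceptional shift: perturb to escape stagnation
    end
    return σ
end
```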
```julia
julia> v = [1, 1, 1, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 1, 1, 1]

julia> A = SkewHermTridiagonal(v)  # I build a tridiagonal skew-symmetric matrix, v is the subdiagonal

julia> eigvals(A)
ERROR: Maximum number of iterations reached, the algorithm didn't converge
```
For tridiagonal skew-symmetric matrices, the bulge reduces to a single number. This makes the algorithm fast, but the risk is that if there is a long run of zeros on the subdiagonal, the bulge can become zero itself. As a consequence, the tail of the subdiagonal is left unchanged and the algorithm stops converging. I think a solution to that problem may be to re-initialize a bulge where it disappeared, in order to update the tail.
This is without a doubt a corner case: one would have to want to solve exactly this kind of problem to run into it. If the zeros are spread out rather than grouped together, there is no problem. For example, the following case works fine because the zeros are evenly distributed:
```julia
julia> v = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1]

julia> eigvals(SkewHermTridiagonal(v))
```
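One standard remedy for exact zeros on the subdiagonal is deflation: split the tridiagonal problem into independent blocks at each zero entry and run the iterations on each block separately. A minimal sketch of that splitting (the helper below is illustrative and not part of SkewLinearAlgebra.jl):

```julia
# Illustrative deflation helper (not part of SkewLinearAlgebra.jl): a zero
# subdiagonal entry v[i] decouples rows/columns 1:i from i+1:n, so the
# eigenproblem splits into independent tridiagonal blocks.
function split_at_zeros(v::AbstractVector)
    blocks = UnitRange{Int}[]
    start = 1
    for i in eachindex(v)
        if iszero(v[i])
            push!(blocks, start:i)          # block ending at row/column i
            start = i + 1
        end
    end
    push!(blocks, start:length(v) + 1)      # trailing block
    return blocks
end

split_at_zeros([1, 1, 1, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 1, 1, 1])
# 11 blocks: 1:4, 5:5, 6:6, 7:7, 8:8, 9:11, 12:12, 13:13, 14:14, 15:15, 16:19
```

Applied to the failing example above, the subdiagonal splits into blocks with no interior zeros, each of which can then be iterated on independently.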
@smataigne, it would be good to register this package … what should we do here?
I (finally) did it. The PR is waiting to be merged.
https://github.com/JuliaRegistries/General/pull/94115
What about the issue discussed above with repeated zero eigenvalues — is that still a problem?
Note that according to https://github.com/JuliaLinearAlgebra/SkewLinearAlgebra.jl/pull/125, the tests are failing with a `StackOverflowError` on Julia nightly (issue #126).
Update: fixed by #127. I also updated the registry PR so that the version with this fix will be registered.
> What about the issue discussed above with repeated zero eigenvalues — is that still a problem?
The problem still remains. However, I think it is okay to live with it, for at least three reasons. The first is that this problem only happens for a corner case of tridiagonal matrices, and I do not expect users to use this structure by themselves given the very few applications it has. The second is that dealing with this corner case is not trivial and would introduce a slowdown for all other matrices. The third is that, for this corner case, the algorithm throws an error (rather than silently giving a wrong answer), which is good news in some way. If the user really wants to solve this corner problem, using LinearAlgebra remains possible.
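For completeness, a sketch of that fallback on the failing example above (this just materializes the dense matrix, which any AbstractMatrix supports, and calls the generic dense solver, giving up the speed advantage):

```julia
# Fallback sketch: bypass the fast solver by converting to a dense matrix and
# using the generic LinearAlgebra/LAPACK path (slower, but handles this case).
using LinearAlgebra, SkewLinearAlgebra

v = [1, 1, 1, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 1, 1, 1]
A = SkewHermTridiagonal(v)
eigvals(Matrix(A))   # purely imaginary eigenvalues (up to rounding) from the dense solver
```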
And, to be clear, we're talking about #118 here, right?
Is there anything that is blocking registering this officially? #118?