SuperFluffy opened this issue 8 years ago

Do you welcome PRs implementing more linear algebra procedures which might not necessarily be based on LAPACK?

I need to write a (modified) Gram-Schmidt procedure for one of my problems. I was quite surprised that LAPACK doesn't provide anything for that itself. Probably because it's too simple... So a pure Rust implementation it is. :)
I don't think there's anything wrong with algorithms that aren't direct translations of LAPACK driver or computational routines. I think they would probably require a higher bar for testing, though. Most of the existing testing is essentially at the interface level, since LAPACK is doing the heavy lifting. A novel implementation would inherently be treated more skeptically.
In this case, though, LAPACK does provide QR decomposition, via `?geqrf`, for m x n matrices. The direct output is opaque, but you can use `?orgqr` to recover the Q matrix, which is precisely the GSO.
So, if you're just talking about GSO, I would prefer an implementation to go that route.
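For concreteness, here's a rough sketch of what that route could look like using the low-level `lapack` crate bindings directly (assuming its `dgeqrf`/`dorgqr` wrappers follow the usual Fortran argument order and a column-major buffer; this is illustrative only, not linxal code):

```rust
use lapack::{dgeqrf, dorgqr};

/// Overwrite the column-major m x n buffer `a` (m >= n) with the thin
/// Q factor of its QR decomposition, i.e. the Gram-Schmidt basis
/// (up to signs of the columns).
fn thin_q(a: &mut [f64], m: i32, n: i32) {
    let k = n.min(m);
    let mut tau = vec![0.0; k as usize];
    let mut info = 0;

    // Workspace query: lwork = -1 asks LAPACK for the optimal size.
    let mut work = vec![0.0; 1];
    unsafe { dgeqrf(m, n, a, m, &mut tau, &mut work, -1, &mut info) };
    let lwork = work[0] as i32;
    let mut work = vec![0.0; lwork as usize];

    // Factor A = QR: R ends up in the upper triangle, Q is stored
    // implicitly as Householder reflectors plus `tau`.
    unsafe { dgeqrf(m, n, a, m, &mut tau, &mut work, lwork, &mut info) };
    assert_eq!(info, 0);

    // Expand the reflectors into the explicit (thin) Q factor;
    // reusing the workspace from the query above is fine for a sketch.
    unsafe { dorgqr(m, n, k, a, m, &tau, &mut work, lwork, &mut info) };
    assert_eq!(info, 0);
}
```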
I should also mention that I do want factorizations (QR, LQ, Cholesky) to be exposed in a user-friendly way in `linxal`. So, if you don't mind waiting, I will probably push code in the next couple of days that will at least make QR easy to use.
Very valid points. If I am not mistaken, though, LAPACK uses Householder reflections instead of Gram-Schmidt, doesn't it?
The reason I was looking into modified Gram-Schmidt is that I get the normalization factors of the orthogonal vectors as a side effect of the reorthogonalization procedure. With `?geqrf` and `?orgqr` this route would take a few extra steps.
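To illustrate the point, here is a minimal sketch of modified Gram-Schmidt on the rows of an `ndarray` matrix, with the norms returned as the side effect mentioned above (the function name and signature are made up for this example; it assumes real-valued, full-rank input and does no error handling):

```rust
use ndarray::{Array1, Array2, Axis};

/// Orthonormalize the rows of `a` with modified Gram-Schmidt.
/// Returns the orthonormal basis and the norms of the orthogonalized
/// vectors, which fall out of the procedure for free.
fn modified_gram_schmidt(a: &Array2<f64>) -> (Array2<f64>, Array1<f64>) {
    let mut q = a.clone();
    let (n, _) = q.dim();
    let mut norms = Array1::<f64>::zeros(n);

    for i in 0..n {
        // Normalize the current row; its norm is the "free" by-product.
        let norm = q.row(i).dot(&q.row(i)).sqrt();
        norms[i] = norm;
        q.row_mut(i).mapv_inplace(|x| x / norm);

        // Project the already-orthonormalized row i out of all later
        // rows; updating against the *current* rows (rather than the
        // original ones) is what makes this the "modified" variant.
        let (head, mut tail) = q.view_mut().split_at(Axis(0), i + 1);
        let qi = head.row(i);
        for mut row in tail.outer_iter_mut() {
            let proj = row.dot(&qi);
            row.zip_mut_with(&qi, |r, &qv| *r -= proj * qv);
        }
    }
    (q, norms)
}
```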
Ah I see. I think you're right; I don't think you can recover those normalization constants without significant extra work.
In that case, I'd be happy to accept a modified GSO implementation.
Hey! What's your opinion on this? https://github.com/SuperFluffy/gram_schmidt
It's a pretty straightforward implementation, and I have been using it in production for a while now. It is of course missing some documentation.
In terms of implementation style I have tried to follow what you are doing, although I am not sure about the structure of, e.g., `impl SVD for $impl_type` instead of implementing these traits on `ArrayBase` with different elements (since, after all, the functions are not supposed to act on the primitives but on the matrices). See issue #5 for that.
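For reference, this is roughly the contrast I mean; the traits and bounds below are invented for illustration and are not linxal's actual definitions:

```rust
use ndarray::{Array2, ArrayBase, Data, Ix2};

// (a) The `impl SVD for $impl_type` style: the trait lives on the scalar
// type (usually generated per type by a macro) and the matrix is passed in.
trait GramSchmidtScalar: Sized {
    fn compute(a: &Array2<Self>) -> Array2<Self>;
}

// (b) Implementing the trait on the matrix type itself, so the method is
// called directly on the array regardless of the element type.
trait GramSchmidt {
    type Output;
    fn gram_schmidt(&self) -> Self::Output;
}

impl<A, S> GramSchmidt for ArrayBase<S, Ix2>
where
    A: Clone, // plus whatever numeric bounds the algorithm needs
    S: Data<Elem = A>,
{
    type Output = Array2<A>;
    fn gram_schmidt(&self) -> Self::Output {
        unimplemented!("the actual orthogonalization would go here")
    }
}
```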
Looks good for floats. You'll certainly have to modify it a bit to have it work for complex vectors as well, since `ndarray`'s `Dot` doesn't do a proper complex inner product. `linxal` already has a conjugate method defined on `LinxalScalar`, but it would probably be better to have a generic `InnerProduct` trait, so that one could use BLAS for the heavy lifting.
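Something along these lines is what I have in mind; `InnerProduct` is an assumed name rather than an existing linxal trait, and the naive loops are only a sketch (a real version would dispatch to BLAS `?dot`/`?dotc` instead):

```rust
use ndarray::ArrayView1;
use num_complex::Complex64;

/// Inner product that conjugates the left argument where appropriate,
/// unlike a plain elementwise dot product.
trait InnerProduct: Sized {
    fn inner(a: ArrayView1<Self>, b: ArrayView1<Self>) -> Self;
}

impl InnerProduct for f64 {
    fn inner(a: ArrayView1<f64>, b: ArrayView1<f64>) -> f64 {
        a.iter().zip(b.iter()).map(|(x, y)| x * y).sum()
    }
}

impl InnerProduct for Complex64 {
    fn inner(a: ArrayView1<Complex64>, b: ArrayView1<Complex64>) -> Complex64 {
        // Conjugating the left-hand side is exactly what `Dot` skips
        // for complex elements.
        a.iter()
            .zip(b.iter())
            .fold(Complex64::new(0.0, 0.0), |acc, (x, y)| acc + x.conj() * *y)
    }
}
```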