mikera opened this issue 10 years ago
@Gerrrr - any thoughts from the Incanter side? @prasant94 - good to think about how this interacts with your work on vectorz-clj.
In Incanter, eigenvalue decomposition handles complex eigenvalues even worse: https://github.com/incanter/incanter/blob/master/modules/incanter-core/src/incanter/core.clj#L995 https://github.com/tel/clatrix/blob/master/src/clatrix/core.clj#L1167 It returns either the real or the imaginary part.
In my branch it is just a wrapper around the current core.matrix solution.
https://github.com/mikera/core.matrix/wiki/Linear-algebra-API-proposal
My bad, I did not update the eigen docstring in the code. If you are OK with the API from the wiki, I will do that.
A couple of additional thoughts:
An eigen-sym function or similar to compute the eigendecomposition of a symmetric matrix. This would avoid these issues.
Tricky one - having lots of options here is nasty because it pushes the burden onto implementations to implement all the cases correctly. It might be better to split up the function....
Hi, I've been watching core.matrix for a while. Thought I'd chime in and give my 2 cents.
Aren't we saying too much with M = Q·A·Q⁻¹? What do we return when M is a defective matrix? The relation can't be guaranteed in general.
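For a concrete instance (a standard textbook example, not from this thread): the 2x2 Jordan block has a repeated eigenvalue but only a one-dimensional eigenspace, so no invertible Q exists.

;; Classic defective matrix: eigenvalue 1 has algebraic multiplicity 2,
;; but only one linearly independent eigenvector, [1 0]. No invertible Q
;; can satisfy M = Q·A·Q⁻¹ with A diagonal.
(def defective-m [[1 1]
                  [0 1]])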
Edited -- forget this part, I didn't take the :return parameter into account: often, we just want the eigenvalues, and computing the eigenvectors would be a waste. Perhaps a separate function for the eigenvalues would be best.
Also, I agree a vector of eigenvalues instead of a matrix with the eigenvalues on the diagonal might be more desirable. (Though the user can easily switch between them with the diagonal-matrix and main-diagonal functions)
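For concreteness, the round trip between the two representations using the core.matrix functions mentioned above:

(require '[clojure.core.matrix :as m])
;; eigenvalue vector -> diagonal matrix, and back
;; (exact element type depends on the current implementation)
(m/diagonal-matrix [3 1 4])                  ; => [[3 0 0] [0 1 0] [0 0 4]]
(m/main-diagonal [[3 0 0] [0 1 0] [0 0 4]])  ; => [3 1 4]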
I've looked at the APIs of various CASs to see how they did it:

- Mathematica: Returns a vector of eigenvalues followed by the vector of eigenvectors (basically Q). Returns zero vectors for defective matrices. (Also has separate Eigenvectors and Eigenvalues functions.)
- Another CAS takes options for only-eigenvalues or symmetric-matrix, and returns essentially the same as Mathematica.
- Matlab: Returns a vector of eigenvalues (optionally as a matrix) and the eigenvectors as an optional output. Will return linearly dependent eigenvectors if the matrix is defective.
- Sage: Returns triples containing the eigenvalue, a vector of eigenvectors for that eigenvalue, and the multiplicity. This nicely sidesteps the edge cases, but if you really want a matrix of eigenvectors the user needs to do some work/checking.
I personally like the Sage solution, but integrating this with other libraries' implementations will probably be difficult.
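As Clojure data, the Sage-style shape would look something like this (a sketch, with the triples as [eigenvalue eigenvectors multiplicity]):

;; Sage-style result for the defective matrix [[1 1] [0 1]]:
;; one triple per distinct eigenvalue.
[[1.0           ; the eigenvalue
  [[1.0 0.0]]   ; eigenvectors found (only one here; the matrix is defective)
  2]]           ; algebraic multiplicity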
@dmarjenburgh thanks for the insights!
I agree we may want a separate function for eigenvalues. That was sort of the idea with the :return parameter, but it may be simpler to create separate functions in some cases. It's a tricky one: there is a trade-off between simplicity of the API, performance and ease of implementation, I think (remember that any complexity we put in the API has to be supported by all core.matrix implementations).
Also agree that a symmetric-eigen method may make sense. This is particularly important, I think, where a matrix should be symmetric but might not be exactly symmetric due to numerical rounding errors - in this case you want to treat it as symmetric.
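One common way to handle the "should be symmetric" case (a sketch using existing core.matrix functions, not part of any proposed API) is to symmetrize first: (A + Aᵀ)/2 is exactly symmetric and differs from A only by the rounding noise.

(require '[clojure.core.matrix :as m])
;; Force exact symmetry before a symmetric eigendecomposition:
(defn symmetrize [a]
  (m/scale (m/add a (m/transpose a)) 0.5))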
I think we really need to look at some use cases: what APIs are most convenient for the common cases?
Hard. Different use cases (sparse vs dense, symmetric vs non-symmetric, etc.) probably call for different algorithms with different return types. If we impose overly strict restrictions on the return type, it will be difficult to map some implementations' return types onto it.
I can look into some use cases and APIs and make some tables with pros and cons.
I think we can safely assume the existence of matrices and vectors as return types. I think it is more about the specification of the inputs / outputs.
For vectors we don't really care about copying: It is pretty cheap to copy a 1D vector of eigenvalues compared to the cost of actually performing the decomposition, for example.
Agree about the sparse stuff.... but I think sparse implementations can generally handle this internally (e.g. the Vectorz sparse support is mostly transparent to core.matrix).
Sorry for the long silence (vacation time).
I'm afraid any list of use cases for matrices I can give will be horribly incomplete. The following two categories are very broad and common though:

- Spectral analysis, signal processing and solving linear differential equations for problems involving periodicity. Very often involves symmetric (or Hermitian) matrices. For these, the eigenvectors and eigenvalues are both important, but sometimes you just want one or the other.
- Matrices with graphs; Markov stuff, PageRank. These often involve transition matrices for which you might want to compute high powers of the matrix, where diagonalisation helps. Often, the eigenvectors are more important than the eigenvalues.

Considering this, I would vote for separate 'eigenvectors', 'eigenvalues' and 'eigen' functions (see the sketch below). This leads to a nicer API as well, I think. Regarding separate symmetric-eigen* versions, I'm not so sure. I like the idea, but what about triangular or tridiagonal matrices? Would they have separate protocols too?
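A minimal sketch of that split (hypothetical protocol and signatures, not the current core.matrix API):

(defprotocol PEigen
  (eigenvalues [m]  "Only the eigenvalues, as a 1D vector.")
  (eigenvectors [m] "Only the eigenvectors, e.g. as the columns of Q.")
  (eigen [m]        "Both together, e.g. as a map {:L eigenvalues :Q eigenvectors}."))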
Lastly, I've compared the following most common output options for convertability:

- S => return a vector of maps with keys [:value :vectors :multiplicity]. What Sage does.
- R => return a map with [:L :Q] or [:Q :rL :iL] (L is the vector of eigenvalues). Together with one of the following options for how to handle defective matrices:
  - Z => return zero vectors when the geometric multiplicity is lower than the algebraic multiplicity
  - D => return linearly dependent eigenvectors for defective matrices

Convertability:

- RZ -> RD: Zero vectors in a set of vectors are already linearly dependent. Nothing to do.
- RD -> RZ: Need to compute the rank of the eigenspace for each eigenvalue. Costly.
- RZ -> S: Zero vectors are easily identified and eigenvalues can be counted with frequencies. Easy, but cumbersome.
- S -> RZ: Pad eigenvectors with zero vectors where necessary for defective matrices, then create a matrix from the concatenation of vectors. Doable, but cumbersome transformations if you want Q or A.
- RD -> S: Here we also need to check the geometric multiplicity, which is costly.
- S -> RD: Easy, via RZ; basically identical.

Supporting an implementation using RD is costly no matter what.
If the outputs are easy to convert into each other we could add it as an option parameter, but it's probably better to keep the API clean and simple and let the user convert the output to the desired format. Having output type RZ looks to me to be the most straightforward.
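To make the RZ -> S conversion concrete, here is a minimal sketch (a hypothetical helper: it assumes :L is a Clojure vector of eigenvalues and :Q a matrix with zero columns standing in for missing eigenvectors, and it groups by exact equality, which real code would replace with a tolerance check):

(require '[clojure.core.matrix :as m])

(defn rz->sage-style
  "Convert an RZ-style result {:L ... :Q ...} to Sage-style triples
  [eigenvalue eigenvectors multiplicity]."
  [{:keys [L Q]}]
  (for [[value idxs] (group-by #(nth L %) (range (count L)))]
    [value
     ;; keep only the non-zero columns (the actual eigenvectors)
     (vec (remove m/zero-matrix? (map #(m/get-column Q %) idxs)))
     (count idxs)]))

;; e.g. (rz->sage-style {:L [1.0 1.0] :Q [[1.0 0.0] [0.0 0.0]]})
;; => ([1.0 [[1.0 0.0]] 2])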
Well, I hope I'm not too cryptic. Let me know :)
I think having an options argument to such (complex) linear algebra functions, where multiple fast algorithms exist for special cases etc., is not a bad idea. That way, for example, a tridiagonal keyword could be passed to the matrix implementation, meaning: use an optimized algorithm for tridiagonal matrices if you can (throw if not?).
The only downside I see is that it becomes unclear which implementations support which optional keys, making the abstraction core.matrix provides more leaky. If instead we separate the special cases into different protocols, the separation will be more explicit, but perhaps also unwieldy to use.
Yeah.... keyword arguments make things complex for implementers; they need to be very precise about how behaviour works for all combinations of parameters.
We've already seen variance between implementations. I guess testing can pick these up, but it's a bit sticky.
I'm probably more comfortable with options arguments where the option just works as a hint (presumably for performance optimisation) and doesn't change behaviour.
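For example (with :structure as a purely hypothetical keyword, just to illustrate the hint idea):

(eigen m)                            ; general-purpose algorithm
(eigen m {:structure :tridiagonal})  ; same result, but an implementation
                                     ; may choose a faster specialised path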
For completeness, the NumPy API consists of four functions:

- eig
- eigh -- for symmetric and Hermitian matrices
- eigvals -- only eigenvalues
- eigvalsh -- only eigenvalues for symmetric and Hermitian matrices

Eigenvalues are always returned as a 1D array.
See http://docs.scipy.org/doc/numpy/reference/routines.linalg.html#matrix-eigenvalues
The Matlab API also has the eigs function to calculate the largest eigenvalues: http://www.mathworks.com/help/matlab/ref/eigs.html
I'd also suggest using a :hermitian keyword, because it's a more general concept, and such an API is going to be more future-proof.
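Hypothetical usage of the suggested keyword:

(eigen m {:hermitian true})  ; covers real symmetric and complex Hermitian matrices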
An observation about the :return keyword.
I believe that the choice of supported keywords is unfortunate. Imagine reading some end-user code doing calculations. What do Q, rA, and iA mean in that context? Nothing at best, or more likely some other unrelated quantities.
I believe that the API can be improved if it used longer self-descriptive keywords (:eigenvectors, :eigenvalues, :eigenvalues-real, :eigenvalues-imaginary, :eigenvectors-inversed) or multiple functions like in NumPy. The keywords could be abbreviated (:eigvecs, :eigvals, :eigvals-real, :eigvals-imag, :inv-eigvecs).
In other words, I think any of these options:
(eigen m {:return :eigenvalues})
(eigen m {:return [:eigvals-real :eigvals-imag]})
(eigenvalues m :real)
(eigenvalues m [:real :imaginary])
is more readable than
(eigen m {:return [:rA :rI]})
By "readable" I mean we can guess what it does and what it returns without looking into documentation.
@astanin I like your proposals, certainly helpful to have more readable keywords here
Happy to take patches that improve things here, though it may cause some downstream breakage. I think that is acceptable right now while we are still in the process of defining the API
The API for eigen-decomposition currently has two issues:
I think we should refine the API to fix these issues.