SciRuby / nmatrix

Dense and sparse linear algebra library for Ruby via SciRuby

Eigenvalue decomposition function on NMatrix #219

Closed by translunar 6 years ago

translunar commented 10 years ago

Extends #19.

geev is exposed as a LAPACK function, but is terribly difficult to use. It'd be lovely if we could do eigenvalue decomposition directly on an NMatrix.
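A minimal sketch of the kind of call site being asked for; the method name eigen_decomposition and its return values are hypothetical here, not an existing NMatrix API:

```ruby
require 'nmatrix'

# Hypothetical convenience API (method name and return shape assumed):
# decompose directly on the matrix instead of driving LAPACK's geev by hand.
m = NMatrix.new([2, 2], [3.0, 1.0, 0.0, 2.0], dtype: :float64)
eigenvalues, eigenvectors = m.eigen_decomposition
```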

translunar commented 10 years ago

Okay, so I exposed the LAPACK function for this in NMatrix::LAPACK. It's geev. However, I'm not quite sure about how best to return the results. In particular, look at this example.

That needs to be dealt with before we can have NMatrix#geev, which is the next step.

thisMagpie commented 10 years ago

I tried running that example last night but couldn't, because of the #include "mkl_lapacke.h" line, and the pricing puts a spanner in the works. The Math Kernel Library is really expensive, which is a shame, because it seems just the thing for FFT too, and it supports gdb, which I'm fairly familiar with for getting traces. It seems really well documented too :-/ (I can't make sense of the licensing, though!)

I went looking through the Intel-on-Linux site to see if there were any free scraps, but I don't think any of those packages are relevant or useful here. I had a little look anyway, but unfortunately I couldn't get it going on my laptop (it's close to dying, so I try not to install much on it; I should probably give it a go on my desktop instead, for what that's worth).

Did you manage to run this without paying for the library? Do you reckon Intel would waive the cost for SciRuby to use it? If so, do you think there would be an advantage to asking them?

I note nvidia and amd have some great documentation and similar libraries too.

Anyway, I'm digressing... Is that the sort of C example you were after? I'm not sure what particular problem you're finding with this that isn't present in some other function call that works well, for comparison. Could you elaborate on this a little?

translunar commented 10 years ago

Usually Intel provides free edu licenses for many of its products; I hadn't checked for MKL. MKL usability in NMatrix would be cool, but it is secondary to getting full ATLAS support, since anyone can use ATLAS. If that's how Intel licensing works, basically only people in academia would be able to take advantage of the Intel code; Intel isn't going to provide it for general use in NMatrix, because that would dilute their brand.

yoongkang commented 10 years ago

Ugh... that does look pretty ugly. By the way, I don't think the result is being returned correctly for complex eigenvectors: the elements of the resulting eigenvector matrix are not complex. I believe the matrix of eigenvectors is created with shape [matrix.shape[0], matrix.shape[0]]. However, in the case of complex values, each element in that matrix holds only either the real component or the imaginary component, i.e., it is not complex. You could try the example in the C file above to see what I mean, or try any matrix that has a complex eigenvalue.

We probably need to either:

1. double the columns for the eigenvectors with complex values,
2. double all columns for eigenvectors (use 0 if it's a real value), or
3. make the elements hold complex values.

I think 3 would be good, so that eigenvalues[i] corresponds to eigenvectors.column(i). That should probably be the first step. Is there anything else that needs to be done before NMatrix#geev?
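For reference, this is roughly what option 3 involves. The real-valued LAPACK routine (dgeev) reports a complex conjugate pair of eigenvalues as wr[j] + i*wi[j] and wr[j] - i*wi[j], and packs the corresponding eigenvectors into two adjacent real columns of VR: column j holds the real parts and column j+1 the imaginary parts. A plain-Ruby sketch of the unpacking (array names here are illustrative, not the actual NMatrix internals):

```ruby
# wr, wi : real and imaginary parts of the eigenvalues from dgeev
# vr     : right eigenvectors as an n x n array of real rows, packed per LAPACK's
#          convention (a conjugate pair shares two adjacent columns)
# Returns an array of n complex columns, so eigenvalues[i] lines up with column i.
def unpack_complex_eigenvectors(wr, wi, vr)
  n = wr.size
  cols = Array.new(n) { Array.new(n) }
  j = 0
  while j < n
    if wi[j].zero?
      # real eigenvalue: the column is used as-is
      n.times { |i| cols[j][i] = Complex(vr[i][j], 0.0) }
      j += 1
    else
      # conjugate pair: columns j and j+1 hold the real and imaginary parts
      n.times do |i|
        cols[j][i]     = Complex(vr[i][j],  vr[i][j + 1])
        cols[j + 1][i] = Complex(vr[i][j], -vr[i][j + 1])
      end
      j += 2
    end
  end
  cols
end
```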

translunar commented 10 years ago

If I remember correctly, geev returns the real and imaginary components in two separate matrices. Can you give me a test case so I can have a look?

yoongkang commented 10 years ago

Sure, I'll do it once I get home. Unless I am misreading the documentation, I believe it does return two matrices, but they're for right and left eigenvectors, not for real and imaginary components.

yoongkang commented 10 years ago

In this gist we can see that the two matrices returned contain the left and right eigenvectors (as there are not supposed to be any imaginary components):

https://gist.github.com/yoongkang/adfaf19ad7e3f7d05506

In this gist we can see how the first and second columns hold the real and imaginary components respectively:

https://gist.github.com/yoongkang/2981c45012c076077fd5

The second eigenvector is missing.

rohit12 commented 8 years ago

I would like to take up this issue, but I'd like some clarification first. Do I just have to calculate the second eigenvector?

translunar commented 8 years ago

Why only the second?

rohit12 commented 8 years ago

The issue is not clear to me, which is why I'm confused. It says that the second eigenvector is missing, so I assume I have to calculate it?

rohit12 commented 8 years ago

@mohawkjohn Could I get a clarification please? I would really like to start working on this.

translunar commented 8 years ago

Oh. Sorry, I misread some things in here (and was not near a computer all weekend).

The second eigenvector is probably being computed but is being lost when the size of the resulting NMatrix object is set. So, no, you don't have to compute the second eigenvector; you just have to figure out the problem with the Ruby/C interface.

wlevine commented 8 years ago

I had completely forgotten, but I actually did a lot of this last summer. The output from NMatrix::LAPACK.geev is in a much friendlier format than the original geev. See the documentation and the specs. So the eigenvalues are already in a nicer format. I think the next step is to make a method NMatrix#eigenvectors or NMatrix#eigen_decomposition that would call NMatrix::LAPACK.geev on itself.
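A minimal sketch of the wrapper being proposed, assuming NMatrix::LAPACK.geev accepts the matrix itself and returns the eigenvalues together with the eigenvectors; the method name and the exact call signature are assumptions here, so check the documentation and specs mentioned above:

```ruby
class NMatrix
  # Hypothetical convenience method: delegate to the already-exposed
  # NMatrix::LAPACK.geev rather than making callers drive LAPACK directly.
  def eigen_decomposition
    unless shape.size == 2 && shape[0] == shape[1]
      raise ArgumentError, "eigendecomposition requires a square 2D matrix"
    end
    NMatrix::LAPACK.geev(self)
  end
end
```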

translunar commented 8 years ago

@wlevine Is it different enough that we should close this issue and open a new one?