markrogoyski / math-php

Powerful modern math library for PHP: Features descriptive statistics and regressions; Continuous and discrete probability distributions; Linear algebra with matrices and vectors; Numerical analysis; Special mathematical functions; Algebra

Use power iteration and Dimensionality reduction for eigenvalues #340

Open Beakerboy opened 5 years ago

Beakerboy commented 5 years ago

I think I figured out the general technique for simultaneously finding eigenvalues and eigenvectors for any matrix...numerically.

Use power iteration to find the largest eigenvalue. This gives both the value and the vector; then subtract its contribution from the source matrix (deflation) and repeat.

$eigenvalues = [];
$vectors = [];
for ($i = 0; $i < $A->getM(); $i++) {
    // Power iteration returns the dominant eigenvalue and its eigenvector
    list($eigenvalue, $eigenvector) = Eigen::powerIteration($A);
    $eigenvalues[] = $eigenvalue;
    $vector = MatrixFactory::columnMatrix($eigenvector);
    $vectors[] = $eigenvector; // Adding as a new row
    // Deflate: remove the found component so the next-largest eigenvalue dominates
    $A = $A->subtract($vector->multiply($vector->transpose())->scalarMultiply($eigenvalue));
}
$eigenvectors = MatrixFactory::create($vectors)->transpose();
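
One caveat worth noting: the deflation term λ·v·vᵀ only removes exactly the found component if v has unit length. If powerIteration() does not already normalize its result, a small helper like the hypothetical sketch below (not part of MathPHP, plain PHP arrays assumed) could be applied before forming the deflation term.

// Hypothetical helper (not part of MathPHP): scale an eigenvector, given as a
// plain PHP array, to unit length before forming the deflation term λ·v·vᵀ.
function normalizeEigenvector(array $v): array
{
    $norm = sqrt(array_sum(array_map(function ($x) { return $x * $x; }, $v)));
    return array_map(function ($x) use ($norm) { return $x / $norm; }, $v);
}
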
markrogoyski commented 5 years ago

Is there a use case for this that differs from existing methods? For example, is it better than Jacobi for large matrices?

Beakerboy commented 5 years ago

The Jacobi method only works on symmetric matrices.

markrogoyski commented 4 years ago

Would implementing this solve issue https://github.com/markrogoyski/math-php/issues/346, where power iteration finds an eigenvalue that is not necessarily the dominant one because of the random starting vector? Basically, could you use this process to find all the eigenvalues, then figure out which one is dominant and return that from the powerIteration method? A sketch of that selection step follows below.
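
One possible shape for the selection step, sketched in plain PHP under the assumption that the deflation loop above has already filled $eigenvalues and $vectors as parallel arrays; powerIteration() could then return the chosen pair.

// Hypothetical sketch: given parallel arrays of eigenvalues and eigenvectors
// produced by the deflation loop, return the pair with the largest |λ|.
function dominantPair(array $eigenvalues, array $vectors): array
{
    $dominant = 0;
    foreach ($eigenvalues as $i => $value) {
        if (abs($value) > abs($eigenvalues[$dominant])) {
            $dominant = $i;
        }
    }
    return [$eigenvalues[$dominant], $vectors[$dominant]];
}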

Beakerboy commented 4 years ago

That’s a good idea. Instead of finding just the “largest”, find a few and then grab the largest of those. Alternatively, I wonder if, after a solution is found, the solution eigenvector could be perturbed slightly and used as a new starting point to ensure that it converges on the same eigenvalue, and not one that is larger.
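
A rough sketch of the perturbation idea, in plain PHP. It assumes a power-iteration routine that accepts a starting vector, which may not match the current MathPHP signature; the point is just to nudge the converged eigenvector slightly, re-run from that guess, and check that it settles back on the same eigenvalue.

// Hypothetical sketch (assumes a powerIteration() variant that can take a
// starting vector, which the current API may not support): nudge the converged
// eigenvector by a small random amount to use as the next starting guess.
function perturbVector(array $v, float $epsilon = 1e-6): array
{
    return array_map(function ($x) use ($epsilon) {
        return $x + $epsilon * (mt_rand() / mt_getrandmax() - 0.5);
    }, $v);
}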