An eigenvector of a matrix A is any nonzero vector that gets mapped to a scaled version of itself when multiplied by A (think of it as one of "A's resonant frequencies"). The dominant eigenvector of a matrix is the eigenvector with the largest eigenvalue (the factor by which that eigenvector gets scaled). Every time you multiply a vector by M, the result gets pulled toward M's dominant eigenvector.
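That "pulling toward the dominant eigenvector" is exactly the power iteration trick. A minimal sketch, using a made-up symmetric 2x2 matrix whose dominant eigenvector I chose to be easy to see:

```python
import numpy as np

# Hypothetical example matrix: eigenvalues are 3 and 1, and the
# dominant eigenvector points along [1, 1].
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

v = np.array([1.0, 0.0])  # start with an arbitrary vector
for _ in range(50):
    v = M @ v                   # each multiply pulls v toward the dominant eigenvector
    v = v / np.linalg.norm(v)   # renormalize so the vector doesn't blow up

print(v)  # ends up (approximately) parallel to [1, 1]
```

After enough multiplies, M @ v is just a scaled copy of v, which is the defining property of an eigenvector.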
The Markov property refers to when a system's current state depends only on its previous state (not on the rest of its history). The dominant eigenvector of a Markov matrix (a matrix of transition probabilities) gives you the long-run distribution of the system's state. Dominant eigenvectors are extremely important in math, physics, and computer science, with the PageRank algorithm being another popular use of them.
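A small sketch of that claim, using a made-up two-state weather chain (the probabilities here are invented for illustration; each column is "given the previous state, where do we go next"):

```python
import numpy as np

# Columns = previous state, rows = current state; each column sums to 1.
#              sunny  rainy   (previous state)
P = np.array([[0.9,   0.5],   # P(sunny now | previous)
              [0.1,   0.5]])  # P(rainy now | previous)

eigvals, eigvecs = np.linalg.eig(P)
i = np.argmax(eigvals.real)     # a Markov matrix's dominant eigenvalue is 1
pi = eigvecs[:, i].real
pi = pi / pi.sum()              # rescale so the entries are probabilities

print(pi)  # long-run fraction of time spent in each state
```

Multiplying `pi` by `P` gives back `pi` itself: the long-run distribution is the state the chain settles into no matter where it starts.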
Resonant frequency: specific rate of oscillation, a “sweet spot.”
A matrix’s dominant eigenvector is simply the eigenvector with the largest eigenvalue. The matrix “wants” to map its inputs toward the direction of the dominant eigenvector.
Frobenius norm: treat a matrix like one long vector — square every entry, add them up, and take the square root. As a distance between two matrices, it’s the square root of the sum of all the squared entry-wise differences.
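A quick sketch of the Frobenius norm as a distance between two small example matrices (the numbers are arbitrary):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[1.0, 2.5],
              [2.0, 4.0]])

# Square every entry-wise difference, sum them, take the square root.
dist = np.sqrt(np.sum((A - B) ** 2))

print(dist)  # same value as np.linalg.norm(A - B, 'fro')
```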
A Markov chain can be modeled as a matrix that encodes all of its conditional probabilities: if the previous state is X, here’s the probability that the current state will be Y.
Eigenanalysis has to do with eigenvectors, eigenvalues, eigendecomposition, and eigenbasis.
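These pieces fit together in the eigendecomposition: factor A into its eigenvectors and eigenvalues, with the eigenvectors forming an eigenbasis. A sketch on a small made-up matrix:

```python
import numpy as np

# Eigendecomposition: A = V diag(w) V^{-1}, where the columns of V
# (the eigenvectors) form an eigenbasis and w holds the eigenvalues.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

w, V = np.linalg.eig(A)   # eigenvalues in w, eigenvectors as columns of V
reconstructed = V @ np.diag(w) @ np.linalg.inv(V)

print(np.allclose(reconstructed, A))  # True
```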
Before the PageRank algorithm, web search ranked results only by calculating a page's relevance to the query.
The PageRank algorithm takes into account the popularity of a webpage.
Determining a page's popularity: “A page is popular if lots of other popular pages link to it."
Combining relevancy with popularity works really well.
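The circular-sounding definition of popularity ("popular if popular pages link to it") is resolved by taking the dominant eigenvector of the link matrix. A toy sketch on a made-up three-page web, with the standard damping factor of 0.85 (this is an illustration, not Google's actual implementation):

```python
import numpy as np

# Tiny invented web: links[i, j] = 1 means page j links to page i.
links = np.array([[0, 1, 1],
                  [1, 0, 0],
                  [1, 1, 0]], dtype=float)

# Column-normalize: each column becomes a probability distribution over
# the links leaving that page (the "random surfer" Markov matrix).
P = links / links.sum(axis=0)

d = 0.85                    # damping factor
n = P.shape[0]
G = d * P + (1 - d) / n     # damped matrix; columns still sum to 1

rank = np.ones(n) / n
for _ in range(100):        # power iteration toward the dominant eigenvector
    rank = G @ rank

print(rank)  # popularity scores; they sum to 1
```

The ranks are just the long-run distribution of a random surfer clicking links forever, which is why the dominant eigenvector of a Markov matrix shows up again here.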
http://stephendavies.org/quick.pdf