One of my biggest hurdles in learning linear algebra was building intuition. Eigenvalues and eigenvectors are one of those things that pop up in a million places because they’re so useful, but to recognize where they may be useful you need intuition as to what they’re doing.
The eigenvectors are the “axes” of the transformation represented by the matrix. Consider spinning a globe (the universe of vectors): every location faces a new direction, except the poles. The eigenvalue is the amount the eigenvector is scaled up or down when going through the matrix.
Eigenvalues are special numbers associated with a matrix and eigenvectors are special vectors.
A matrix ‘A’ acts on vectors v like a function does, with input v and output Av. Eigenvectors are vectors for which Av is parallel to v. In other words:
Av = λv.
In this equation, v is an eigenvector of A and λ is an eigenvalue of A.
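The defining equation can be checked numerically. Below is a minimal sketch using NumPy with a hypothetical 2×2 matrix chosen so that (1, 1) happens to be an eigenvector:

```python
import numpy as np

# A hypothetical symmetric matrix, chosen for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# v = (1, 1) is an eigenvector of this A: A maps it to (3, 3) = 3 * v,
# so the associated eigenvalue is lambda = 3.
v = np.array([1.0, 1.0])
Av = A @ v

print(Av)      # Av is parallel to v
print(Av / v)  # the ratio in each component is the eigenvalue
```

The key observation is that `Av / v` is the same number in every component; for a vector that is not an eigenvector, those ratios would differ.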
If you can draw a straight line through the three points (0,0), v, and Av, then Av is just v multiplied by a number λ; that is, Av = λv.
In this case, we call λ an eigenvalue and v an eigenvector. For example, if A(1, 2) = λ(1, 2) for some number λ, then (1, 2) is an eigenvector of A.
Let us look at a demo to understand visually what eigenvalues and eigenvectors are.
Eigenspaces: change the columns of A and drag v to be an eigenvector. Note three facts. First, every point on the same line through the origin as an eigenvector is itself an eigenvector; those lines are eigenspaces, and each has an associated eigenvalue. Second, if you place v on an eigenspace (either s1 or s2) whose associated eigenvalue satisfies |λ| < 1, then Av is closer to (0,0) than v; when |λ| > 1, it’s farther. Third, both eigenspaces depend on both columns of A: it is not as though a1 only affects s1.
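The first fact above (every point on an eigenvector’s line is also an eigenvector) can be verified directly. This sketch uses `np.linalg.eig`, which returns the eigenvalues of A paired with unit eigenvectors as columns; the matrix here is a hypothetical example:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigvals[i] is the eigenvalue paired with the column eigvecs[:, i].
eigvals, eigvecs = np.linalg.eig(A)

# Any nonzero scalar multiple of an eigenvector is also an eigenvector,
# with the same eigenvalue -- the whole line is an eigenspace.
v = eigvecs[:, 0]
for c in (1.0, -2.5, 10.0):
    assert np.allclose(A @ (c * v), eigvals[0] * (c * v))
```

This is why it makes sense to talk about eigen*spaces* rather than single eigenvectors: the equation Av = λv is preserved under scaling of v.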
What are eigenvalues/vectors good for?
If you keep multiplying v by A, you get the sequence v, Av, A^2v, A^3v, and so on. Eigenspaces attract that sequence, and the eigenvalues tell you whether it ends up at (0,0) or far away. Therefore, eigenvectors and eigenvalues tell us about systems that evolve step by step.
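This attraction can be seen in a few lines of code. The sketch below (with a hypothetical 2×2 matrix) repeatedly applies A to a starting vector; the direction of the result settles onto the eigenspace of the largest eigenvalue, which is exactly the idea behind the power iteration method:

```python
import numpy as np

# A hypothetical matrix with eigenvalues 1 and 0.5: the lambda = 1
# eigenspace attracts the sequence, the lambda = 0.5 one dies out.
A = np.array([[0.8, 0.3],
              [0.2, 0.7]])

# Start from a vector that is NOT an eigenvector.
v = np.array([1.0, 0.0])
for _ in range(50):
    v = A @ v  # the sequence v, Av, A^2 v, ...

# Compare with the eigenvector of the largest |eigenvalue|.
eigvals, eigvecs = np.linalg.eig(A)
dominant = eigvecs[:, np.argmax(np.abs(eigvals))]

print(v / np.linalg.norm(v))  # same direction as `dominant` (up to sign)
```

This step-by-step picture is why eigenvalues govern the long-run behavior of iterative systems such as Markov chains or recurrences: whichever eigenvalue has the largest magnitude eventually dominates.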
Happy Machine Learning…