Eigen Intuition
Eigenvalues and eigenvectors are fundamental concepts in linear algebra with significant applications in fields such as image processing, recommender systems, network analysis, and principal component analysis (PCA). Understanding them allows us to extract key information from matrices and gain valuable insight into data transformations. In this article, we explore these concepts to build intuition from a geometric interpretation.
Matrix Transformations
For a matrix $A\in \mathbb{R}^{n\times n}$ and a vector $\bar{x}\in \mathbb{R}^n$, the transformation $A\bar{x}$ can take on three distinct forms:
- Rotation: The direction of $\bar{x}$ is altered, but its length remains the same.
- Scaling: The magnitude of $\bar{x}$ is modified, while its direction remains unchanged.
- Rotation and Scaling: Both the direction and magnitude of $\bar{x}$ are changed.
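These three cases can be sketched numerically. The matrices below are hypothetical examples (not the article's $A$): a pure rotation by $90°$ and a pure scaling by $3$.

```python
import numpy as np

x = np.array([1.0, 0.0])

# Pure rotation by 90 degrees: direction changes, length is preserved.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Pure scaling by 3: length changes, direction is preserved.
S = np.array([[3.0, 0.0],
              [0.0, 3.0]])

print(R @ x)                  # ~[0., 1.]  -- rotated
print(np.linalg.norm(R @ x))  # 1.0        -- length unchanged
print(S @ x)                  # [3., 0.]   -- scaled, same direction
```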
Let’s consider a motivating example to illustrate these transformations.
Consider a plot of the transformation $A\bar{x}$ for a matrix $A$:
In the plot above, we have:
- $\bar{x}$: Vectors to be transformed (forming a unit circle)
- $A\bar{x}$: Transformed $\bar{x}$ vectors (forming an ellipse)
- $\bar{v}_1$, $\bar{v}_2$: The two eigenvectors of matrix $A$
The above $A$ has the following eigenpairs:
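As a sketch of how eigenpairs are computed in practice (using a hypothetical symmetric $2\times 2$ matrix as a stand-in, since the article's $A$ appears in the figure), NumPy's `np.linalg.eig` returns both eigenvalues and eigenvectors:

```python
import numpy as np

# Hypothetical symmetric matrix, a stand-in for the article's A.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)  # columns of eigvecs are v_1, v_2
for i in range(2):
    v = eigvecs[:, i]
    # Each pair satisfies A v = lambda v.
    assert np.allclose(A @ v, eigvals[i] * v)
print(np.sort(eigvals))  # [1. 3.]
```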
Now let’s look back at the above plot…
Vectors on $\text{span}\lbrace\bar{v}_1\rbrace$ and $\text{span}\lbrace\bar{v}_2\rbrace$ are in fact the only $\bar{x}\in\mathbb{R}^2$ for which the transformation $A\bar{x}$ preserves the span of $\bar{x}$ (i.e. $\bar{x}$ is only scaled, possibly with a sign flip). These are the eigenvectors. Let's visualize this important property of eigenvectors…
As expected, the eigenvectors $\bar{v}_1,\ \bar{v}_2$ are each scaled, but not rotated! Notice that $\bar{v}_2$ is scaled by $-1$. Moreover, the transformed vectors $A\bar{v}_1,\ A\bar{v}_2$ form the major and minor axes of the ellipse $A\bar{x}$.
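The "scaled but not rotated" property can be checked numerically: in 2D, $A\bar{v}$ is parallel to $\bar{v}$ exactly when their cross product vanishes. A sketch with a hypothetical symmetric matrix (not the article's $A$):

```python
import numpy as np

# Hypothetical symmetric stand-in for the article's A.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(A)

def cross2(u, w):
    # 2D cross product: zero iff u and w are parallel.
    return u[0] * w[1] - u[1] * w[0]

for i in range(2):
    v = eigvecs[:, i]
    # A v stays on span{v}: direction preserved (up to a sign flip).
    print(np.isclose(cross2(v, A @ v), 0.0))  # True

# A generic non-eigenvector direction is rotated off its span.
x = np.array([1.0, 0.0])
print(np.isclose(cross2(x, A @ x), 0.0))      # False
```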
Recall the expression from linear algebra: $$A\bar{v}=\lambda\bar{v}$$
We observe that the transformed eigenvectors $A\bar{v}_1,\ A\bar{v}_2$ are scaled by exactly their respective eigenvalues: $A\bar{v}_1=\lambda_1\bar{v}_1$ and $A\bar{v}_2=\lambda_2\bar{v}_2$!
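This relationship is easy to verify numerically: dividing $A\bar{v}_i$ by $\bar{v}_i$ component-wise recovers $\lambda_i$. Again, a sketch using a hypothetical symmetric matrix:

```python
import numpy as np

# Hypothetical symmetric stand-in for the article's A.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(A)

for i in range(2):
    v = eigvecs[:, i]
    # Component-wise ratio (A v) / v equals lambda_i in every entry
    # (safe here because no component of v is zero).
    print(np.allclose((A @ v) / v, eigvals[i]))  # True
```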
Let’s bring everything together in one plot…
Eigenvalues and Eigenvectors: Takeaways
To summarize the intuition behind a geometric perspective on eigenvectors and eigenvalues:
- Eigenvectors $\bar{v}_i$ of a matrix $A$ are directionally unaffected by the transformation $A\bar{v}_i$.
- The transformation $A\bar{v}_i$ scales the length of $\bar{v}_i$ by a factor equal to its eigenvalue $\lambda_i$.
- For a symmetric $A$ (as in our example), the eigenvectors are orthogonal, and $A\bar{v}_1,\ A\bar{v}_2$ form the major and minor axes of the ellipse $A\bar{x}\in\mathbb{R}^2$ (this concept extends to higher dimensions). Hence, eigenvectors and eigenvalues provide valuable insight into how $A$ alters data, $\bar{x}\rightarrow A\bar{x}$.
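These takeaways can be verified end to end: for a symmetric matrix (a hypothetical stand-in for the article's $A$), the eigenvectors are orthogonal, and mapping the unit circle through $A$ produces an ellipse whose extreme radii equal the eigenvalue magnitudes.

```python
import numpy as np

# Hypothetical symmetric stand-in for the article's A.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(A)

# Orthogonality of v_1 and v_2 (holds for any symmetric A).
print(np.isclose(eigvecs[:, 0] @ eigvecs[:, 1], 0.0))  # True

# Map the unit circle through A; the resulting ellipse's extreme
# radii match |lambda_1| and |lambda_2|.
t = np.linspace(0.0, 2.0 * np.pi, 1000)
circle = np.stack([np.cos(t), np.sin(t)])  # unit-circle points x
radii = np.linalg.norm(A @ circle, axis=0)
print(np.isclose(radii.max(), np.abs(eigvals).max(), atol=1e-3))  # True
print(np.isclose(radii.min(), np.abs(eigvals).min(), atol=1e-3))  # True
```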