Eigenvalues and Eigenvectors

What

For a square matrix A, an eigenvector v is a nonzero vector whose direction doesn’t change when A is applied — it only gets scaled by a factor λ (the eigenvalue), flipping if λ is negative.

A v = λ v

Why it matters

  • PCA: principal components are eigenvectors of the covariance matrix — they point in the directions of maximum variance
  • Spectral clustering: clusters nodes using eigenvectors of the graph Laplacian
  • Stability analysis: eigenvalues tell you if a system converges or diverges
  • PageRank: the ranking vector is the dominant eigenvector of the link matrix
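
The PageRank idea above can be sketched with power iteration: repeatedly applying the link matrix to any starting vector converges to its dominant eigenvector. This is a minimal sketch, assuming a hypothetical 3-page link matrix (not from any real dataset):

```python
import numpy as np

# Hypothetical 3-page web: entry [i, j] is the probability of following
# a link from page j to page i (columns sum to 1).
M = np.array([
    [0.0, 0.0, 1.0],   # page 3 links to page 1
    [0.5, 0.0, 0.0],   # page 1 links to page 2
    [0.5, 1.0, 0.0],   # pages 1 and 2 link to page 3
])

# Power iteration: repeated application of M drives any starting vector
# toward the dominant eigenvector (eigenvalue 1 for a stochastic matrix).
rank = np.ones(3) / 3
for _ in range(50):
    rank = M @ rank
    rank /= rank.sum()   # keep the entries summing to 1

print(rank)   # the PageRank-style scores for the three pages
```

For this particular matrix the fixed point works out to ranks 0.4, 0.2, 0.4 — pages 1 and 3 receive more link mass than page 2.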

Key ideas

  • An n × n matrix has n eigenvalues (counted with multiplicity, possibly complex); it has a full set of n independent eigenvectors only when it is diagonalizable
  • Eigenvectors are the “natural axes” of the transformation
  • Eigenvalues tell you how much each axis stretches or shrinks
  • Large eigenvalue = important direction, small = noise (PCA intuition)
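
The PCA intuition in the last bullet can be checked directly: eigendecompose the covariance matrix of some synthetic 2-D data that is deliberately stretched along one axis. A minimal sketch, assuming made-up Gaussian data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data: much more spread along x (std 3.0) than y (std 0.5)
data = rng.normal(size=(500, 2)) * np.array([3.0, 0.5])

# Principal components are eigenvectors of the covariance matrix;
# eigh is used because a covariance matrix is symmetric.
cov = np.cov(data, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# Sort by descending eigenvalue: largest-variance direction first
order = np.argsort(eigenvalues)[::-1]
eigenvalues = eigenvalues[order]
eigenvectors = eigenvectors[:, order]
```

The first eigenvector comes out nearly axis-aligned with x, and its eigenvalue (≈ the variance 3.0² = 9) dwarfs the second (≈ 0.5² = 0.25): large eigenvalue = important direction, small = noise.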

In NumPy

import numpy as np
 
A = np.array([[4, 2], [1, 3]])
eigenvalues, eigenvectors = np.linalg.eig(A)
 
# eigenvalues: scaling factors
# eigenvectors: columns are the eigenvectors
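
A quick sanity check on the decomposition above: each column v of `eigenvectors` should satisfy A v = λ v up to floating-point error. For this matrix the eigenvalues work out to 5 and 2 (the roots of λ² − 7λ + 10):

```python
import numpy as np

A = np.array([[4, 2], [1, 3]])
eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column i pairs with eigenvalues[i]: A @ v == λ * v
for i in range(len(eigenvalues)):
    v = eigenvectors[:, i]
    assert np.allclose(A @ v, eigenvalues[i] * v)

print(sorted(eigenvalues))   # the two eigenvalues, 2.0 and 5.0
```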