Matrix Decomposition
What
Breaking a matrix into simpler component matrices that, multiplied together, reconstruct the original. Like prime factorization but for matrices.
Why it matters
- SVD (Singular Value Decomposition): recommendation systems, latent semantic analysis (LSA), image compression
- PCA: eigendecomposition of the covariance matrix → dimensionality reduction
- NMF (Non-negative Matrix Factorization): topic modeling, source separation
- Solving linear systems: an LU factorization makes repeated solves of Ax = b cheap (factor once, reuse for many right-hand sides)
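To make the last point concrete, here is a minimal sketch (variable names `A`, `b`, `x` are illustrative). `np.linalg.solve` uses an LU factorization under the hood (LAPACK), which is cheaper and more numerically stable than forming the inverse explicitly:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # square, (almost surely) invertible
b = rng.standard_normal(4)

# Factor-and-solve: never compute np.linalg.inv(A) @ b in practice.
x = np.linalg.solve(A, b)
```

A quick sanity check: `np.allclose(A @ x, b)` should hold.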
Key ideas
SVD: A = U × Σ × Vᵀ
- U = left singular vectors (row patterns)
- Σ = diagonal matrix of singular values (importance of each pattern)
- Vᵀ = right singular vectors (column patterns)
- Singular values are sorted in decreasing order; truncating to the top-k gives the best rank-k approximation in Frobenius and spectral norm (Eckart–Young theorem)
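The truncation claim can be checked numerically: by Eckart–Young, the Frobenius error of the rank-k truncation equals the root-sum-square of the discarded singular values. A quick verification (random `A` as an illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((100, 50))
U, S, Vt = np.linalg.svd(A, full_matrices=False)

k = 10
A_k = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :]

# ||A - A_k||_F should equal sqrt(sum of squared discarded singular values)
err = np.linalg.norm(A - A_k, "fro")
assert np.isclose(err, np.sqrt(np.sum(S[k:] ** 2)))
```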
When to use what
| Method | Use case |
|---|---|
| SVD | Dimensionality reduction, recommenders, denoising |
| Eigendecomposition | PCA, spectral methods |
| NMF | Topic modeling (non-negative data) |
| QR/LU | Solving linear systems and least-squares problems |
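The PCA/eigendecomposition row connects back to SVD: eigendecomposing the covariance matrix of centered data gives the same principal components as an SVD of the data itself, with eigenvalues equal to σᵢ²/(n−1). A sketch (data matrix `X` is illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((200, 5))
Xc = X - X.mean(axis=0)  # PCA requires centered data

# Route 1: eigendecomposition of the covariance matrix
cov = np.cov(Xc, rowvar=False)           # divides by n - 1
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]

# Route 2: singular values of the centered data
S = np.linalg.svd(Xc, compute_uv=False)  # sorted descending

assert np.allclose(eigvals, S**2 / (len(X) - 1))
```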
In NumPy
```python
import numpy as np

A = np.random.randn(100, 50)
# full_matrices=False gives the "thin" SVD: U is (100, 50), Vt is (50, 50)
U, S, Vt = np.linalg.svd(A, full_matrices=False)

# Low-rank approximation (keep top 10 components)
k = 10
A_approx = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :]
```
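The payoff of truncation is storage: the three factors hold k(m + n + 1) numbers instead of m·n. For the shapes above:

```python
m, n, k = 100, 50, 10
full = m * n               # 5000 entries in A
factored = k * (m + n + 1) # 1510 entries across U[:, :k], S[:k], Vt[:k, :]
print(f"compression ratio: {full / factored:.2f}x")  # → compression ratio: 3.31x
```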