Learning Representations by Back-Propagating Errors
Rumelhart, Hinton, Williams (1986)
Why It Matters
Formalized backpropagation for training multi-layer networks — the foundational algorithm that makes modern deep learning possible.
Key Ideas
- Multi-layer neural networks can be trained efficiently by propagating error gradients backward through hidden layers.
- Hidden units can learn useful internal representations rather than relying on fixed hand-crafted features.
- Backpropagation turns deep representation learning into a workable optimization procedure.
- This is one of the conceptual foundations of modern deep learning because it makes layered learning practical.
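The ideas above can be sketched as a tiny worked example. This is a minimal illustration in NumPy, not the authors' original code: a 2-4-1 sigmoid network trained on XOR with mean squared error, where the backward pass pushes error gradients from the output layer through the hidden layer via the chain rule. The hidden size, learning rate, step count, and random seed are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: the classic task a single-layer perceptron cannot solve,
# but a network with one hidden layer can.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])

# A 2-4-1 network; hidden size, learning rate, and seed are
# illustrative choices, not values from the paper.
W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)    # hidden-layer activations
    out = sigmoid(h @ W2 + b2)  # output activation
    return h, out

loss0 = np.mean((forward(X)[1] - y) ** 2)  # loss before training

lr = 1.0
for _ in range(5000):
    h, out = forward(X)
    # Backward pass: propagate the error gradient from the output
    # back through the hidden layer via the chain rule.
    d_out = 2.0 * (out - y) / len(X) * out * (1.0 - out)
    dW2, db2_g = h.T @ d_out, d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    dW1, db1_g = X.T @ d_h, d_h.sum(axis=0)
    # Plain gradient-descent updates on all weights and biases.
    W1 -= lr * dW1; b1 -= lr * db1_g
    W2 -= lr * dW2; b2 -= lr * db2_g

loss = np.mean((forward(X)[1] - y) ** 2)
print(loss0, loss)
```

The point of the sketch is the `d_h` line: the hidden layer receives its gradient from the layer above (`d_out @ W2.T`), which is exactly the backward propagation of errors that lets hidden units learn internal representations instead of using fixed features.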
Notes
- The paper’s importance is methodological: once gradients can train internal layers, deeper supervised models become viable.
- Later deep learning progress depended on this training idea plus more data, better hardware, and improved optimization.