Learning Representations by Back-Propagating Errors

Rumelhart, Hinton, Williams (1986)

Why It Matters

Formalized backpropagation for multi-layer networks: the foundational training algorithm behind virtually all modern deep learning.

Key Ideas

  1. Multi-layer neural networks can be trained efficiently by propagating error gradients backward through hidden layers.
  2. Hidden units can learn useful internal representations rather than relying on fixed hand-crafted features.
  3. Backpropagation turns deep representation learning into a workable optimization procedure.
  4. This is one of the conceptual foundations of modern deep learning because it makes layered learning practical.
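The mechanics behind ideas 1 and 2 can be illustrated with a minimal sketch. This is not the paper's original code; it is a pure-Python toy (network size, learning rate, and the XOR task are illustrative choices) showing a one-hidden-layer network trained with squared error, sigmoid units, and hand-derived backpropagated gradients:

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR: the classic task a single-layer perceptron cannot solve,
# but a network with one hidden layer can.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

H = 4          # hidden units (arbitrary choice for this sketch)
lr = 0.5       # learning rate (arbitrary choice)
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [sigmoid(sum(w1[j][i] * x[i] for i in range(2)) + b1[j])
         for j in range(H)]
    y = sigmoid(sum(w2[j] * h[j] for j in range(H)) + b2)
    return h, y

def total_loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

initial = total_loss()
for _ in range(5000):
    for x, t in data:
        h, y = forward(x)
        # Output error signal: dE/d(net_out) for squared error + sigmoid.
        delta_out = (y - t) * y * (1 - y)
        # Back-propagate: each hidden unit's error is the output error
        # routed through its outgoing weight, times the local sigmoid
        # derivative h*(1-h). This is the paper's central trick.
        delta_h = [delta_out * w2[j] * h[j] * (1 - h[j]) for j in range(H)]
        # Gradient-descent weight updates.
        for j in range(H):
            w2[j] -= lr * delta_out * h[j]
            for i in range(2):
                w1[j][i] -= lr * delta_h[j] * x[i]
            b1[j] -= lr * delta_h[j]
        b2 -= lr * delta_out
final = total_loss()
print(initial, final)
```

The hidden units start random and, through the backward error signal alone, learn an internal representation that makes XOR linearly separable at the output layer, which is exactly the "learned internal representations" claim in idea 2.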

Notes

  • The paper’s importance is methodological: once gradients can train internal layers, deeper supervised models become viable.
  • Later deep learning progress depended on this training idea plus more data, better hardware, and improved optimization.