Dot Product

What

Take two vectors of the same length, multiply them element-wise, and sum the results.

a = [1, 2, 3]
b = [4, 5, 6]
a · b = 1×4 + 2×5 + 3×6 = 32
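In pure Python, that definition is one line (a minimal sketch; the helper name `dot` is just illustrative):

```python
def dot(a, b):
    # multiply element-wise, then sum
    assert len(a) == len(b), "vectors must have the same length"
    return sum(x * y for x, y in zip(a, b))

dot([1, 2, 3], [4, 5, 6])  # 32
```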

Why it matters

  • Similarity: dot product measures how aligned two vectors are
  • Cosine similarity: dot product normalized by magnitudes — core of embeddings and search
  • Attention mechanism: queries dot keys to find relevant tokens
  • Linear layers: each neuron computes a dot product of input with its weights
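As a sketch of the linear-layer bullet (the weight and input values below are made up for illustration): each output neuron is the dot product of the input with that neuron's weight row, so the whole layer is one matrix-vector product.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])           # input vector
W = np.array([[0.1, 0.2, 0.3],          # weights: one row per neuron
              [1.0, 0.0, -1.0]])
b = np.array([0.0, 0.5])                # biases

out = W @ x + b                         # each entry is a dot product row · x, plus bias
# neuron 0: 0.1*1 + 0.2*2 + 0.3*3 + 0.0 = 1.4
# neuron 1: 1*1 + 0*2 + (-1)*3 + 0.5  = -1.5
```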

Key ideas

  • Geometric interpretation: a · b = |a| × |b| × cos(θ)
    • Positive → vectors point same-ish direction
    • Zero → perpendicular (orthogonal)
    • Negative → opposite directions
  • Cosine similarity: a · b / (|a| × |b|) — ranges from -1 to 1
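The sign rules above are easy to check directly (example vectors chosen for illustration):

```python
import numpy as np

a = np.array([1.0, 0.0])

np.dot(a, np.array([2.0, 1.0]))   # 2.0  → positive: same-ish direction
np.dot(a, np.array([0.0, 3.0]))   # 0.0  → perpendicular (orthogonal)
np.dot(a, np.array([-1.0, 0.5]))  # -1.0 → negative: opposite-ish direction
```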

In NumPy

import numpy as np
 
a = np.array([1, 2, 3])
b = np.array([4, 5, 6])
 
np.dot(a, b)       # 32
a @ b              # 32 (same thing for 1D vectors)
 
# cosine similarity
cos_sim = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))  # ≈ 0.9746
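The same machinery scales up to the attention bullet from earlier: the raw attention scores are just dot products of every query with every key, computed in one matrix multiply (shapes and values below are made up for illustration).

```python
import numpy as np

queries = np.array([[1.0, 0.0],    # 2 queries, dim 2
                    [0.0, 1.0]])
keys = np.array([[1.0, 0.0],       # 3 keys, dim 2
                 [0.0, 1.0],
                 [1.0, 1.0]])

scores = queries @ keys.T          # scores[i, j] = query_i · key_j
# each row says how aligned that query is with every key
```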