Module 7 Study Guide: Inner Product Spaces

Linear Algebra -- Learn Without Walls

1. Dot Product and Length

u · v = u1v1 + u2v2 + ... + unvn = u^T v
||u|| = sqrt(u · u). Unit vector: u-hat = u/||u||. Distance: dist(u,v) = ||u - v||.
cos(theta) = (u · v)/(||u|| ||v||). Cauchy-Schwarz: |u · v| ≤ ||u|| ||v||.
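The formulas above can be checked numerically. A minimal NumPy sketch (the sample vectors u and v are illustrative, not from the text):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([3.0, 0.0, 4.0])

dot = u @ v                               # u · v = u^T v
norm_u = np.sqrt(u @ u)                   # ||u|| = sqrt(u · u)
unit_u = u / norm_u                       # u-hat, a unit vector
dist = np.linalg.norm(u - v)              # dist(u, v) = ||u - v||
cos_theta = dot / (np.linalg.norm(u) * np.linalg.norm(v))

# Cauchy-Schwarz: |u · v| <= ||u|| ||v||
assert abs(dot) <= np.linalg.norm(u) * np.linalg.norm(v)
```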

2. Orthogonality

Orthogonal: u ⊥ v iff u · v = 0. Orthogonal sets of nonzero vectors are automatically linearly independent.
Orthogonal complement: W-perp = {v : v · w = 0 for all w in W}. dim(W) + dim(W-perp) = n.
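One way to compute W-perp in practice is as the null space of the matrix whose rows span W. A sketch using NumPy's SVD (the choice of W in R^3 is illustrative):

```python
import numpy as np

# W = span{w1, w2} in R^3; W-perp is the null space of the matrix with rows w1, w2
W_rows = np.array([[1.0, 0.0, 1.0],
                   [0.0, 1.0, 1.0]])

# Null space via SVD: right singular vectors paired with (near-)zero singular values
_, s, Vt = np.linalg.svd(W_rows)
rank = int(np.sum(s > 1e-10))
W_perp_basis = Vt[rank:]                  # rows form a basis for W-perp

# dim(W) + dim(W-perp) = n
assert rank + W_perp_basis.shape[0] == 3
# Every basis vector of W-perp is orthogonal to every spanning vector of W
assert np.allclose(W_rows @ W_perp_basis.T, 0)
```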

3. Orthogonal Projections

proj_u(y) = [(y · u)/(u · u)] u
proj_W(y) = sum of [(y · ui)/(ui · ui)] ui over orthogonal basis {u1,...,uk}
Orthogonal Decomposition: y = proj_W(y) + z where z is in W-perp. proj_W(y) is the closest point in W to y.
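The projection formula and the orthogonal decomposition can be sketched directly (the orthogonal basis {u1, u2} and the vector y below are illustrative):

```python
import numpy as np

def proj_W(y, ortho_basis):
    """Project y onto W using an orthogonal basis {u1, ..., uk} of W."""
    return sum(((y @ u) / (u @ u)) * u for u in ortho_basis)

u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])           # orthogonal to u1
y  = np.array([2.0, 3.0, 5.0])

p = proj_W(y, [u1, u2])                   # closest point in W to y
z = y - p                                 # the component of y in W-perp

# z is orthogonal to every basis vector of W, so y = p + z with z in W-perp
assert np.allclose([z @ u1, z @ u2], 0)
```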

4. Gram-Schmidt Process

Algorithm: v1 = x1. For k ≥ 2: vk = xk - sum over j < k of [(xk · vj)/(vj · vj)] vj.
Produces orthogonal basis for same subspace. Normalize for orthonormal basis. Leads to QR factorization: A = QR.
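The algorithm above translates almost line-for-line into code. A minimal sketch (the input columns are illustrative and assumed linearly independent):

```python
import numpy as np

def gram_schmidt(X):
    """Return a matrix whose columns orthogonalize the columns of X."""
    V = []
    for x in X.T:                          # process x1, x2, ... in order
        # Subtract the projection of x onto each earlier v_j
        v = x - sum(((x @ u) / (u @ u)) * u for u in V)
        V.append(v)
    return np.column_stack(V)

X = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
V = gram_schmidt(X)
assert np.isclose(V[:, 0] @ V[:, 1], 0)   # columns are orthogonal

# Normalizing the columns gives the Q of the QR factorization A = QR
Q = V / np.linalg.norm(V, axis=0)
```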

5. Least Squares

Normal equations: A^T A x-hat = A^T b
Least squares solution x-hat minimizes ||b - Ax||. Geometrically: Ax-hat = proj_{col(A)}(b).
Best-fit line: A = [1 t1; ...; 1 tn], b = (y1,...,yn). Solve for (intercept, slope). Extends to polynomial and multivariate fits.
Problem setups:
Best-fit line y = c0 + c1*t: A has columns [1s | t-values]
Best-fit parabola y = c0 + c1*t + c2*t^2: A has columns [1s | t | t^2]
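A best-fit line via the normal equations, as a NumPy sketch (the data points are made up for illustration):

```python
import numpy as np

t = np.array([0.0, 1.0, 2.0, 3.0])        # illustrative t-values
b = np.array([1.0, 2.2, 2.9, 4.1])        # illustrative y-values

A = np.column_stack([np.ones_like(t), t]) # columns [1s | t-values]

# Normal equations: A^T A x-hat = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
c0, c1 = x_hat                            # intercept, slope

# A x-hat = proj_{col(A)}(b), so the residual is orthogonal to col(A)
residual = b - A @ x_hat
assert np.allclose(A.T @ residual, 0)
```

For a parabola, add a t**2 column to A; the same normal equations apply.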