Lesson 2: Orthogonal Projections
Estimated time: 40-50 minutes
Learning Objectives
- Compute the projection of a vector onto a line (one-dimensional subspace)
- Compute the projection of a vector onto a subspace with an orthogonal basis
- State and apply the Orthogonal Decomposition Theorem
- Understand the geometry of projection as finding the closest point
Projection onto a Line
The projection of y onto a line through the origin in the direction of a nonzero vector u gives the component of y in the direction of u.
Projection onto a vector: proj_u(y) = [(y · u) / (u · u)] * u.
The scalar (y · u)/(u · u) is called the scalar projection or component of y along u.
Worked Example 1
Project y = (4, 2) onto u = (3, 0).
proj_u(y) = [(4*3 + 2*0)/(9+0)] * (3, 0) = (12/9) * (3, 0) = (4/3)(3, 0) = (4, 0).
This makes geometric sense: the projection onto the x-axis drops the y-component.
Worked Example 2
Project y = (1, 3) onto u = (1, 1).
proj_u(y) = [(1+3)/(1+1)] * (1, 1) = (4/2)(1, 1) = (2, 2).
Residual: y - proj_u(y) = (1, 3) - (2, 2) = (-1, 1). Check: (-1, 1) · (1, 1) = -1 + 1 = 0. The residual is orthogonal to u.
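The line-projection formula from the two examples above can be verified with a short script. This is a sketch assuming NumPy; proj_onto_line is a helper name used here for illustration, not a library function.

```python
import numpy as np

def proj_onto_line(y, u):
    """Projection of y onto the line spanned by nonzero u: [(y.u)/(u.u)] u."""
    return (y @ u) / (u @ u) * u

# Worked Example 1: project (4, 2) onto (3, 0)
p1 = proj_onto_line(np.array([4.0, 2.0]), np.array([3.0, 0.0]))
print(p1)  # [4. 0.]

# Worked Example 2: project (1, 3) onto (1, 1)
y, u = np.array([1.0, 3.0]), np.array([1.0, 1.0])
p2 = proj_onto_line(y, u)
print(p2)            # [2. 2.]
print((y - p2) @ u)  # 0.0 -- the residual is orthogonal to u
```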
The Orthogonal Decomposition Theorem
Orthogonal Decomposition: If W is a subspace of R^n, then every vector y in R^n can be uniquely written as:
y = y-hat + z, where y-hat is in W and z is in W-perp (the orthogonal complement of W).
y-hat = proj_W(y) is the closest vector in W to y.
Projection onto a Subspace
If {u1, u2, ..., uk} is an orthogonal basis for W, then:
Projection onto subspace W:
proj_W(y) = [(y · u1)/(u1 · u1)] u1 + [(y · u2)/(u2 · u2)] u2 + ... + [(y · uk)/(uk · uk)] uk
Worked Example 3
W = span{u1, u2} where u1 = (1, 0, 1) and u2 = (0, 1, 0). Project y = (2, 3, 4) onto W.
Check: u1 · u2 = 0. Good, they are orthogonal.
proj_W(y) = [(2+0+4)/2](1,0,1) + [(0+3+0)/1](0,1,0) = 3(1,0,1) + 3(0,1,0) = (3, 0, 3) + (0, 3, 0) = (3, 3, 3).
Residual: y - proj_W(y) = (2,3,4) - (3,3,3) = (-1, 0, 1).
Check: (-1,0,1) · (1,0,1) = -1+1 = 0 and (-1,0,1) · (0,1,0) = 0. Residual is in W-perp.
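The subspace formula is just a sum of line projections, one per orthogonal basis vector. A minimal NumPy sketch reproducing Worked Example 3 (proj_onto_subspace is an illustrative name; it assumes the basis is already orthogonal):

```python
import numpy as np

def proj_onto_subspace(y, basis):
    """Sum of projections of y onto each vector of an orthogonal basis."""
    return sum((y @ u) / (u @ u) * u for u in basis)

y = np.array([2.0, 3.0, 4.0])
u1 = np.array([1.0, 0.0, 1.0])
u2 = np.array([0.0, 1.0, 0.0])

p = proj_onto_subspace(y, [u1, u2])
print(p)  # [3. 3. 3.]

z = y - p
print(z @ u1, z @ u2)  # 0.0 0.0 -- the residual lies in W-perp
```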
Closest Point Interpretation
proj_W(y) is the point in W that minimizes ||y - w|| over all w in W. This is the foundation of least-squares problems (Lesson 4).
Worked Example 4
Find the closest point in W = span{(1, 1, 1)} to y = (1, 5, 3).
proj_W(y) = [(1+5+3)/3](1,1,1) = (9/3)(1,1,1) = (3, 3, 3).
Distance: ||y - proj_W(y)|| = ||(-2, 2, 0)|| = sqrt(4 + 4 + 0) = sqrt(8) = 2*sqrt(2).
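The closest-point computation in Worked Example 4 can be checked numerically (a sketch assuming NumPy):

```python
import numpy as np

y = np.array([1.0, 5.0, 3.0])
u = np.array([1.0, 1.0, 1.0])

# Closest point in span{u} to y is the projection of y onto u.
p = (y @ u) / (u @ u) * u
print(p)  # [3. 3. 3.]

# Distance from y to the subspace = norm of the residual.
d = np.linalg.norm(y - p)
print(d)  # 2.828..., i.e. 2*sqrt(2)
```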
The Orthogonal Complement
Orthogonal Complement: W-perp = {v in R^n : v · w = 0 for all w in W}.
Key facts:
- dim(W) + dim(W-perp) = n
- (W-perp)-perp = W
- For any matrix A, (Row A)-perp = Nul A
Worked Example 5
W = span{(1, 2, 3)}. Find W-perp.
W-perp = {v : v · (1,2,3) = 0} = {(v1,v2,v3) : v1 + 2v2 + 3v3 = 0}.
This is a plane through the origin. Basis: {(-2, 1, 0), (-3, 0, 1)}. dim(W) = 1, dim(W-perp) = 2, total = 3.
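The basis found by hand can be verified directly, and a numerical null-space basis can be obtained from the SVD: rows of V-transpose beyond the rank of A span Nul A, which here equals W-perp. A sketch assuming NumPy:

```python
import numpy as np

# W-perp is the null space of the 1x3 matrix whose single row is (1, 2, 3).
A = np.array([[1.0, 2.0, 3.0]])

# Check the handwritten basis: each vector must satisfy v1 + 2*v2 + 3*v3 = 0.
for v in [np.array([-2.0, 1.0, 0.0]), np.array([-3.0, 0.0, 1.0])]:
    print(A @ v)  # [0.]

# Numerical alternative: rank(A) = 1, so the last two rows of Vt span Nul A.
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[1:]
print(null_basis.shape)   # (2, 3) -- confirms dim(W-perp) = 2
print(null_basis @ A.T)   # entries ~0: each row is orthogonal to (1, 2, 3)
```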
Check Your Understanding
1. Project y = (6, 2) onto u = (1, 1).
2. What is the residual from the projection in Q1?
3. If W is a 2-dimensional subspace of R^5, what is dim(W-perp)?
Key Takeaways
- Projection onto a line: proj_u(y) = [(y · u)/(u · u)] u
- Projection onto subspace: sum of projections onto orthogonal basis vectors
- y = proj_W(y) + z, where z is in W-perp (orthogonal decomposition)
- proj_W(y) is the closest point in W to y
- dim(W) + dim(W-perp) = n