Module 7 Quiz: Inner Product Spaces
Quiz
10 questions on dot products, orthogonality, projections, Gram-Schmidt, and least squares.
1
Compute (2, -1, 3) · (1, 4, 2).
(2)(1) + (-1)(4) + (3)(2) = 2 - 4 + 6 = 4.
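This arithmetic can be checked with a minimal Python sketch using plain lists (the helper name `dot` is just illustrative):

```python
# Dot product as the sum of componentwise products.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

print(dot((2, -1, 3), (1, 4, 2)))  # 2 - 4 + 6 = 4
```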
2
When are two vectors orthogonal?
When their dot product equals zero: u · v = 0.
3
State the Cauchy-Schwarz inequality.
|u · v| ≤ ||u|| ||v|| for all vectors u, v. Equality holds iff one of u, v is a scalar multiple of the other (i.e., they are linearly dependent).
4
Project y = (3, 5) onto u = (1, 0).
proj_u(y) = [(y · u)/(u · u)] u = (3/1)(1, 0) = (3, 0), i.e., the x-component of y.
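The projection formula can be sketched as a small Python function (names are illustrative, not from any library):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj(y, u):
    """Orthogonal projection of y onto the line spanned by u: [(y.u)/(u.u)] u."""
    c = dot(y, u) / dot(u, u)
    return [c * a for a in u]

print(proj((3, 5), (1, 0)))  # [3.0, 0.0]
```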
5
What does the Orthogonal Decomposition Theorem say?
Given a subspace W, every vector y can be written as y = y-hat + z, where y-hat = proj_W(y) is in W and z = y - y-hat is in W-perp. This decomposition is unique.
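A minimal sketch of the decomposition for a one-dimensional W = span{u}, reusing the vectors from question 4; the final dot product confirms z lies in W-perp:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

y = (3, 5)
u = (1, 0)
c = dot(y, u) / dot(u, u)
y_hat = tuple(c * a for a in u)                  # projection of y onto W
z = tuple(yi - hi for yi, hi in zip(y, y_hat))   # component of y in W-perp
print(y_hat, z, dot(z, u))  # (3.0, 0.0) (0.0, 5.0) 0.0
```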
6
In Gram-Schmidt, what do you do at each step?
Subtract from the new vector its projections onto all previously computed orthogonal vectors; what remains is orthogonal to all of them.
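The step above can be sketched as a short Python loop (an unnormalized Gram-Schmidt; the function name is illustrative):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Return an orthogonal (not normalized) set spanning the same space."""
    ortho = []
    for v in vectors:
        w = list(v)
        for u in ortho:
            c = dot(v, u) / dot(u, u)         # projection coefficient onto u
            w = [wi - c * ui for wi, ui in zip(w, u)]  # subtract the projection
        ortho.append(w)
    return ortho

basis = gram_schmidt([(1, 1, 0), (1, 0, 1)])
print(basis)  # second vector is orthogonal to the first
```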
7
Write the normal equations for least squares.
A^T A x-hat = A^T b.
8
What does the least squares solution minimize?
It minimizes ||b - Ax||, the distance from b to the column space of A (equivalently, the sum of squared residuals).
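A numerical check of this claim, on a small made-up example where b is not in Col A: the normal-equation solution gives a smaller residual than any other candidate x.

```python
# A is 3x2, b is outside Col A, so the system Ax = b has no exact solution.
A = [[1, 0], [0, 1], [1, 1]]
b = [1, 1, 1]

def residual_sq(x):
    """Sum of squared residuals ||b - Ax||^2."""
    return sum((bi - sum(aij * xj for aij, xj in zip(row, x))) ** 2
               for row, bi in zip(A, b))

# Normal equations A^T A x = A^T b give [[2,1],[1,2]] x = [2,2], so x = (2/3, 2/3).
x_hat = (2/3, 2/3)
print(residual_sq(x_hat))    # 1/3 ≈ 0.333...
print(residual_sq((1, 1)))   # 1.0 -- any other x does worse
```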
9
If Q has orthonormal columns, what is Q^T Q?
Q^T Q = I (the identity matrix), because the (i, j) entry of Q^T Q is the dot product of columns i and j of Q. (Note QQ^T = I only when Q is also square.)
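A quick sketch verifying this for a made-up 3x2 matrix Q with orthonormal columns:

```python
from math import sqrt

s = 1 / sqrt(2)
Q = [[s, s], [s, -s], [0, 0]]  # columns are orthonormal

def qtq(Q):
    """Compute Q^T Q: entry (i, j) is the dot product of columns i and j."""
    cols = list(zip(*Q))
    return [[sum(a * b for a, b in zip(ci, cj)) for cj in cols] for ci in cols]

print(qtq(Q))  # ≈ [[1.0, 0.0], [0.0, 1.0]]
```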
10
For the best-fit line y = c0 + c1*t, what is the design matrix A?
A = [1 t1; 1 t2; ...; 1 tn] where t1, ..., tn are the input values. The first column is all 1s (for the intercept), the second column is the t values.
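Putting the last few answers together, a minimal sketch of fitting y = c0 + c1*t by solving the 2x2 normal equations A^T A c = A^T y by hand (the data points are made up for illustration):

```python
ts = [0, 1, 2, 3]
ys = [1, 3, 5, 7]   # exactly y = 1 + 2t, so the fit should recover c0=1, c1=2

n = len(ts)
# Entries of A^T A and A^T y for A = [1 t_i]:
s_t  = sum(ts)
s_tt = sum(t * t for t in ts)
s_y  = sum(ys)
s_ty = sum(t * y for t, y in zip(ts, ys))

# Solve [[n, s_t], [s_t, s_tt]] [c0, c1] = [s_y, s_ty] by Cramer's rule:
det = n * s_tt - s_t * s_t
c0 = (s_y * s_tt - s_t * s_ty) / det
c1 = (n * s_ty - s_t * s_y) / det
print(c0, c1)  # 1.0 2.0
```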