Math 114: Linear Algebra
Orthogonal Projections
Recall: Given a subspace W of a vector space V ,
1. dim(W) + dim(W⊥) = dim(V) = n
2. If {v1, . . . , vk} is a basis for W and {vk+1, . . . , vn} is a basis for W⊥, then {v1, . . . , vn} is a basis for V.
If b ∈ V, then there exist scalars x1, . . . , xn such that
b = (x1 v1 + ⋯ + xk vk) + (xk+1 vk+1 + ⋯ + xn vn),
where the first group of terms lies in W and the second group lies in W⊥.
If in addition {v1, . . . , vk} is an orthogonal basis for W, then
b = (⟨b, v1⟩/⟨v1, v1⟩) v1 + ⋯ + (⟨b, vk⟩/⟨vk, vk⟩) vk + (xk+1 vk+1 + ⋯ + xn vn),
where the first k terms form the piece of b in W and the remaining terms form the piece in W⊥.
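As a quick illustration (with numbers chosen here, not taken from the handout): in V = R^2, let W = Span{v1} and W⊥ = Span{v2}, where v1 = [1; 1] and v2 = [1; −1]. For b = [3; 1] we get ⟨b, v1⟩/⟨v1, v1⟩ = 4/2 = 2 and ⟨b, v2⟩/⟨v2, v2⟩ = 2/2 = 1, so
b = 2 [1; 1] + 1 [1; −1] = [2; 2] + [1; −1],
with [2; 2] ∈ W and [1; −1] ∈ W⊥.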
Theorem: Let W be a subspace of V. Then for any b ∈ V,
(a) b = b1 + b2 for a unique pair b1 ∈ W and b2 ∈ W⊥;
(b) b1 is the (unique) vector in W nearest (minimum distance) to b. We call b1 the projection of b onto W and write b1 = proj_W b.
General Method: Suppose {u1, . . . , uk} is a basis for W. Then
proj_W b = a1 u1 + a2 u2 + ⋯ + ak uk
if and only if b − (a1 u1 + a2 u2 + ⋯ + ak uk) ∈ W⊥
if and only if ⟨b − (a1 u1 + a2 u2 + ⋯ + ak uk), uj⟩ = 0 for all j = 1, . . . , k.
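For readers who want to check a computation, here is a short numpy sketch of the general method; the vectors u1, u2, b below are illustrative choices of my own, not data from the worksheet.

import numpy as np

# General Method: solve <b - (a1 u1 + ... + ak uk), uj> = 0 for the coefficients aj.
u1 = np.array([1.0, 0.0, 1.0])
u2 = np.array([0.0, 1.0, 1.0])
b  = np.array([2.0, 3.0, 0.0])

U = np.column_stack([u1, u2])    # columns of U form a basis for W
G = U.T @ U                      # Gram matrix of the basis vectors: entries <ui, uj>
rhs = U.T @ b                    # right-hand sides <b, uj>
a = np.linalg.solve(G, rhs)      # coefficients a1, ..., ak
proj_b = U @ a                   # proj_W b = a1 u1 + ... + ak uk

print(proj_b)
print(U.T @ (b - proj_b))        # should be (numerically) zero: b - proj_W b is orthogonal to each uj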
Special Method 1: Suppose {v1, . . . , vk} is an orthogonal basis for W. Then
proj_W b = (⟨b, v1⟩/⟨v1, v1⟩) v1 + (⟨b, v2⟩/⟨v2, v2⟩) v2 + ⋯ + (⟨b, vk⟩/⟨vk, vk⟩) vk
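When an orthogonal basis is available, Special Method 1 is a one-line computation. A hedged numpy sketch follows; the helper name proj_onto_span and the vectors below are my own illustrative choices.

import numpy as np

def proj_onto_span(b, orthogonal_basis):
    # Special Method 1: requires the basis vectors to be mutually orthogonal.
    return sum((np.dot(b, v) / np.dot(v, v)) * v for v in orthogonal_basis)

# Illustrative data (not from the worksheet): {v1, v2} is an orthogonal set in R^3.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
b  = np.array([3.0, 1.0, 4.0])
print(proj_onto_span(b, [v1, v2]))   # projects b onto the xy-plane: [3., 1., 0.]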
Special Method 2: Suppose the columns of A form a spanning set for W. Then
proj_W b = Ax, where x is any solution of the normal equations A^T Ax = A^T b.
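A numpy sketch of Special Method 2; the matrix A below is an illustrative choice whose third column is the sum of the first two, so its columns span W without being linearly independent.

import numpy as np

# Special Method 2: proj_W b = A x, where A^T A x = A^T b.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])   # third column = first + second, so A^T A is singular
b = np.array([2.0, 3.0, 0.0])

# lstsq returns a least-squares solution, i.e. a solution of the normal equations,
# even when A^T A is not invertible.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(A @ x)                      # proj_W b, with W = Col(A)

# If the columns of A are linearly independent, the direct solve also works:
# x = np.linalg.solve(A.T @ A, A.T @ b)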
Examples: For each given W and b, find proj_W b.
(" #) " #
1 2
1. Suppose W = Span and b = .
0 5
(" #) " #
1 2
2. Suppose W = Span and b = .
1 5
3. Let W = Span{ [2; 2; 1], [1; 1; 4] } and b = [1; 1; 1].
4. Let W = Span{ [2; 2; 1], [1; 1; 4] } and b = [1; 1; 0].
5. Let A = [2 4 2 1; 2 5 7 3; 3 7 8 6] ∼ [1 0 −9 0; 0 1 5 0; 0 0 0 1] (reduced row echelon form). Suppose W = Col(A) and b = [1; 2; 1].
Note:
proj_{W⊥} v = v − proj_W v
If v ∈ W, then proj_W v = v.
If v ∈ W⊥, then proj_W v = 0.
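These facts are easy to sanity-check numerically. A small sketch, using an illustrative orthogonal basis of my own choosing (not from the worksheet):

import numpy as np

# W = Span{v1, v2} with an orthogonal basis (illustrative vectors).
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])

def proj_W(x):
    # Special Method 1 for this particular W
    return (np.dot(x, v1) / np.dot(v1, v1)) * v1 + (np.dot(x, v2) / np.dot(v2, v2)) * v2

x = np.array([3.0, 1.0, 4.0])
print(x - proj_W(x))                       # the projection of x onto W⊥: [0., 0., 4.]
print(proj_W(v1 + 2.0 * v2))               # a vector already in W projects to itself
print(proj_W(np.array([0.0, 0.0, 7.0])))   # a vector in W⊥ projects to the zero vector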
Examples: For each item, (a) verify that the given basis is orthogonal, (b) find the orthogonal projection of
y onto W, and (c) find the distance from y to W.
1. Let y = [1; 2; 3] and W = Span{u1, u2}, where u1 = [2; 5; −1] and u2 = [−2; 1; 1].
2. Let y = [9; 1; 6] and W = Span{u1, u2}, where u1 = [7; 1; 4] and u2 = [1; 1; 2].
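A numpy sketch of the three parts, shown here with the vectors listed in Example 1 above (any example's data can be substituted):

import numpy as np

# Data as in Example 1 above; replace with the data of the example at hand.
u1 = np.array([2.0, 5.0, -1.0])
u2 = np.array([-2.0, 1.0, 1.0])
y  = np.array([1.0, 2.0, 3.0])

print(np.dot(u1, u2))     # (a) equals 0 when {u1, u2} is orthogonal

y_hat = (np.dot(y, u1) / np.dot(u1, u1)) * u1 + (np.dot(y, u2) / np.dot(u2, u2)) * u2
print(y_hat)              # (b) the orthogonal projection of y onto W

print(np.linalg.norm(y - y_hat))   # (c) the distance from y to W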
Math 114: Linear Algebra
Gram-Schmidt Orthogonalization Process
Input: basis {u1 , . . . , uk } for W
Output: orthogonal basis {v1 , . . . , vk } for W .
Step 1: Let v1 = u1 and W1 = Span{v1}.
Step 2: Let v2 = u2 − proj_{W1} u2 and W2 = Span{v1, v2}.
⋮
Step i: Let vi = ui − proj_{W_{i−1}} ui and Wi = Span{v1, . . . , vi}.
⋮
Step k: Let vk = uk − proj_{W_{k−1}} uk.
Optional: To make the set orthonormal, divide each vector vi by its norm ||vi||. It is easier to normalize the
whole set after orthogonalizing than to normalize each vi as it is found in the algorithm above.
Note: If {u1, . . . , uk} is merely a spanning set for W (not linearly independent) to begin with, this process will
produce the zero vector for some of the vi. You can still obtain an orthogonal basis by removing these zero
vectors from {v1, . . . , vk}.
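A minimal numpy sketch of the process above (it also drops numerically zero vectors, as in the note); the function name gram_schmidt is my own, and the demonstration uses the two vectors of Example 1 below.

import numpy as np

def gram_schmidt(vectors, tol=1e-12):
    # Orthogonalize the input vectors with the process above; vectors that come out
    # numerically zero (this happens when the input is only a spanning set) are dropped.
    basis = []
    for u in vectors:
        v = np.array(u, dtype=float)
        for w in basis:
            v -= (np.dot(v, w) / np.dot(w, w)) * w   # remove the component of v along each earlier basis vector
        if np.linalg.norm(v) > tol:
            basis.append(v)
    return basis

# The two vectors of Example 1 below:
print(gram_schmidt([[3, 6, 0], [1, 2, 2]]))   # [array([3., 6., 0.]), array([0., 0., 2.])]
# For an orthonormal basis, divide each returned vector by its norm afterwards.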
Examples: Find an orthogonal basis for the following subspaces.
1. W = Span{ [3; 6; 0], [1; 2; 2] }
2. W = Span{ [1; 1; 1; 1], [0; 1; 1; 1], [0; 0; 1; 1] }
3. W = Nul([1 0 −9 0; 0 1 5 0; 0 0 0 1])
4. W = Nul([1 2 0 1 3; 0 0 1 2 2; 0 0 0 0 0])
5. W = Col([2 4 2 1; 2 5 7 3; 3 7 8 6])
The QR Factorization of a Matrix
If A is an m n matrix with linearly independent columns, then A can be written as A = QR, where Q is an m n
matrix whose columns form an orthonormal basis for Col(A) and R is an n n upper triangular invertible matrix
with positive entries on its diagonals.
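Note: once Q is known, R can be read off as R = Q^T A, because the columns of Q are orthonormal: Q^T A = Q^T (QR) = (Q^T Q) R = I R = R. In practice this is how R is typically computed after Gram-Schmidt (with normalization) produces the columns of Q.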
Examples:
1. A = [3 1; 6 2; 0 2] = QR, where Q = [1/√5 0; 2/√5 0; 0 1] and R = [3√5 √5; 0 2].
2. A = [1 0 0; 1 1 0; 1 1 1; 1 1 1] = QR, where Q = [1/2 −3/√12 0; 1/2 1/√12 −2/√6; 1/2 1/√12 1/√6; 1/2 1/√12 1/√6] and R = [2 3/2 1; 0 3/√12 2/√12; 0 0 2/√6].
Note: In Matlab, type [Q,R]=qr(A) to obtain the matrices Q and R of this decomposition.
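For a numpy counterpart, here is a sketch applied to the matrix of Example 1 above (numpy's qr may flip the signs of some columns of Q and rows of R relative to the factorization written above):

import numpy as np

A = np.array([[3.0, 1.0],
              [6.0, 2.0],
              [0.0, 2.0]])        # the matrix of Example 1 above

Q, R = np.linalg.qr(A)            # "reduced" QR: Q is m x n, R is n x n
print(Q)
print(R)
print(np.allclose(Q @ R, A))              # True: the factors reproduce A
print(np.allclose(Q.T @ Q, np.eye(2)))    # True: Q has orthonormal columns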