Homework 2 Solutions
2.A.3. Find a number t such that the list (3, 1, 4), (2, −3, 5), (5, 9, t) is not linearly independent in R3.
Proof. To find t for which the list (3, 1, 4), (2, −3, 5), (5, 9, t) is not linearly independent, we can write the last vector (5, 9, t) as a
linear combination of the first two vectors (3, 1, 4), (2, −3, 5) as follows:
(5, 9, t) = a1 (3, 1, 4) + a2 (2, −3, 5)
for some a1, a2 ∈ F and solve for t. To do this, we can rewrite the above equation as
(5, 9, t) = (3a1 + 2a2, a1 − 3a2, 4a1 + 5a2 ),
from which we can equate the coordinates of both sides to obtain the system of equations
5 = 3a1 + 2a2,
9 = a1 − 3a2,
t = 4a1 + 5a2 .
The first two of the three equations in the above system can be solved simultaneously to get a1 = 3, a2 = −2. Substituting these values
into the third equation, we get t = 4(3) + 5(−2) = 2.
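As a quick check of the arithmetic: with a1 = 3, a2 = −2, and t = 2, we have
3(3, 1, 4) + (−2)(2, −3, 5) = (9 − 4, 3 + 6, 12 − 10) = (5, 9, 2),
so (5, 9, 2) is indeed a linear combination of (3, 1, 4) and (2, −3, 5), and the list is not linearly independent when t = 2.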
2.A.5. (a) Show that if we think of C as a vector space over R, then the list (1 + i, 1 − i) is linearly independent.
Proof. If we think of C as a vector space over R, then the scalars are real numbers. Suppose a1, a2 ∈ R satisfy
a1 (1 + i) + a2 (1 − i) = 0 + 0i.
Then using the definition of addition on C for the left-hand side of the above equation, we get
(a1 + a2 ) + (a1 − a2 )i = 0 + 0i,
from which we can equate the real and imaginary parts of both sides to obtain the system of equations
a1 + a2 = 0,
a1 − a2 = 0.
The only pair of solutions in R to this system of equations is a1 = 0, a2 = 0. Therefore, the list (1 + i, 1 − i) is linearly
independent.
(b) Show that if we think of C as a vector space over C, then the list (1 + i, 1 − i) is linearly dependent.
Proof. If we think of C as a vector space over C, then the scalars are complex numbers. Choose c1 = i, c2 = 1 ∈ C. Then c1, c2
satisfy
c1 (1 + i) + c2 (1 − i) = i(1 + i) + 1(1 − i)
= (i + i 2 ) + (1 − i)
= (i − 1) + (1 − i)
= 0.
Since c1 = i, c2 = 1 are nonzero scalars, we conclude that the list (1 + i, 1 − i) is linearly dependent.
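Equivalently, we can see the dependence directly from a scalar-multiple relation over C: since (−i)(1 + i) = −i − i2 = −i + 1 = 1 − i, the second vector 1 − i is a scalar multiple of the first vector 1 + i, and a list of two vectors in which one is a scalar multiple of the other is linearly dependent.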
2.A.6. Suppose v1, v2, v3, v4 is linearly independent in V. Prove that the list
v1 − v2, v2 − v3, v3 − v4, v4
is also linearly independent.
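One standard argument, sketched briefly: suppose a1, a2, a3, a4 ∈ F satisfy
a1 (v1 − v2 ) + a2 (v2 − v3 ) + a3 (v3 − v4 ) + a4 v4 = 0.
Regrouping by v1, v2, v3, v4 , this becomes
a1 v1 + (a2 − a1 )v2 + (a3 − a2 )v3 + (a4 − a3 )v4 = 0.
Since v1, v2, v3, v4 is linearly independent, we get a1 = 0, a2 − a1 = 0, a3 − a2 = 0, a4 − a3 = 0, and solving these equations in order gives a1 = a2 = a3 = a4 = 0. Therefore, the list v1 − v2, v2 − v3, v3 − v4, v4 is linearly independent.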
2.A.7. Prove or give a counterexample: If v1, v2, v3, . . . , vm is a linearly independent list of vectors in V, then
5v1 − 4v2, v2, v3, . . . , vm
is linearly independent.
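One standard argument, sketched briefly, shows that the statement is true: suppose a1, a2, . . . , am ∈ F satisfy
a1 (5v1 − 4v2 ) + a2 v2 + a3 v3 + · · · + am vm = 0.
Regrouping by v1, . . . , vm , this becomes
5a1 v1 + (a2 − 4a1 )v2 + a3 v3 + · · · + am vm = 0.
Since v1, . . . , vm is linearly independent, we get 5a1 = 0, a2 − 4a1 = 0, a3 = 0, . . . , am = 0, and so a1 = a2 = · · · = am = 0. Therefore, the list 5v1 − 4v2, v2, v3, . . . , vm is linearly independent.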
2.A.8. Prove or give a counterexample: If v1, . . . , vm is a linearly independent list of vectors in V and λ ∈ F with λ ≠ 0, then
λv1, . . . , λvm is linearly independent.
Proof. The statement is true. Suppose a1, . . . , am ∈ F satisfy
a1 (λv1 ) + · · · + am (λvm ) = 0.
Rewriting the left-hand side, we get
(a1 λ)v1 + · · · + (am λ)vm = 0.
Since v1, . . . , vm is linearly independent, all scalars are zero, which means we have
a1 λ = 0, . . . , am λ = 0.
Since λ ≠ 0, dividing each equation by λ gives a1 = 0, . . . , am = 0, and so λv1, . . . , λvm is linearly independent.
2.A.9. Prove or give a counterexample: If v1, . . . , vm and w1, . . . , wm are also linearly independent lists of vectors in V, then v1 +
w1, . . . , vm + wm is linearly independent.
Proof. We will give a counterexample to show that this statement is false. Let m = 2, let V = R2 , let v1 = (1, 0), v2 = (0, 1) be
a list of vectors in R2 , and let w1 = −v1 = (−1, 0) and w2 = −v2 = (0, −1). Suppose a1, a2 ∈ R satisfy
a1 v1 + a2 v2 = (0, 0)
and suppose b1, b2 ∈ R satisfy
b1 w1 + b2 w2 = (0, 0).
Then we have
(0, 0) = a1 v1 + a2 v2
= a1 (1, 0) + a2 (0, 1)
= (a1, 0) + (0, a2 )
= (a1, a2 ),
(0, 0) = b1 w1 + b2 w2
= b1 (−1, 0) + b2 (0, −1)
= (−b1, 0) + (0, −b2 )
= (−b1, −b2 ),
from which we get a1 = 0, a2 = 0 and −b1 = 0, −b2 = 0, or equivalently b1 = 0, b2 = 0. So both lists v1, v2 and w1, w2 are
linearly independent. However, if we choose c1 = 1, c2 = 1, then we have
c1 (v1 + w1 ) + c2 (v2 + w2 ) = 1(0, 0) + 1(0, 0) = (0, 0),
even though c1, c2 are nonzero scalars. Therefore, the list v1 + w1, v2 + w2 is linearly dependent, and so the statement is false.
2.A.10. Suppose v1, . . . , vm is linearly independent in V and w ∈ V. Prove that if v1 + w, . . . , vm + w is linearly dependent, then
w ∈ span(v1, . . . , vm ).
Proof. Suppose v1 + w, . . . , vm + w is linearly dependent. Then there exist scalars a1, . . . , am , not all zero, that satisfy
a1 (v1 + w) + · · · + am (vm + w) = 0.
We can algebraically rearrange the terms in the left-hand side of the above equation to get
a1 v1 + · · · + am vm + (a1 + · · · + am )w = 0.
We claim that we must have a1 + · · · + am ≠ 0. To prove this claim, suppose by contradiction
that we have a1 + · · · + am = 0. Then the last equation reduces to
a1 v1 + · · · + am vm = 0.
According to the premises, v1, . . . , vm is linearly independent in V, which means all the scalars are zero:
a1 = 0, . . . , am = 0,
which contradicts the fact that not all of the scalars a1, . . . , am are zero. This proves our claim, and so
we have a1 + · · · + am ≠ 0. Therefore, we can use the above equation
a1 v1 + · · · + am vm + (a1 + · · · + am )w = 0.
to obtain
w = −(a1 /(a1 + · · · + am )) v1 − · · · − (am /(a1 + · · · + am )) vm .
Since we have −a1 /(a1 + · · · + am ), . . . , −am /(a1 + · · · + am ) ∈ F, we conclude w is a linear combination of the vectors v1, . . . , vm , and so we
have w ∈ span(v1, . . . , vm ).
2.A.11. Suppose v1, . . . , vm is linearly independent in V and w ∈ V. Show that v1, . . . , vm, w is linearly independent if and only if
w ∉ span(v1, . . . , vm ).
Proof. Forward direction: If v1, . . . , vm, w is linearly independent, then w ∉ span(v1, . . . , vm ). Suppose v1, . . . , vm, w is linearly
independent, and suppose by contradiction that we have w ∈ span(v1, . . . , vm ). Then there exist a1, . . . , am ∈ F that satisfy
w = a1 v1 + · · · + am vm .
Subtracting w from both sides, we obtain
a1 v1 + · · · + am vm + (−1)w = 0,
in which the coefficient of w is −1 ≠ 0, so not all of the scalars are zero. This means v1, . . . , vm, w is linearly dependent, which
contradicts our assumption that it is linearly independent. Therefore, we have w ∉ span(v1, . . . , vm ).
Reverse direction: If w ∉ span(v1, . . . , vm ), then v1, . . . , vm, w is linearly independent. Suppose w ∉ span(v1, . . . , vm ), and
suppose by contradiction that v1, . . . , vm, w is linearly dependent. Then there exist scalars a1, . . . , am, b ∈ F, not all zero, that satisfy
a1 v1 + · · · + am vm + bw = 0.
At this point, we will continue our argument by breaking into two separate cases: b = 0 and b ≠ 0.
• Case 1: Suppose b = 0. Then the equation
a1 v1 + · · · + am vm + bw = 0
reduces to
a1 v1 + · · · + am vm = 0.
Since we assumed in the premises that v1, . . . , vm is linearly independent in V, all the scalars are zero:
a1 = 0, . . . , am = 0.
Combined with b = 0, this means all of the scalars satisfy
a1 = 0, . . . , am = 0, b = 0,
which contradicts the fact that the scalars a1, . . . , am, b are not all zero.
• Case 2: Suppose b ≠ 0. Then we can solve the equation
a1 v1 + · · · + am vm + bw = 0
to get
w = −(a1 /b) v1 − · · · − (am /b) vm .
Since we have −a1 /b, . . . , −am /b ∈ F, we conclude that w is a linear combination of the vectors v1, . . . , vm , and so we have
w ∈ span(v1, . . . , vm ). But this contradicts our assumption w ∉ span(v1, . . . , vm ).
Therefore, in either case of b = 0 or b ≠ 0, we achieve a contradiction, and so we conclude that v1, . . . , vm, w is linearly
independent.
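For a concrete illustration of this equivalence in R3 : the list e1 = (1, 0, 0), e2 = (0, 1, 0) is linearly independent. If we take w = (1, 1, 0) ∈ span(e1, e2 ), then the list e1, e2, w is linearly dependent, since w = e1 + e2 ; if instead we take w = (0, 0, 1) ∉ span(e1, e2 ), then the list e1, e2, w is linearly independent.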
2.B.3. (a) Suppose U = {(x1, x2, x3, x4, x5 ) ∈ R5 : x1 = 3x2 and x3 = 7x4 }. Find a basis of U.
Proof. Let (x1, x2, x3, x4, x5 ) ∈ U be arbitrary. Then we have x1 = 3x2 and x3 = 7x4 , and so we can write
(x1, x2, x3, x4, x5 ) = (3x2, x2, 7x4, x4, x5 ) = x2 (3, 1, 0, 0, 0) + x4 (0, 0, 7, 1, 0) + x5 (0, 0, 0, 0, 1).
Since we have x2, x4, x5 ∈ R, we have established that the list (3, 1, 0, 0, 0), (0, 0, 7, 1, 0), (0, 0, 0, 0, 1) spans U. If we can
also show that the list is linearly independent in U, then it would in fact be a basis of U. Suppose a1, a2, a3 ∈ R
satisfy
a1 (3, 1, 0, 0, 0) + a2 (0, 0, 7, 1, 0) + a3 (0, 0, 0, 0, 1) = (0, 0, 0, 0, 0).
Applying addition and scalar multiplication in R5 to the left-hand side of the above equation, we get
(3a1, a1, 7a2, a2, a3 ) = (0, 0, 0, 0, 0),
from which we can equate the second, fourth, and fifth coordinates of both sides to obtain
a1 = 0, a2 = 0, a3 = 0,
and so the list (3, 1, 0, 0, 0), (0, 0, 7, 1, 0), (0, 0, 0, 0, 1) is linearly independent in U. So this list is a basis of U.
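As a quick check that these three vectors actually lie in U: for (3, 1, 0, 0, 0) we have x1 = 3 = 3x2 and x3 = 0 = 7x4 ; for (0, 0, 7, 1, 0) we have x1 = 0 = 3x2 and x3 = 7 = 7x4 ; and for (0, 0, 0, 0, 1) we have x1 = 0 = 3x2 and x3 = 0 = 7x4 . So each vector satisfies the two defining conditions of U.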
(b) Extend the basis in part (a) to a basis of R5 .
Proof. Adjoin the vectors (0, 1, 0, 0, 0), (0, 0, 1, 0, 0) to the basis (3, 1, 0, 0, 0), (0, 0, 7, 1, 0), (0, 0, 0, 0, 1) of U in order to
form the list (3, 1, 0, 0, 0), (0, 0, 7, 1, 0), (0, 0, 0, 0, 1), (0, 1, 0, 0, 0), (0, 0, 1, 0, 0) in R5 . We need to show that this resulting list
is in fact a basis of R5 . First, we show that this list is linearly independent. Suppose a1, a2, a3, a4, a5 ∈ R satisfy
a1 (3, 1, 0, 0, 0) + a2 (0, 0, 7, 1, 0) + a3 (0, 0, 0, 0, 1) + a4 (0, 1, 0, 0, 0) + a5 (0, 0, 1, 0, 0) = (0, 0, 0, 0, 0).
Applying addition and scalar multiplication in R5 to the left-hand side of the above equation, we get
(3a1, a1 + a4, 7a2 + a5, a2, a3 ) = (0, 0, 0, 0, 0),
from which we can equate coordinates to obtain
3a1 = 0, a1 + a4 = 0, 7a2 + a5 = 0, a2 = 0, a3 = 0.
The first equation 3a1 = 0 implies a1 = 0, the second equation with a1 = 0 implies a4 = 0, and the third equation
7a2 + a5 = 0 with a2 = 0 implies a5 = 0. Therefore, we have
a1 = 0, a2 = 0, a3 = 0, a4 = 0, a5 = 0,
and so the list (3, 1, 0, 0, 0), (0, 0, 7, 1, 0), (0, 0, 0, 0, 1), (0, 1, 0, 0, 0), (0, 0, 1, 0, 0) is linearly independent in R5 . Furthermore,
since (3, 1, 0, 0, 0), (0, 0, 7, 1, 0), (0, 0, 0, 0, 1), (0, 1, 0, 0, 0), (0, 0, 1, 0, 0) has length 5 and we have dim R5 = 5, it is of the
right length, which means, by 2.39 of Axler, this list is a basis of R5 .
(c) Find a subspace W of R5 such that R5 = U ⊕ W.
Proof. Following the proof of 2.34 of Axler, let W = span((0, 1, 0, 0, 0), (0, 0, 1, 0, 0)). Then, by 2.7 of Axler, W is a sub-
space of R5 . To prove R5 = U ⊕W, we need to show R5 = U +W and U ∩W = {(0, 0, 0, 0, 0)}, according to 1.45 of Axler.
To prove R5 = U + W, let (x1, x2, x3, x4, x5 ) ∈ R5 be a vector. We need to show that (x1, x2, x3, x4, x5 ) is a sum of a vector
in U and a vector in W. By part (b), we have the basis (3, 1, 0, 0, 0), (0, 0, 7, 1, 0), (0, 0, 0, 0, 1), (0, 1, 0, 0, 0), (0, 0, 1, 0, 0) of
R5 , which means it is a list that spans R5 . So there exist a1, a2, a3, b1, b2 ∈ R that satisfy
(x1, x2, x3, x4, x5 ) = a1 (3, 1, 0, 0, 0) + a2 (0, 0, 7, 1, 0) + a3 (0, 0, 0, 0, 1) + b1 (0, 1, 0, 0, 0) + b2 (0, 0, 1, 0, 0) = u + w,
where
u = a1 (3, 1, 0, 0, 0) + a2 (0, 0, 7, 1, 0) + a3 (0, 0, 0, 0, 1)
and
w = b1 (0, 1, 0, 0, 0) + b2 (0, 0, 1, 0, 0).
Since we have (3, 1, 0, 0, 0), (0, 0, 7, 1, 0), (0, 0, 0, 0, 1) ∈ U and, as subspaces, U and W are closed under addition and scalar multiplication,
we have u = a1 (3, 1, 0, 0, 0) + a2 (0, 0, 7, 1, 0) + a3 (0, 0, 0, 0, 1) ∈ U and w = b1 (0, 1, 0, 0, 0) + b2 (0, 0, 1, 0, 0) ∈ W. So
we conclude (x1, x2, x3, x4, x5 ) ∈ U + W, and so R5 ⊂ U + W. However, according to 1.39 of Axler, U + W is a
subspace of R5 . So we must have the set equality R5 = U + W. We are now left to prove U ∩ W = {(0, 0, 0, 0, 0)}.
Suppose we have (x1, x2, x3, x4, x5 ) ∈ U ∩ W. Then we have (x1, x2, x3, x4, x5 ) ∈ U and (x1, x2, x3, x4, x5 ) ∈ W. According
to our proof of part (a), the list (3, 1, 0, 0, 0), (0, 0, 7, 1, 0), (0, 0, 0, 0, 1) is a basis of U, and so it spans U. Since we
originally let W = span((0, 1, 0, 0, 0), (0, 0, 1, 0, 0)), by this construction the list (0, 1, 0, 0, 0), (0, 0, 1, 0, 0) spans W. So
(x1, x2, x3, x4, x5 ) ∈ R5 can be written as a linear combination of the vectors in the two lists. In other words, there exist
scalars a1, a2, a3, b1, b2 ∈ R that satisfy
(x1, x2, x3, x4, x5 ) = a1 (3, 1, 0, 0, 0) + a2 (0, 0, 7, 1, 0) + a3 (0, 0, 0, 0, 1)
and
(x1, x2, x3, x4, x5 ) = b1 (0, 1, 0, 0, 0) + b2 (0, 0, 1, 0, 0).
Equating the two equations, we get
a1 (3, 1, 0, 0, 0) + a2 (0, 0, 7, 1, 0) + a3 (0, 0, 0, 0, 1) − b1 (0, 1, 0, 0, 0) − b2 (0, 0, 1, 0, 0) = (0, 0, 0, 0, 0).
Since, according to our proof of part (b), the list (3, 1, 0, 0, 0), (0, 0, 7, 1, 0), (0, 0, 0, 0, 1), (0, 1, 0, 0, 0), (0, 0, 1, 0, 0) is a basis
of R5 , it is linearly independent in R5 . So all the scalars are zero; that is, we have
a1 = 0, a2 = 0, a3 = 0, b1 = 0, b2 = 0.
Therefore, we have
(x1, x2, x3, x4, x5 ) = 0(3, 1, 0, 0, 0) + 0(0, 0, 7, 1, 0) + 0(0, 0, 0, 0, 1) = (0, 0, 0, 0, 0).
Therefore, we have U ∩ W ⊂ {(0, 0, 0, 0, 0)}. As subspaces, U and W contain the zero vector; that is, we have
(0, 0, 0, 0, 0) ∈ U and (0, 0, 0, 0, 0) ∈ W. So we have (0, 0, 0, 0, 0) ∈ U ∩ W, and so {(0, 0, 0, 0, 0)} ⊂ U ∩ W. There-
fore, we obtain the set equality U ∩ W = {(0, 0, 0, 0, 0)}. So we established R5 = U + W and U ∩ W = {(0, 0, 0, 0, 0)}. By
1.45 of Axler, we conclude R5 = U ⊕ W.
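For an explicit illustration of this decomposition, any (x1, x2, x3, x4, x5 ) ∈ R5 can be written as u + w with
u = (x1, x1 /3, 7x4, x4, x5 ) ∈ U and w = (0, x2 − x1 /3, x3 − 7x4, 0, 0) ∈ W,
since u satisfies the two defining conditions of U and w lies in span((0, 1, 0, 0, 0), (0, 0, 1, 0, 0)).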
2.B.4. (a) Suppose U = {(z1, z2, z3, z4, z5 ) ∈ C5 : 6z1 = z2 and z3 + 2z4 + z5 = 0}. Find a basis of U.
Proof. Let (z1, z2, z3, z4, z5 ) ∈ U be arbitrary. Then we have 6z1 = z2 and z3 + 2z4 + z5 = 0, and so we can write
(z1, z2, z3, z4, z5 ) = (z1, 6z1, −2z4 − z5, z4, z5 ) = z1 (1, 6, 0, 0, 0) + z4 (0, 0, −2, 1, 0) + z5 (0, 0, −1, 0, 1).
Since we have z1, z4, z5 ∈ C, we have established that the list (1, 6, 0, 0, 0), (0, 0, −2, 1, 0), (0, 0, −1, 0, 1) spans U. If we can
also show that the list is linearly independent in U, then it would in fact be a basis of U. Suppose a1, a2, a3 ∈ C
satisfy
a1 (1, 6, 0, 0, 0) + a2 (0, 0, −2, 1, 0) + a3 (0, 0, −1, 0, 1) = (0, 0, 0, 0, 0).
Applying addition and scalar multiplication in C5 to the left-hand side of the above equation, we get
(a1, 6a1, −2a2 − a3, a2, a3 ) = (0, 0, 0, 0, 0),
from which we can equate the first, fourth, and fifth coordinates of both sides to obtain
a1 = 0, a2 = 0, a3 = 0,
and so the list (1, 6, 0, 0, 0), (0, 0, −2, 1, 0), (0, 0, −1, 0, 1) is linearly independent in U. So this list is a basis of U.
(b) Extend the basis in part (a) to a basis of C5 .
Proof. Adjoin the vectors (0, 1, 0, 0, 0), (0, 0, 1, 0, 0) to the basis (1, 6, 0, 0, 0), (0, 0, −2, 1, 0), (0, 0, −1, 0, 1) of U in order to
form the list (1, 6, 0, 0, 0), (0, 0, −2, 1, 0), (0, 0, −1, 0, 1), (0, 1, 0, 0, 0), (0, 0, 1, 0, 0) in C5 . We need to show that this resulting
list is in fact a basis of C5 . First, we show that it is linearly independent. Suppose a1, a2, a3, a4, a5 ∈ C satisfy
a1 (1, 6, 0, 0, 0) + a2 (0, 0, −2, 1, 0) + a3 (0, 0, −1, 0, 1) + a4 (0, 1, 0, 0, 0) + a5 (0, 0, 1, 0, 0) = (0, 0, 0, 0, 0).
Applying addition and scalar multiplication in C5 to the left-hand side of the above equation, we get
(a1, 6a1 + a4, −2a2 − a3 + a5, a2, a3 ) = (0, 0, 0, 0, 0),
from which we obtain
a1 = 0, 6a1 + a4 = 0, −2a2 − a3 + a5 = 0, a2 = 0, a3 = 0.
The second equation 6a1 + a4 = 0 with a1 = 0 implies a4 = 0, and the third equation −2a2 − a3 + a5 = 0 with a2 = 0 and
a3 = 0 implies a5 = 0. Therefore, we have
a1 = 0, a2 = 0, a3 = 0, a4 = 0, a5 = 0,
and so the list (1, 6, 0, 0, 0), (0, 0, −2, 1, 0), (0, 0, −1, 0, 1), (0, 1, 0, 0, 0), (0, 0, 1, 0, 0) is linearly independent in C5 . Further-
more, since (1, 6, 0, 0, 0), (0, 0, −2, 1, 0), (0, 0, −1, 0, 1), (0, 1, 0, 0, 0), (0, 0, 1, 0, 0) has length 5 and we have dim C5 = 5, it is
of the right length, which means, by 2.39 of Axler, this list is a basis of C5 .
(c) Find a subspace W of C5 such that C5 = U ⊕ W.
Proof. Following the proof of 2.34 of Axler, let W = span((0, 1, 0, 0, 0), (0, 0, 1, 0, 0)). Then, by 2.7 of Axler, W is a sub-
space of C5 . To prove C5 = U ⊕W, we need to show C5 = U +W and U ∩W = {(0, 0, 0, 0, 0)}, according to 1.45 of Axler.
To prove C5 = U + W, let (z1, z2, z3, z4, z5 ) ∈ C5 be a vector. We need to show that (z1, z2, z3, z4, z5 ) is a sum of a vector
in U and a vector in W. By part (b), we have the basis (1, 6, 0, 0, 0), (0, 0, −2, 1, 0), (0, 0, −1, 0, 1), (0, 1, 0, 0, 0), (0, 0, 1, 0, 0)
of C5 , which means it is a list that spans C5 . So there exist a1, a2, a3, b1, b2 ∈ C that satisfy
(z1, z2, z3, z4, z5 ) = a1 (1, 6, 0, 0, 0) + a2 (0, 0, −2, 1, 0) + a3 (0, 0, −1, 0, 1) + b1 (0, 1, 0, 0, 0) + b2 (0, 0, 1, 0, 0) = u + w,
where
u = a1 (1, 6, 0, 0, 0) + a2 (0, 0, −2, 1, 0) + a3 (0, 0, −1, 0, 1)
and
w = b1 (0, 1, 0, 0, 0) + b2 (0, 0, 1, 0, 0).
Since we have (1, 6, 0, 0, 0), (0, 0, −2, 1, 0), (0, 0, −1, 0, 1) ∈ U and, as subspaces, U and W are closed under addition and scalar
multiplication, we have u = a1 (1, 6, 0, 0, 0) + a2 (0, 0, −2, 1, 0) + a3 (0, 0, −1, 0, 1) ∈ U and w = b1 (0, 1, 0, 0, 0) + b2 (0, 0, 1, 0, 0) ∈ W.
So we conclude (z1, z2, z3, z4, z5 ) ∈ U + W, and so C5 ⊂ U + W. However, according to 1.39 of Axler, U + W is a
subspace of C5 . So we must have the set equality C5 = U + W. We are now left to prove U ∩ W = {(0, 0, 0, 0, 0)}.
Suppose we have (z1, z2, z3, z4, z5 ) ∈ U ∩ W. Then we have (z1, z2, z3, z4, z5 ) ∈ U and (z1, z2, z3, z4, z5 ) ∈ W. According
to our proof of part (a), the list (1, 6, 0, 0, 0), (0, 0, −2, 1, 0), (0, 0, −1, 0, 1) is a basis of U, and so it spans U. Since we
originally let W = span((0, 1, 0, 0, 0), (0, 0, 1, 0, 0)), by this construction the list (0, 1, 0, 0, 0), (0, 0, 1, 0, 0) spans W. So
(z1, z2, z3, z4, z5 ) ∈ C5 can be written as a linear combination of the vectors in the two lists. In other words, there exist
scalars a1, a2, a3, b1, b2 ∈ C that satisfy
(z1, z2, z3, z4, z5 ) = a1 (1, 6, 0, 0, 0) + a2 (0, 0, −2, 1, 0) + a3 (0, 0, −1, 0, 1)
and
(z1, z2, z3, z4, z5 ) = b1 (0, 1, 0, 0, 0) + b2 (0, 0, 1, 0, 0).
Equating the two equations, we get
a1 (1, 6, 0, 0, 0) + a2 (0, 0, −2, 1, 0) + a3 (0, 0, −1, 0, 1) − b1 (0, 1, 0, 0, 0) − b2 (0, 0, 1, 0, 0) = (0, 0, 0, 0, 0).
Since, according to our proof of part (b), the list (1, 6, 0, 0, 0), (0, 0, −2, 1, 0), (0, 0, −1, 0, 1), (0, 1, 0, 0, 0), (0, 0, 1, 0, 0) is a
basis of C5 , it is linearly independent in C5 . So all the scalars are zero; that is, we have
a1 = 0, a2 = 0, a3 = 0, b1 = 0, b2 = 0.
Therefore, we have
(z1, z2, z3, z4, z5 ) = 0(1, 6, 0, 0, 0) + 0(0, 0, −2, 1, 0) + 0(0, 0, −1, 0, 1) = (0, 0, 0, 0, 0).
Therefore, we have U ∩ W ⊂ {(0, 0, 0, 0, 0)}. As subspaces, U and W contain the zero vector; that is, we have
(0, 0, 0, 0, 0) ∈ U and (0, 0, 0, 0, 0) ∈ W. So we have (0, 0, 0, 0, 0) ∈ U ∩ W, and so {(0, 0, 0, 0, 0)} ⊂ U ∩ W. There-
fore, we obtain the set equality U ∩ W = {(0, 0, 0, 0, 0)}. So we established C5 = U + W and U ∩ W = {(0, 0, 0, 0, 0)}. By
1.45 of Axler, we conclude C5 = U ⊕ W.
2.B.6. Suppose v1, v2, v3, v4 is a basis of V. Prove that the list
v1 + v2, v2 + v3, v3 + v4, v4
is also a basis of V.
Proof. Suppose a1, a2, a3, a4 ∈ F satisfy
a1 (v1 + v2 ) + a2 (v2 + v3 ) + a3 (v3 + v4 ) + a4 v4 = 0.
Rearranging the left-hand side, we get
a1 v1 + (a1 + a2 )v2 + (a2 + a3 )v3 + (a3 + a4 )v4 = 0.
Since v1, v2, v3, v4 is linearly independent, the coefficients above are all zero, which means we have
a1 = 0, a1 + a2 = 0, a2 + a3 = 0, a3 + a4 = 0.
The second equation a1 + a2 = 0 with a1 = 0 implies a2 = 0. The third equation a2 + a3 = 0 with a2 = 0 implies a3 = 0.
The fourth equation a3 + a4 = 0 with a3 = 0 implies a4 = 0. So we have
a1 = 0, a2 = 0, a3 = 0, a4 = 0,
and so we conclude v1 + v2, v2 + v3, v3 + v4, v4 is linearly independent. Next, we need to prove that v1 + v2, v2 + v3, v3 + v4, v4
spans V. Let v ∈ V be arbitrary. Since v1, v2, v3, v4 spans V, there exist a1, a2, a3, a4 ∈ F such that
v = a1 v1 + a2 v2 + a3 v3 + a4 v4 .
So we have
v = a1 v1 + a2 v2 + a3 v3 + a4 v4
= a1 ((v1 + v2 ) − (v2 + v3 ) + (v3 + v4 ) − v4 ) + a2 ((v2 + v3 ) − (v3 + v4 ) + v4 ) + a3 ((v3 + v4 ) − v4 ) + a4 v4
= a1 (v1 + v2 ) + (−a1 + a2 )(v2 + v3 ) + (a1 − a2 + a3 )(v3 + v4 ) + (−a1 + a2 − a3 + a4 )v4 .
Since we also have a1, −a1 + a2, a1 − a2 + a3, −a1 + a2 − a3 + a4 ∈ F, it follows that the list v1 + v2, v2 + v3, v3 + v4, v4 spans
V. Therefore, v1 + v2, v2 + v3, v3 + v4, v4 is a basis of V.
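As a quick check of the substitutions used in the display above: (v1 + v2 ) − (v2 + v3 ) + (v3 + v4 ) − v4 = v1 , (v2 + v3 ) − (v3 + v4 ) + v4 = v2 , and (v3 + v4 ) − v4 = v3 , so each of v1, v2, v3, v4 is indeed a linear combination of v1 + v2, v2 + v3, v3 + v4, v4 .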
2.B.8. Suppose U and W are subspaces of V such that V = U ⊕ W. Suppose also that u1, . . . , um is a basis of U and w1, . . . , wn is a
basis of W. Prove that
u1, . . . , um, w1, . . . , wn
is a basis of V.
Proof. First, we will show that u1, . . . , um, w1, . . . , wn is linearly independent. Suppose a1, . . . , am, b1, . . . , bn ∈ F satisfy
a1 u1 + · · · + am um + b1 w1 + · · · + bn wn = 0.
Since U and W are subspaces of V, in particular they are closed under addition and scalar multiplication, which means we have a1 u1 + · · · + am um ∈ U and
b1 w1 + · · · + bn wn ∈ W. But the above equation a1 u1 + · · · + am um + b1 w1 + · · · + bn wn = 0 implies that we also have
a1 u1 + · · · + am um = −(b1 w1 + · · · + bn wn ) ∈ W
and
b1 w1 + · · · + bn wn = −(a1 u1 + · · · + am um ) ∈ U,
since again U and W are subspaces, which means in particular that they are closed under scalar multiplication as well.
Altogether, we have
a1 u1 + · · · + am um, b1 w1 + · · · + bn wn ∈ U ∩ W .
Since we also assumed V = U ⊕ W, by 1.45 of Axler we have U ∩ W = {0}. So we get
a1 u1 + · · · + am um = 0
and
b1 w1 + · · · + bn wn = 0.
Since u1, . . . , um is a basis of U, it is linearly independent in U and spans U. In other words, a1 u1 + · · · + am um = 0 implies
a1 = 0, . . . , am = 0,
and every vector u ∈ U can be written
u = a1 u1 + · · · + am um
for some a1, . . . , am ∈ F. Similarly, since w1, . . . , wn is a basis of W, it is linearly independent in W and spans W. In other
words, b1 w1 + · · · + bn wn = 0 implies
b1 = 0, . . . , bn = 0,
and every vector w ∈ W can be written
w = b1 w1 + · · · + bn wn
for some b1, . . . , bn ∈ F. Therefore, the equations a1 u1 + · · · + am um = 0 and b1 w1 + · · · + bn wn = 0 together imply
a1 = 0, . . . , am = 0, b1 = 0, . . . , bn = 0,
and so u1, . . . , um, w1, . . . , wn is linearly independent in V. Also, since V = U + W, every vector v ∈ V can be written
v =u+w
= (a1 u1 + · · · + am um ) + (b1 w1 + · · · + bn wn )
= a1 u1 + · · · + am um + b1 w1 + · · · + bn wn,
which means u1, . . . , um, w1, . . . , wn spans V. Therefore, u1, . . . , um, w1, . . . , wn is a basis of V.
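For a concrete instance: in R2 with U = {(x, 0) : x ∈ R} and W = {(0, y) : y ∈ R}, we have R2 = U ⊕ W, the list (1, 0) is a basis of U, the list (0, 1) is a basis of W, and the combined list (1, 0), (0, 1) is indeed a basis of R2 .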
2.C.1. Suppose V is finite-dimensional and U is a subspace of V such that dim U = dim V. Prove that we have U = V.
Proof. Let u1, . . . , un be a basis of U, which means we have n = dim U. This means u1, . . . , un spans U—that is, we have
span(u1, . . . , un ) = U—and is linearly independent in U. In fact, since U is a subspace of V, it is also true that u1, . . . , un is a
linearly independent list in V. Since we have dim U = dim V, it follows that we have dim V = n. So the linearly independent
list u1, . . . , un in V has length n and dim V = n. By 2.39 of Axler, u1, . . . , un is a basis of V. This means in particular that
u1, . . . , un spans V, which means we have span(u1, . . . , un ) = V. Since we also have span(u1, . . . , un ) = U, we conclude U = V, as desired.
2.C.2. Show that the subspaces of R2 are precisely {0}, R2 , and all lines in R2 through the origin.
Proof. Let U be a subspace of R2 . Since 2.37 of Axler asserts dim R2 = 2, it follows by 2.38 of Axler that dim U is one of
0, 1, 2. So we will argue by cases.
• Case 1: Suppose dim U = 0. Notice that the dimension of the trivial subspace is zero; that is, we have dim{0} = 0. Therefore,
we have dim U = dim{0}. By Exercise 2.C.1, we conclude U = {0}.
• Case 2: Suppose dim U = 1. Then the length of a basis of U is 1; in other words, a basis of U consists of a single
nonzero vector, and this vector spans all of U; every vector in U is a
scalar multiple (linear combination) of the one basis vector. The set of all such vectors in U describes a line in R2 ; in
other words, U is a line in R2 . Furthermore, as U is a subspace, in particular it contains the additive identity (0, 0) ∈ R2 .
Therefore, U must be a line in R2 that passes through the origin.
• Case 3: Suppose dim U = 2. Notice that we have dim R2 = 2. Therefore, we have dim U = dim R2 . By Exercise 2.C.1,
we must have U = R2 .
The cases complete our proof.
2.C.3. Show that the subspaces of R3 are precisely {0}, R3 , all lines in R3 through the origin, and all planes in R3 through the origin.
Proof. Let U be a subspace of R3 . Since 2.37 of Axler asserts dim R3 = 3, it follows by 2.38 of Axler that dim U is one of
0, 1, 2, 3. So we will argue by cases.
• Case 1: Suppose dim U = 0. Notice that the dimension of the trivial subspace is zero; that is, we have dim{0} = 0. Therefore,
we have dim U = dim{0}. By Exercise 2.C.1, we conclude U = {0}.
• Case 2: Suppose dim U = 1. Then the length of a basis of U is 1; in other words, a basis of U consists of a single
nonzero vector, and this vector spans all of U; every vector in U is a
scalar multiple (linear combination) of the one basis vector. The set of all such vectors in U describes a line in R3 ; in
other words, U is a line in R3 . Furthermore, as U is a subspace, in particular it contains the additive identity (0, 0, 0) ∈ R3 .
Therefore, U must be a line in R3 that passes through the origin.
• Case 3: Suppose dim U = 2. Then the length of a basis of U is 2; in other words, a basis of U consists of two linearly
independent vectors, and these two vectors span all of U; every vector in U is a linear
combination of the two basis vectors. The set of all such vectors in U describes a plane in R3 ; in other words, U is a
plane in R3 . Furthermore, as U is a subspace, in particular it contains the additive identity (0, 0, 0) ∈ R3 . Therefore, U
must be a plane in R3 that passes through the origin.
• Case 4: Suppose dim U = 3. Notice that we have dim R3 = 3. Therefore, we have dim U = dim R3 . By Exercise 2.C.1,
we must have U = R3 .
The cases complete our proof.
2.C.11. Suppose that U and W are subspaces of R8 such that dim U = 3, dim W = 5, and U + W = R8 . Prove that R8 = U ⊕ W.
Proof. We recall from 2.43 of Axler the formula for the dimension of a sum for our subspaces U and W of R8 :
dim(U + W) = dim U + dim W − dim(U ∩ W).
Since we have U + W = R8 , dim U = 3, and dim W = 5, this formula gives
8 = dim R8 = dim(U + W) = 3 + 5 − dim(U ∩ W),
and so dim(U ∩ W) = 0 = dim{0}.
By Exercise 2.C.1, we get U ∩ W = {0}. So we have U + W = R8 and U ∩ W = {0}, and so by 1.45 of Axler we conclude that
U + W is a direct sum, which means we have R8 = U + W = U ⊕ W.
2.C.12. Suppose U and W are both five-dimensional subspaces of R9 . Prove that we have U ∩ W ≠ {0}.
Proof. We recall from 2.43 of Axler the formula for the dimension of a sum for our subspaces U and W of R9 :
dim(U + W) = dim U + dim W − dim(U ∩ W).
Since U and W are both five-dimensional subspaces of R9 , we have dim U = dim W = 5. By 1.39 of Axler, U + W is a
subspace of R9 . By 2.38 of Axler, we get dim(U + W) ≤ 9. So we have
dim(U ∩ W) = dim U + dim W − dim(U + W) ≥ 5 + 5 − 9 = 1.
Suppose by contradiction that we have U ∩ W = {0}. Then we would get
1 ≤ dim(U ∩ W)
= dim{0}
= 0,
which is a contradiction. Therefore, we conclude U ∩ W ≠ {0}.
2.C.13. Suppose U and W are both 4-dimensional subspaces of C6 . Prove that there exist two vectors in U ∩ W such that neither of these
vectors is a scalar multiple of the other.
Proof. We recall from 2.43 of Axler the formula for the dimension of a sum for our subspaces U and W of C6 :
dim(U + W) = dim U + dim W − dim(U ∩ W).
Since U and W are both four-dimensional subspaces of C6 , we have dim U = dim W = 4 and, according to Exercise 1.C.10 of
Axler, U ∩ W is a subspace of C6 . Furthermore, by 2.26 of Axler, U ∩ W is finite-dimensional, and so by 2.32 of Axler there
exists a basis of U ∩ W. By 1.39 of Axler, U + W is a subspace of C6 . By 2.38 of Axler, we get dim(U + W) ≤ 6. So we have
dim(U ∩ W) = dim U + dim W − dim(U + W) ≥ 4 + 4 − 6 = 2.
Since we established dim(U ∩ W) ≥ 2, a basis of U ∩ W has length at least 2. As this basis is a linearly independent
list in U ∩ W, we can find two of the vectors in the basis, neither of which is a scalar multiple of the other.
2.C.15. Suppose V is finite-dimensional, with dim V = n ≥ 1. Prove that there exist 1-dimensional subspaces U1, . . . , Un of V such
that
V = U1 ⊕ · · · ⊕ Un .
Proof. Since V is finite-dimensional, by 2.32 of Axler there exists a basis v1, . . . , vn of V. Let v ∈ V be an arbitrary vector.
Then we can write it uniquely in the form
v = a1 v1 + · · · + an vn
for some a1, . . . , an ∈ F. Let Ui = span(vi ) for each i = 1, . . . , n. Then, since vi is a list of one vector in V, it follows by 2.7
of Axler that Ui is a subspace of V. By construction, the list vi spans Ui , which means we can write each vector in Ui of the
form ai vi ∈ Ui for some ai ∈ F. Furthermore, if ai ∈ F satisfies
ai vi = 0,
then we must have ai = 0 because vi ∈ Ui is a nonzero vector, and so the list vi is linearly independent in Ui . Therefore, vi is
a linearly independent list that spans Ui , which means vi is a basis of Ui . The length of the basis vi is 1, so we get dim Ui = 1
for all i = 1, . . . , n. So we conclude that U1, . . . , Un are 1-dimensional subspaces of V. Now, since the sum U1 + · · · + Un
consists of all possible sums of elements of U1, . . . , Un , we have v ∈ U1 + · · · + Un , and so we obtain the set containment
V ⊂ U1 + · · · + Un . However, by 1.39 of Axler, U1 + · · · + Un is a subspace of V. So we have, in fact, the set equality
V = U1 + · · · + Un . Now, we need to show that the sum U1 + · · · + Un is indeed a direct sum. Suppose we write
0 = u1 + · · · + un
with ui ∈ Ui for each i = 1, . . . , n. Since Ui = span(vi ), we can write each ui = ai vi for some ai ∈ F, and so
0 = a1 v1 + · · · + an vn .
Since the list v1, . . . , vn is a basis of V, the criterion for a basis (2.29 of Axler) asserts that the representation of 0 in the
form a1 v1 + · · · + an vn is unique, and taking a1 = · · · = an = 0 is such a representation. So each ai = 0, and hence the only
way to write the zero vector as a sum u1 + · · · + un with ui ∈ Ui is to take each ui = 0. By 1.44 of Axler, the sum of the subspaces U1, . . . , Un of V is in
fact a direct sum; that is, we have U1 + · · · + Un = U1 ⊕ · · · ⊕ Un . Therefore, we conclude V = U1 ⊕ · · · ⊕ Un .
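For a concrete instance: if V = R3 with the standard basis e1, e2, e3 , then the construction above gives U1 = span(e1 ), U2 = span(e2 ), U3 = span(e3 ), and indeed R3 = U1 ⊕ U2 ⊕ U3 .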
2.C.16. Suppose U1, . . . , Um are finite-dimensional subspaces of V such that U1 + · · · + Um is a direct sum. Prove that U1 ⊕ · · · ⊕ Um
is finite-dimensional and
dim(U1 ⊕ · · · ⊕ Um ) = dim U1 + · · · + dim Um .
Proof. Since U1 + · · · + Um is a direct sum, we can write U1 + · · · + Um = U1 ⊕ · · · ⊕ Um . We will use induction to prove the
statement
dim(U1 ⊕ · · · ⊕ Um ) = dim U1 + · · · + dim Um
for all positive integers m. The statement for m = 1 holds trivially, so we take m = 2 as the base step.
• Base step: The statement for m = 2 is
dim(U1 ⊕ U2 ) = dim U1 + dim U2,
which we will need to prove. Since U1 + U2 is a direct sum, by 1.45 of Axler, we get U1 ∩ U2 = {0}. Taking dimensions,
we get dim(U1 ∩ U2 ) = dim{0} = 0. Using the formula for the dimension of a sum (2.43 of Axler), we have
dim(U1 ⊕ U2 ) = dim(U1 + U2 )
= dim U1 + dim U2 − dim(U1 ∩ U2 )
= dim U1 + dim U2 − 0
= dim U1 + dim U2 .
• Induction step: Suppose the statement holds true for m = k for some integer k ≥ 2; that is, suppose that
dim(U1 ⊕ · · · ⊕ Uk ) = dim U1 + · · · + dim Uk
whenever U1 + · · · + Uk is a direct sum. We will prove that the statement holds true for m = k + 1. Using our result for the
base step with two subspaces and our assumption for the induction step, we have
dim(U1 ⊕ · · · ⊕ Uk ⊕ Uk+1 ) = dim((U1 ⊕ · · · ⊕ Uk ) ⊕ Uk+1 )
= dim(U1 ⊕ · · · ⊕ Uk ) + dim Uk+1
= dim U1 + · · · + dim Uk + dim Uk+1 ,
which is the statement for m = k + 1. This completes the induction. Note also that U1 ⊕ · · · ⊕ Um is spanned by the union of
bases of U1, . . . , Um , which is a finite list, and so U1 ⊕ · · · ⊕ Um is finite-dimensional.
2.C.17. Prove or give a counterexample: if U1, U2, U3 are subspaces of a finite-dimensional vector space, then
dim(U1 + U2 + U3 ) = dim U1 + dim U2 + dim U3 − dim(U1 ∩ U2 ) − dim(U1 ∩ U3 ) − dim(U2 ∩ U3 ) + dim(U1 ∩ U2 ∩ U3 ).
Proof. We will give a counterexample to show that this statement is false. Let V = R2 be a vector space, and consider its
subspaces U1 = {(x1, 0) ∈ R2 : x1 ∈ R}, U2 = {(x1, x1 ) ∈ R2 : x1 ∈ R}, and U3 = {(0, x2 ) ∈ R2 : x2 ∈ R}. Then we have the
intersections U1 ∩ U2 = U1 ∩ U3 = U2 ∩ U3 = U1 ∩ U2 ∩ U3 = {(0, 0)} and the sum
U1 + U2 + U3 = {(x1, 0) + (y1, y1 ) + (0, x2 ) ∈ R2 : (x1, 0) ∈ U1, (y1, y1 ) ∈ U2, (0, x2 ) ∈ U3 }
= {(x1 + y1, y1 + x2 ) ∈ R2 : x1, y1, x2 ∈ R}
= R2 ,
where the last equality holds because taking y1 = 0 already produces every vector (x1, x2 ) ∈ R2 .
Taking dimensions, we get dim U1 = dim U2 = dim U3 = 1, dim(U1 ∩ U2 ) = dim(U1 ∩ U3 ) = dim(U2 ∩ U3 ) = dim(U1 ∩ U2 ∩ U3 ) = dim{(0, 0)} = 0, and
dim(U1 + U2 + U3 ) = dim R2 = 2. If the above “equation” for dim(U1 + U2 + U3 ) is true, then we would get
2 = dim(U1 + U2 + U3 )
= dim U1 + dim U2 + dim U3 − dim(U1 ∩ U2 ) − dim(U1 ∩ U3 ) − dim(U2 ∩ U3 ) + dim(U1 ∩ U2 ∩ U3 )
= 1 + 1 + 1 − 0 − 0 − 0 + 0
= 3,
which is not a true statement, since 2 ≠ 3. So the “equation” generally does not hold true.