
Linear Algebra Proofs

SI Leader: Mr. L. Khanyile

02 June 2025
1. In a vector space V, show that k · 0 = 0 for all scalars k.
Solution:

k · 0 = k · (0 + 0)
= k · 0 + k · 0

Adding −(k · 0) to both sides gives 0 = k · 0, so k · 0 = 0.

2. Prove that the set S of all n × n matrices A such that A^T = −A is a subspace of Mn.
Solution:
Let A, B ∈ S, so that A^T = −A and B^T = −B, and let c ∈ R. The zero matrix satisfies 0^T = 0 = −0, so 0 ∈ S and S is nonempty. Then:

(A + B)^T = A^T + B^T = −A − B = −(A + B)
(cA)^T = cA^T = −cA

So S is closed under addition and scalar multiplication. Therefore, S is a subspace of Mn.
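As an illustration only (not part of the proof), the following NumPy check verifies the two closure computations for a pair of concrete skew-symmetric matrices chosen for the example:

import numpy as np

# Two example skew-symmetric matrices (arbitrary choices, not from the text)
A = np.array([[0.0, 2.0], [-2.0, 0.0]])   # satisfies A^T = -A
B = np.array([[0.0, -5.0], [5.0, 0.0]])   # satisfies B^T = -B
c = 3.0

# Closure under addition: (A + B)^T = -(A + B)
assert np.allclose((A + B).T, -(A + B))
# Closure under scalar multiplication: (cA)^T = -cA
assert np.allclose((c * A).T, -(c * A))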

3. Prove that if S = {v1, v2, v3} is linearly dependent and v4 is any vector in V not in S, then S′ = {v1, v2, v3, v4} is also linearly dependent.
Solution:
Since S is linearly dependent, there exist scalars a1 , a2 , a3 , not all zero, such that

a1 v1 + a2 v2 + a3 v3 = 0.

Then
a1 v1 + a2 v2 + a3 v3 + 0 · v4 = 0,
with not all scalars zero, so S ′ is linearly dependent.
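A small numerical illustration (the vectors below are arbitrary examples): linear dependence shows up as the matrix whose columns are the vectors having rank strictly less than the number of columns, and appending v4 cannot raise the rank to full.

import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + v2                       # makes {v1, v2, v3} linearly dependent
v4 = np.array([0.0, 0.0, 1.0])     # any extra vector

S = np.column_stack([v1, v2, v3])
S_prime = np.column_stack([v1, v2, v3, v4])

assert np.linalg.matrix_rank(S) < S.shape[1]               # S is dependent
assert np.linalg.matrix_rank(S_prime) < S_prime.shape[1]   # S' is still dependent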

4. Let A be an n × n matrix that is diagonalizable. Prove that A has n linearly independent eigenvectors.
Solution:
Since A is diagonalizable, there exists an invertible matrix P such that:

P^{-1}AP = D

where D is diagonal with entries λ1 , . . . , λn . Then:

AP = P D

Let the columns of P be p1, . . . , pn, so:
A pi = λi pi for 1 ≤ i ≤ n

Since P is invertible, its columns are linearly independent. Therefore, A has n linearly independent eigen-
vectors.
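For a concrete diagonalizable matrix (the matrix below is an assumed example), the argument can be checked numerically: np.linalg.eig returns an eigenvector matrix P with AP = PD, and its n columns are linearly independent.

import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])                 # diagonalizable: distinct eigenvalues 2 and 3
eigvals, P = np.linalg.eig(A)              # columns of P are eigenvectors
D = np.diag(eigvals)

assert np.allclose(A @ P, P @ D)                    # AP = PD
assert np.linalg.matrix_rank(P) == A.shape[0]       # P invertible: n independent eigenvectors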

5. Let T : V → W be a linear transformation. Prove that the range of T is a subspace of W.
Solution:
The range of T is defined as Range(T ) = {T (v) : v ∈ V } ⊆ W .
(1) Zero vector: Since T is linear,
T (0V ) = 0W ∈ Range(T ).

(2) Closed under addition and scalar multiplication: Let w1 = T (v1 ) and w2 = T (v2 ) for v1 , v2 ∈ V ,
and let c ∈ R. Then
w1 + cw2 = T (v1 ) + cT (v2 ) = T (v1 + cv2 ) ∈ Range(T ).

Since the range of T contains the zero vector and is closed under addition and scalar multiplication, it is a subspace of W.

6. In a vector space V, show that the zero element is unique.


Solution:
Suppose 0 and 0′ are both zero elements. Since 0′ is a zero element, 0 + 0′ = 0; since 0 is a zero element, 0 + 0′ = 0′. Hence

0 = 0 + 0′ = 0′

Therefore, the zero element is unique.

7. Prove that the set S = {A ∈ Mn | AB = BA}, for a fixed B ∈ Mn, is a subspace of Mn.
Solution:
The zero matrix satisfies 0 · B = 0 = B · 0, so 0 ∈ S.
Let A1 , A2 ∈ S and c ∈ R. Then:

(A1 + A2 )B = A1 B + A2 B = BA1 + BA2 = B(A1 + A2 ),

(cA1 )B = c(A1 B) = c(BA1 ) = B(cA1 ).

So A1 + A2 ∈ S and cA1 ∈ S, showing closure under addition and scalar multiplication.


Hence, S is a subspace of Mn .
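A quick numerical check for one fixed B (all matrices below are example choices): if A1 and A2 commute with B, then so do A1 + A2 and cA1.

import numpy as np

B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
A1 = np.array([[2.0, 3.0],
               [0.0, 2.0]])                # equals 2I + 3(B - I), so A1 commutes with B
A2 = np.eye(2)                             # the identity commutes with every B
c = -4.0

assert np.allclose(A1 @ B, B @ A1) and np.allclose(A2 @ B, B @ A2)
assert np.allclose((A1 + A2) @ B, B @ (A1 + A2))      # closed under addition
assert np.allclose((c * A1) @ B, B @ (c * A1))        # closed under scalar multiplication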

8. Let A be an n × n matrix and S = {v1 , . . . , vk } a set of eigenvectors
corresponding to distinct eigenvalues λ1 , . . . , λk . Prove that S is
linearly independent.
Solution:
Assume for contradiction that {v1, . . . , vk} is linearly dependent. Since eigenvectors are nonzero, {v1} is linearly independent, so there is a largest index r, with 1 ≤ r < k, such that {v1, . . . , vr} is linearly independent. Then {v1, . . . , vr+1} is linearly dependent, so there exist scalars c1, . . . , cr+1, not all zero, with:

c1 v1 + · · · + cr+1 vr+1 = 0 (1)

Apply A to both sides:


A(c1 v1 + · · · + cr+1 vr+1 ) = A(0) = 0
A(c1 v1 + · · · + cr+1 vr+1 ) = c1 Av1 + · · · + cr+1 Avr+1 = 0
Since each vi is an eigenvector of A with eigenvalue λi , this becomes:

c1 λ1 v1 + · · · + cr+1 λr+1 vr+1 = 0 (2)

Multiply (1) by λr+1 and subtract (2):

c1 (λr+1 − λ1 )v1 + · · · + cr (λr+1 − λr )vr = 0 (3)

Since {v1, . . . , vr} is linearly independent and λi ≠ λr+1 for all 1 ≤ i ≤ r, it follows that:

c1 = · · · = cr = 0

Substitute into (1):


cr+1 vr+1 = 0 ⇒ cr+1 = 0

This contradicts the assumption that not all ci are zero. Therefore, {v1 , . . . , vk } is linearly independent.
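Numerically (with an example matrix whose eigenvalues are distinct), linear independence of the eigenvectors appears as the eigenvector matrix having full column rank:

import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])            # distinct eigenvalues 2, 3, 5
eigvals, V = np.linalg.eig(A)              # columns of V are eigenvectors

assert len(np.unique(np.round(eigvals, 8))) == A.shape[0]   # eigenvalues are distinct
assert np.linalg.matrix_rank(V) == V.shape[1]               # eigenvectors are linearly independent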
