
MTH 102: Linear Algebra

Department of Mathematics and Statistics, Indian Institute of Technology Kanpur

Problem Set 6

Problems marked (T) are for discussions in Tutorial sessions.

1. Find the eigenvalues and corresponding eigenvectors of the matrices given below.

   (a)  1  1        (b)  -1   2   2
        4  1              2   2   2
                         -3  -6  -6

Solution:

(a) (1 − λ)^2 − 4 = 0 ⇒ (λ − 3)(λ + 1) = 0 ⇒ λ1 = 3, λ2 = −1. Also, v1 = [1 2]^T, v2 = [−1 2]^T.


(b) λ1 = 0, λ2 = −2, λ3 = −3 and v1 = [0 −1 1]^T, v2 = [−2 1 0]^T, v3 = [−1 0 1]^T.
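These answers are easy to sanity-check numerically; a small sketch with NumPy, using the two matrices from the problem statement:

```python
import numpy as np

# Problem 1(a): check the claimed eigenpairs of [[1, 1], [4, 1]].
A = np.array([[1.0, 1.0],
              [4.0, 1.0]])
for lam, v in [(3.0, np.array([1.0, 2.0])),
               (-1.0, np.array([-1.0, 2.0]))]:
    assert np.allclose(A @ v, lam * v)   # A v = lambda v

# Problem 1(b): the 3x3 matrix should have eigenvalues 0, -2, -3.
B = np.array([[-1.0,  2.0,  2.0],
              [ 2.0,  2.0,  2.0],
              [-3.0, -6.0, -6.0]])
eigs_B = np.sort(np.linalg.eigvals(B).real)
```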
 
2. (T) Let A =
    1  0  1
    0  1 -1
    1 -1  0  .
   Verify that x^T = (c, c, 0) describes the eigenspace for λ = 1.

(a) The above eigenspace is the null space of what matrix constructed from A?
Solution: The eigenvectors for λ = 1 form the null space of A − I.
(b) Find the other two eigenvalues of A and two corresponding eigenvectors.
Solution: A has trace 2 and determinant −2, so the two eigenvalues after λ1 = 1 add to 1 and multiply to −2. Those are λ2 = 2 and λ3 = −1. Corresponding eigenvectors are:

v2 = [1 −1 1]^T,  v3 = [1 −1 −2]^T.

(c) The diagonalization A = SΛS −1 has a specially nice form because A = At . Is S orthogonal?
If not, can we make it orthogonal?
Solution: S need not be orthogonal, but it can be made orthogonal. As A is symmetric, eigenvectors corresponding to distinct eigenvalues are orthogonal, and eigenvectors corresponding to the same eigenvalue can be made orthogonal using the Gram-Schmidt orthogonalisation process. In this problem, the orthogonal matrix equals Q, where

Q =
    1/√2   1/√3   1/√6
    1/√2  -1/√3  -1/√6
     0     1/√3  -2/√6  .
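As a sketch, one can confirm numerically that Q is orthogonal and that it diagonalizes A:

```python
import numpy as np

A = np.array([[1.0,  0.0,  1.0],
              [0.0,  1.0, -1.0],
              [1.0, -1.0,  0.0]])
s2, s3, s6 = np.sqrt(2.0), np.sqrt(3.0), np.sqrt(6.0)
# Columns of Q: normalized eigenvectors for lambda = 1, 2, -1 respectively.
Q = np.array([[1/s2,  1/s3,  1/s6],
              [1/s2, -1/s3, -1/s6],
              [0.0,   1/s3, -2/s6]])
assert np.allclose(Q.T @ Q, np.eye(3))     # Q^T Q = I: Q is orthogonal
Lam = Q.T @ A @ Q                          # should be diag(1, 2, -1)
```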

3. Let A be an n × n invertible matrix. Show that the eigenvalues of A^{-1} are the reciprocals of the eigenvalues of A; moreover, A and A^{-1} have the same eigenvectors.

Solution: Ax = λx ⇒ x = λA^{-1}x ⇒ A^{-1}x = (1/λ)x. (Note that λ ≠ 0, since A invertible implies det(A) ≠ 0.)
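A quick numerical illustration, reusing the matrix from problem 1(a) (an assumption of convenience; any invertible matrix works):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [4.0, 1.0]])          # invertible, eigenvalues 3 and -1
lam, V = np.linalg.eig(A)
Ainv = np.linalg.inv(A)
# Eigenvalues of A^{-1} are the reciprocals of those of A ...
assert np.allclose(np.sort(np.linalg.eigvals(Ainv)), np.sort(1.0 / lam))
# ... and the eigenvectors are shared: A^{-1} v = (1/lambda) v.
for i in range(2):
    assert np.allclose(Ainv @ V[:, i], V[:, i] / lam[i])
```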

4. Let A be an n×n matrix and α be a scalar. Find the eigenvalues of A−αI in terms of eigenvalues
of A. Further show that A and A − αI have the same eigenvectors.

Solution: If λ is an eigenvalue of A − αI with eigenvector v, then

Av = (A − αI)v + αv = (λ + α)v.

Thus, A and A − αI have the same eigenvectors, and the eigenvalues of A − αI are µ − α where µ is an eigenvalue of A.
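The shift property can be checked the same way; a minimal sketch with an arbitrary α (the matrix is again borrowed from problem 1(a)):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [4.0, 1.0]])          # eigenvalues 3 and -1
alpha = 2.0
mu, V = np.linalg.eig(A)
B = A - alpha * np.eye(2)
# Eigenvalues shift by alpha, eigenvectors are unchanged.
assert np.allclose(np.sort(np.linalg.eigvals(B)), np.sort(mu - alpha))
for i in range(2):
    assert np.allclose(B @ V[:, i], (mu[i] - alpha) * V[:, i])
```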

5. (T) Let A be an n × n matrix. Show that AT and A have the same eigenvalues. Do they have
the same eigenvectors?

Solution: Follows directly from det(A − λI) = det((A − λI)^T) = det(A^T − λI). The eigenvectors are not the same in general. Here is a counterexample:

A =
    0  0
    1  0  .

Here (0, 1)^T is an eigenvector of A but not of A^T.
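A short check of the counterexample: the eigenvalues agree, but the eigenspaces of A and A^T differ:

```python
import numpy as np

A = np.array([[0.0, 0.0],
              [1.0, 0.0]])
# Same eigenvalues (both 0, since det(A - tI) = det(A^T - tI)) ...
assert np.allclose(np.linalg.eigvals(A), np.linalg.eigvals(A.T))
# ... but different eigenvectors: A kills e2 while A^T kills e1.
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
assert np.allclose(A @ e2, 0) and not np.allclose(A @ e1, 0)
assert np.allclose(A.T @ e1, 0) and not np.allclose(A.T @ e2, 0)
```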

6. Let A be an n × n matrix. Show that:

(a) If A is idempotent (A2 = A) then eigenvalues of A are either 0 or 1.


Solution: Let Av = λv. Then λv = Av = A2 v = λ2 v ⇒ λ(λ − 1)v = 0. Result follows.
(b) If A is nilpotent (Am = 0 for some m ≥ 1) then all eigenvalues of A are 0.
Solution: Let Av = λv. Then Am v = λm v. Now, Am = 0 ⇒ λm = 0 ⇒ λ = 0.
(c) If A^* = A, then the eigenvalues are all real.
Solution: Let (λ, x) be an eigenpair. Then

λ x^*x = x^*(Ax) = x^*A^*x = (Ax)^*x = (λx)^*x = λ̄ x^*x.

Since x ≠ 0 we have x^*x > 0, so λ = λ̄ and the eigenvalues are real.


(d) If A^* = −A, then the eigenvalues are either zero or purely imaginary.
Solution: Proceed as in the above problem.
(e) Let A be a unitary matrix (AA^* = I = A^*A). Then the eigenvalues of A have absolute value 1. It follows that if A is real orthogonal, then the eigenvalues of A have absolute value 1. Give an example to show that the conclusion may be false if we allow complex orthogonal matrices.
Solution: Let (λ, x) be an eigenpair of A. Then

‖x‖² = x^*x = x^*(A^*A)x = (x^*A^*)(Ax) = (Ax)^*(Ax) = (λx)^*(λx) = λ̄λ x^*x = |λ|² ‖x‖².

So |λ|² = 1. For a counterexample, take

A =
    √2   i
    -i  √2  .
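The counterexample can be verified directly: this A is complex orthogonal (A A^T = I) but its eigenvalues √2 ± 1 do not have absolute value 1. A sketch:

```python
import numpy as np

s2 = np.sqrt(2.0)
A = np.array([[ s2,  1j],
              [-1j,  s2]])
assert np.allclose(A @ A.T, np.eye(2))     # complex orthogonal: A A^T = I
mods = np.sort(np.abs(np.linalg.eigvals(A)))
# |lambda| = sqrt(2) - 1 and sqrt(2) + 1, neither of which is 1.
assert np.allclose(mods, [s2 - 1.0, s2 + 1.0])
```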

7. (T) Suppose that A is 5 × 5 with A^15 = 0. Show that there exists a unitary matrix U such that U^*AU is upper triangular with diagonal entries 0. Further, show that A^5 = 0.

Solution: By Schur's theorem there exists a unitary U such that U^*AU = T, upper triangular with diag(T) = {λ1, . . . , λ5}. Hence T^15 has diagonal entries λ1^15, . . . , λ5^15. As 0 = U^*A^15 U = T^15, we see that λi^15 = 0, so λi = 0 for all i. As each eigenvalue of A is 0, the characteristic polynomial is pA(x) = x^5. So, by the Cayley-Hamilton theorem, A^5 = 0.
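The triangular half of the argument is easy to see numerically: a strictly upper triangular 5 × 5 matrix (diagonal entries 0, as in the conclusion above) always satisfies T^5 = 0, even when T^4 ≠ 0. The particular T below is an arbitrary illustration:

```python
import numpy as np

# Strictly upper triangular 5x5: all eigenvalues are 0.
T = np.triu(np.arange(1.0, 26.0).reshape(5, 5), k=1)
assert np.allclose(np.diag(T), 0)
# Each multiplication pushes the nonzero band one diagonal higher,
# so T^4 may be nonzero but T^5 must vanish.
assert not np.allclose(np.linalg.matrix_power(T, 4), 0)
assert np.allclose(np.linalg.matrix_power(T, 5), 0)
```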
   
8. The matrix
    1  2
    0  1
is NOT diagonalizable (its only eigenvalue is 1, with a one-dimensional eigenspace), whereas
    1  1
    0  2
is diagonalizable (its eigenvalues 1 and 2 are distinct).

9. Show that Hermitian, Skew-Hermitian and unitary matrices are normal.

10. Suppose that A = A^*. Show that rank A = number of nonzero eigenvalues of A. Is this true for every square matrix? Is this true for every square symmetric complex matrix?

Solution: By the spectral theorem, there exists a unitary U such that U^*AU = D, diagonal. Since U is invertible, rank A = rank(U^*AU) = rank(D) = number of nonzero entries of D = number of nonzero eigenvalues of A.

NOT true for every square matrix: for
A =
    0  1
    0  0
we have rank(A) = 1, whereas both eigenvalues are 0.

NOT true for a general complex symmetric matrix: consider
A =
    1   i
    i  -1  .
Here rank A = 1, whereas both eigenvalues are 0 (as det A = 0, tr A = 0).
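Both counterexamples check out numerically; a sketch:

```python
import numpy as np

# Non-Hermitian: rank 1 but both eigenvalues 0.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
assert np.linalg.matrix_rank(A) == 1
assert np.allclose(np.linalg.eigvals(A), 0)

# Complex symmetric but not Hermitian: again rank 1, both eigenvalues 0.
C = np.array([[1.0, 1.0j],
              [1.0j, -1.0]])
assert np.allclose(C, C.T) and not np.allclose(C, C.conj().T)
assert np.linalg.matrix_rank(C) == 1
# Looser tolerance: C is defective, so computed eigenvalues carry
# larger-than-usual rounding error.
assert np.allclose(np.linalg.eigvals(C), 0, atol=1e-6)
```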

11. (T) Let S = {u1, . . . , uk} ⊂ R^n. If S is linearly independent, show that the matrix A = Σ_{j=1}^k u_j u_j^T has 0 as an eigenvalue of multiplicity n − k. Show that Rank(A) = k.

Solution: Let {w1, . . . , w_{n−k}} be a basis of S^⊥. Then verify that Aw_i = 0 for 1 ≤ i ≤ n − k. Conversely, if w is any vector with Aw = 0, then 0 = Aw = Σ_{j=1}^k (u_j^T w) u_j. As S is linearly independent, u_j^T w = 0 for j = 1, 2, . . . , k. Hence w ∈ S^⊥, and the required result follows.
 
12. (T) Let A =
     2 -1  0
    -1  2  0
     2  2  3  .
Find S such that S^{-1}AS is diagonal. Also, compute A^6.

Solution: det(A − λI) = (1 − λ)(3 − λ)^2, so the eigenvalues are 1 and 3. The eigenspaces (null spaces of A − λI) are E1 = {x : Ax = x} = {(x1, x2, x3) : x2 = x1, x3 = −2x1, x1 ∈ R} = LS({(1, 1, −2)}) and E3 = {(x1, −x1, x3) : x1, x3 ∈ R} = LS({(1, −1, 0), (0, 0, 1)}). Clearly, {(1, 1, −2), (1, −1, 0), (0, 0, 1)} are linearly independent, so A is diagonalizable with A = SDS^{-1}, where

S =
     1  1  0
     1 -1  0
    -2  0  1

and D = diag(1, 3, 3). Hence A^6 = S diag(1, 3^6, 3^6) S^{-1}.
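The diagonalization gives a cheap way to compute A^6; a quick check against direct matrix powering:

```python
import numpy as np

A = np.array([[ 2.0, -1.0, 0.0],
              [-1.0,  2.0, 0.0],
              [ 2.0,  2.0, 3.0]])
S = np.array([[ 1.0,  1.0, 0.0],   # columns: eigenvectors for 1, 3, 3
              [ 1.0, -1.0, 0.0],
              [-2.0,  0.0, 1.0]])
D = np.diag([1.0, 3.0, 3.0])
assert np.allclose(A, S @ D @ np.linalg.inv(S))        # A = S D S^{-1}
A6 = S @ np.diag([1.0, 3.0**6, 3.0**6]) @ np.linalg.inv(S)
assert np.allclose(A6, np.linalg.matrix_power(A, 6))
```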

13. Consider the 3 × 3 matrix

A =
    a  b  c
    1  d  e
    0  1  f  .

Determine the entries a, b, c, d, e, f so that:
• the top left 1 × 1 block has eigenvalue 2;
• the top left 2 × 2 block has eigenvalues 3 and −3;
• the top left 3 × 3 block has eigenvalues 0, 1 and −2.

Solution: Let Ai denote the top left i × i block of A. The matrix A1 is the matrix [a]. Since a
is the only eigenvalue of this matrix, we conclude that a = 2.
 
We now determine the entries of

A2 =
    2  b
    1  d  .

Since the sum of the eigenvalues of A2 is 0 by hypothesis, and it is also equal to the trace of A2, we obtain 2 + d = 0, or d = −2. Moreover, the product of the eigenvalues of A2 is −9 by hypothesis, and it is equal to the determinant of A2. Thus we have

−9 = 2d − b = −4 − b

and we deduce that b = 5, so

A2 =
    2  5
    1 -2  .
Finally, consider A = A3. Again, the sum of the eigenvalues of A is −1 and it is also equal to the trace of A. We deduce that f = −1. We still need to determine the entries c and e of

A =
    2  5  c
    1 -2  e
    0  1 -1  .
The characteristic polynomial of this matrix is

−λ³ − λ² + (e + 9)λ + c − 2e + 9.

We know that the roots of this polynomial must be 0, 1 and −2. Setting λ = 0 and λ = 1, we obtain

c − 2e + 9 = 0
−1 − 1 + (e + 9) + c − 2e + 9 = 0

which is equivalent to

c − 2e = −9
c − e = −16.

Subtracting the second equation from the first gives −e = 7, so e = −7 and c = −16 + e = −23, and we conclude

A =
    2  5 -23
    1 -2  -7
    0  1  -1  .

(As a check, the characteristic polynomial is then −λ³ − λ² + 2λ = −λ(λ − 1)(λ + 2), with roots 0, 1 and −2, as required.)
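Solving the pair c − 2e = −9, c − e = −16 gives e = −7 and c = −23; a numerical check that the three nested blocks then have the required eigenvalues:

```python
import numpy as np

A = np.array([[2.0,  5.0, -23.0],
              [1.0, -2.0,  -7.0],
              [0.0,  1.0,  -1.0]])
assert A[0, 0] == 2.0                          # 1x1 block: eigenvalue 2
# 2x2 block: eigenvalues 3 and -3.
assert np.allclose(np.sort(np.linalg.eigvals(A[:2, :2]).real), [-3.0, 3.0])
# Full matrix: eigenvalues 0, 1 and -2.
assert np.allclose(np.sort(np.linalg.eigvals(A).real), [-2.0, 0.0, 1.0])
```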

14. NOT for mid-sem or end-sem

(a) Find the eigenvalues and eigenvectors (depending on c) of

A =
    0.3    c
    0.7  1 - c  .

For which value of c is the matrix A not diagonalizable (so A = SΛS^{-1} is impossible)?
Solution: The eigenvalues are λ = 1 and λ = 0.3 − c. The eigenvector for λ = 1 is in the null space of

A − I =
    -0.7   c        ⇒  x1 = (c, 0.7)^T.
     0.7  -c

Similarly, the eigenvector for λ = 0.3 − c is in the null space of

A − (0.3 − c)I =
     c    c         ⇒  x2 = (1, -1)^T.
    0.7  0.7

A is not diagonalizable when its eigenvalues are equal: 1 = 0.3 − c, that is, c = −0.7.
(b) What is the largest range of values of c (real number) so that An approaches a limiting
matrix A∞ as n → ∞?
Solution: A^n = SΛ^n S^{-1} = S diag(1, (0.3 − c)^n) S^{-1}. This approaches a limit if |0.3 − c| < 1, that is, −0.7 < c < 1.3.
(c) What is the limit of A^n (still depending on c)? You could work from A = SΛS^{-1} to find A^n.
Solution: The eigenvectors form the columns of S. As n → ∞, the smaller eigenvalue power (0.3 − c)^n goes to zero, leaving

A^∞ = S diag(1, 0) S^{-1} = (1/(c + 0.7)) ×
     c    c
    0.7  0.7  .
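The limit formula can be spot-checked for any admissible c; the value 0.5 below is an arbitrary choice inside (−0.7, 1.3):

```python
import numpy as np

c = 0.5
A = np.array([[0.3, c],
              [0.7, 1.0 - c]])
A_inf = np.array([[c,   c],
                  [0.7, 0.7]]) / (c + 0.7)
# For large n, A^n is indistinguishable from the limit, since the
# second eigenvalue 0.3 - c = -0.2 satisfies |0.3 - c| < 1.
assert np.allclose(np.linalg.matrix_power(A, 200), A_inf)
```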
