
Families of Commuting Normal Matrices

Definition M.1 (Notation)

i) $\mathbb{C}^n=\big\{\,v=(v_1,\cdots,v_n)\ \big|\ v_i\in\mathbb{C}\text{ for all }1\le i\le n\,\big\}$

ii) If $\alpha\in\mathbb{C}$ and $v=(v_1,\cdots,v_n),\ w=(w_1,\cdots,w_n)\in\mathbb{C}^n$, then
$$\alpha v=(\alpha v_1,\cdots,\alpha v_n)\in\mathbb{C}^n$$
$$v+w=(v_1+w_1,\cdots,v_n+w_n)\in\mathbb{C}^n$$
$$\langle v,w\rangle=\sum_{j=1}^n v_j\,\overline{w_j}\in\mathbb{C}$$
The bar means complex conjugate.

iii) Two vectors $v,w\in\mathbb{C}^n$ are said to be orthogonal (or perpendicular, denoted $v\perp w$) if $\langle v,w\rangle=0$.

iv) If $v\in\mathbb{C}^n$ and $A$ is the $m\times n$ matrix whose $(i,j)$ matrix element is $A_{i,j}$, then $Av$ is the vector in $\mathbb{C}^m$ with
$$(Av)_i=\sum_{j=1}^n A_{i,j}v_j\quad\text{for all }1\le i\le m$$

v) A linear subspace $V$ of $\mathbb{C}^n$ is a subset of $\mathbb{C}^n$ that is closed under addition and scalar multiplication. That is, if $\alpha\in\mathbb{C}$ and $v,w\in V$, then $\alpha v,\ v+w\in V$.

vi) If $V$ is a subset of $\mathbb{C}^n$, then its orthogonal complement is
$$V^\perp=\big\{\,v\in\mathbb{C}^n\ \big|\ v\perp w\text{ for all }w\in V\,\big\}$$
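As a quick illustration of ii) and iii), take $n=2$, $v=(1,i)$ and $w=(i,1)$. Then
$$\langle v,w\rangle=v_1\overline{w_1}+v_2\overline{w_2}=1\cdot\overline{i}+i\cdot\overline{1}=-i+i=0$$
so $v\perp w$, even though neither vector has any zero entries.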
Problem M.1 Let $V\subset\mathbb{C}^n$. Prove that $V^\perp$ is a linear subspace of $\mathbb{C}^n$.
Lemma M.2 Let $V$ be a linear subspace of $\mathbb{C}^n$ of dimension at least one. Let $A$ be an $n\times n$ matrix that maps $V$ into $V$. Then $A$ has an eigenvector in $V$.
Proof: Let $e_1,\cdots,e_d$ be a basis for $V$. As $A$ maps $V$ into itself, there exist numbers $a_{i,j}$, $1\le i,j\le d$, such that
$$Ae_j=\sum_{i=1}^d a_{i,j}e_i\quad\text{for all }1\le j\le d$$
Consequently, $A$ maps the vector $w=\sum_{j=1}^d x_je_j\in V$ to
$$Aw=\sum_{i,j=1}^d a_{i,j}x_je_i$$
so that $w$ is an eigenvector of $A$ of eigenvalue $\lambda$ if and only if (1) not all of the $x_i$'s are zero and (2)
$$\begin{aligned}
&\sum_{i,j=1}^d a_{i,j}x_je_i=\lambda\sum_{i=1}^d x_ie_i\\
\iff\ &\sum_{j=1}^d a_{i,j}x_j=\lambda x_i &&\text{for all }1\le i\le d\\
\iff\ &\sum_{j=1}^d\big(a_{i,j}-\lambda\delta_{i,j}\big)x_j=0 &&\text{for all }1\le i\le d
\end{aligned}$$
For any given $\lambda$, the linear system of equations $\sum_{j=1}^d\big(a_{i,j}-\lambda\delta_{i,j}\big)x_j=0$ for all $1\le i\le d$ has a nontrivial solution $(x_1,\cdots,x_d)$ if and only if the $d\times d$ matrix $\big[a_{i,j}-\lambda\delta_{i,j}\big]_{1\le i,j\le d}$ fails to be invertible and this, in turn, is the case if and only if $\det\big[a_{i,j}-\lambda\delta_{i,j}\big]=0$. But $\det\big[a_{i,j}-\lambda\delta_{i,j}\big]$ is a polynomial of degree $d$ in $\lambda$ and so, by the fundamental theorem of algebra, always vanishes for at least one value of $\lambda$.
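For example, with $d=2$ and $A$ acting on the basis by $Ae_1=e_2$, $Ae_2=e_1$ (so $a_{1,1}=a_{2,2}=0$ and $a_{1,2}=a_{2,1}=1$),
$$\det\begin{bmatrix}-\lambda&1\\1&-\lambda\end{bmatrix}=\lambda^2-1$$
which vanishes for $\lambda=\pm1$; the corresponding eigenvectors in $V$ are $e_1+e_2$ and $e_1-e_2$.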
Definition M.3 (Commuting) Two $n\times n$ matrices $A$ and $B$ are said to commute if $AB=BA$.
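For instance, any two diagonal $n\times n$ matrices commute, and every matrix commutes with all of its own powers; on the other hand,
$$\begin{bmatrix}0&1\\0&0\end{bmatrix}\begin{bmatrix}0&0\\1&0\end{bmatrix}=\begin{bmatrix}1&0\\0&0\end{bmatrix}\ne\begin{bmatrix}0&0\\0&1\end{bmatrix}=\begin{bmatrix}0&0\\1&0\end{bmatrix}\begin{bmatrix}0&1\\0&0\end{bmatrix}$$
so these two matrices do not commute.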
Lemma M.4 Let $n\ge1$ be an integer, $V$ be a linear subspace of $\mathbb{C}^n$ of dimension at least one and let $\mathcal{F}$ be a nonempty set of $n\times n$ mutually commuting matrices that map $V$ into $V$. That is, $A,B\in\mathcal{F}\Rightarrow AB=BA$ and $A\in\mathcal{F},\ w\in V\Rightarrow Aw\in V$. Then there exists a nonzero vector $v\in V$ that is an eigenvector for every matrix in $\mathcal{F}$.
Proof: We shall show that

There is a linear subspace $W$ of $V$ of dimension at least one, such that each $A\in\mathcal{F}$ is a multiple of the identity matrix when restricted to $W$.

This suffices to prove the lemma. The proof will be by induction on the dimension $d$ of $V$. If $d=1$, we may take $W=V$, since the restriction of any matrix to a one dimensional vector space is a multiple of the identity.

Suppose that the claim has been proven for all dimensions strictly less than $d$. If every $A\in\mathcal{F}$ is a multiple of the identity, when restricted to $V$, we may take $W=V$ and we are done. If not, pick any $A\in\mathcal{F}$ that is not a multiple of the identity when restricted to $V$. By Lemma M.2, it has at least one eigenvector $v\in V$. Let $\lambda$ be the corresponding eigenvalue and set
$$V_\lambda=V\cap\big\{\,w\in\mathbb{C}^n\ \big|\ Aw=\lambda w\,\big\}$$
Then $V_\lambda$ is a linear subspace of $V$ of dimension strictly less than $d$ (since $A$, restricted to $V$, is not $\lambda\mathbb{1}$). We claim that every $B\in\mathcal{F}$ maps $V_\lambda$ into $V_\lambda$. To see this, let $B\in\mathcal{F}$ and $w\in V_\lambda$ and set $w'=Bw$. We wish to show that $w'\in V_\lambda$. But
$$\begin{aligned}
Aw'&=ABw=BAw &&\text{($A$ and $B$ commute)}\\
&=B\lambda w &&\text{(definition of $V_\lambda$)}\\
&=\lambda Bw=\lambda w'
\end{aligned}$$
so $w'$ is indeed in $V_\lambda$. We have verified that $V_\lambda$ has dimension at least one and strictly smaller than $d$ and that every $B\in\mathcal{F}$ maps $V_\lambda$ into $V_\lambda$. So we may apply the inductive hypothesis with $V$ replaced by $V_\lambda$.
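As a concrete illustration, the matrices $A=\begin{bmatrix}0&1\\1&0\end{bmatrix}$ and $B=\begin{bmatrix}2&3\\3&2\end{bmatrix}$ commute (both products equal $\begin{bmatrix}3&2\\2&3\end{bmatrix}$), and $v=\begin{bmatrix}1\\1\end{bmatrix}$ is a common eigenvector: $Av=v$ and $Bv=5v$. The lemma guarantees only a common eigenvector; the corresponding eigenvalues may differ from matrix to matrix.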
Definition M.5 (Adjoint) The adjoint of the $r\times c$ matrix $A$ is the $c\times r$ matrix $A^*$ with
$$A^*_{i,j}=\overline{A_{j,i}}$$
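For example, the $2\times3$ matrix below has the $3\times2$ adjoint obtained by transposing and conjugating every entry:
$$A=\begin{bmatrix}1&i&0\\2&1-i&3i\end{bmatrix}\qquad\Longrightarrow\qquad A^*=\begin{bmatrix}1&2\\-i&1+i\\0&-3i\end{bmatrix}$$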
Problem M.2 Let $A$ and $B$ be any $n\times n$ matrices. Prove that $B=A^*$ if and only if $\langle Bv,w\rangle=\langle v,Aw\rangle$ for all $v,w\in\mathbb{C}^n$.
Problem M.3 Let $A$ be any $n\times n$ matrix. Let $V$ be any linear subspace of $\mathbb{C}^n$ and $V^\perp$ its orthogonal complement. Prove that if $AV\subset V$ (i.e. $w\in V\Rightarrow Aw\in V$), then $A^*V^\perp\subset V^\perp$.
Definition M.6 (Normal, Self-Adjoint, Unitary)

i) An $n\times n$ matrix $A$ is normal if $AA^*=A^*A$. That is, if $A$ commutes with its adjoint.

ii) An $n\times n$ matrix $A$ is self-adjoint if $A=A^*$.

iii) An $n\times n$ matrix $U$ is unitary if $UU^*=\mathbb{1}$. Here $\mathbb{1}$ is the $n\times n$ identity matrix. Its $(i,j)$ matrix element is one if $i=j$ and zero otherwise.
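For example, $\begin{bmatrix}1&2\\2&3\end{bmatrix}$ is self-adjoint, $\frac{1}{\sqrt2}\begin{bmatrix}1&-1\\1&1\end{bmatrix}$ is unitary, and $\begin{bmatrix}1+i&0\\0&2\end{bmatrix}$ is normal but neither self-adjoint nor unitary. In contrast, $\begin{bmatrix}0&1\\0&0\end{bmatrix}$ is not normal, since $AA^*=\begin{bmatrix}1&0\\0&0\end{bmatrix}$ while $A^*A=\begin{bmatrix}0&0\\0&1\end{bmatrix}$.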
Problem M.4 Let $A$ be a normal matrix. Let $\lambda$ be an eigenvalue of $A$ and $V$ the eigenspace of $A$ of eigenvalue $\lambda$. Prove that $V$ is the eigenspace of $A^*$ of eigenvalue $\overline{\lambda}$.

Problem M.5 Let $A$ be a normal matrix. Let $v$ and $w$ be eigenvectors of $A$ with different eigenvalues. Prove that $v\perp w$.
Problem M.6 Let $A$ be a self-adjoint matrix. Prove that

a) $A$ is normal

b) Every eigenvalue of $A$ is real.

Problem M.7 Let $U$ be a unitary matrix. Prove that

a) $U$ is normal

b) Every eigenvalue $\lambda$ of $U$ obeys $|\lambda|=1$, i.e. is of modulus one.
Theorem M.7 Let $n\ge1$ be an integer. Let $\mathcal{F}$ be a nonempty set of $n\times n$ mutually commuting normal matrices. That is, $A,B\in\mathcal{F}\Rightarrow AB=BA$ and $A\in\mathcal{F}\Rightarrow AA^*=A^*A$. Then there exists an orthonormal basis $\{e_1,\cdots,e_n\}$ of $\mathbb{C}^n$ such that $e_j$ is an eigenvector of $A$ for every $A\in\mathcal{F}$ and $1\le j\le n$.
Proof: By Lemma M.4, with $V=\mathbb{C}^n$, there exists a nonzero vector $v_1$ that is an eigenvector for every $A\in\mathcal{F}$. Set $e_1=\frac{v_1}{\|v_1\|}$ and $V_1=\big\{\,\alpha e_1\ \big|\ \alpha\in\mathbb{C}\,\big\}$. By Problem M.4, $e_1$ is also an eigenvector of $A^*$ for every $A\in\mathcal{F}$, so $A^*V_1\subset V_1$ for all $A\in\mathcal{F}$. By Problem M.3, $AV_1^\perp\subset V_1^\perp$ for all $A\in\mathcal{F}$.

By Lemma M.4, with $V=V_1^\perp$, there exists a nonzero vector $v_2\in V_1^\perp$ that is an eigenvector for every $A\in\mathcal{F}$. Choose $e_2=\frac{v_2}{\|v_2\|}$. As $e_2\in V_1^\perp$, $e_2$ is orthogonal to $e_1$. Define $V_2=\big\{\,\alpha_1e_1+\alpha_2e_2\ \big|\ \alpha_1,\alpha_2\in\mathbb{C}\,\big\}$. By Problem M.4, $e_2$ is also an eigenvector of $A^*$ for every $A\in\mathcal{F}$, so $A^*V_2\subset V_2$ for all $A\in\mathcal{F}$. By Problem M.3, $AV_2^\perp\subset V_2^\perp$ for all $A\in\mathcal{F}$.

By Lemma M.4, with $V=V_2^\perp$, there exists a nonzero vector $v_3\in V_2^\perp$ that is an eigenvector for every $A\in\mathcal{F}$. Choose $e_3=\frac{v_3}{\|v_3\|}$. As $e_3\in V_2^\perp$, $e_3$ is orthogonal to both $e_1$ and $e_2$. And so on.
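For instance, for the commuting normal family $\mathcal{F}=\Big\{\begin{bmatrix}0&1\\1&0\end{bmatrix},\begin{bmatrix}2&3\\3&2\end{bmatrix}\Big\}$ of the earlier illustration, one orthonormal eigenbasis guaranteed by Theorem M.7 is $e_1=\frac{1}{\sqrt2}\begin{bmatrix}1\\1\end{bmatrix}$, $e_2=\frac{1}{\sqrt2}\begin{bmatrix}1\\-1\end{bmatrix}$: the first matrix has eigenvalues $1$ and $-1$ on this basis, the second $5$ and $-1$.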