Exercise 2.1: Linear dependence: example
Show that (1, −1), (1, 2) and (2, 1) are linearly dependent.
Solution
We observe that:
$$\begin{pmatrix} 1 \\ -1 \end{pmatrix} + \begin{pmatrix} 1 \\ 2 \end{pmatrix} - \begin{pmatrix} 2 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 + 1 - 2 \\ -1 + 2 - 1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix},$$
showing that the three vectors are linearly dependent by definition. Alternatively, we can apply the theorem that states that for any vector space $V$ with $\dim V = n$, any list of $m > n$ vectors in $V$ is linearly dependent (here, $V = \mathbb{R}^2$, $n = 2$, $m = 3$).

Exercise 2.2: Matrix representations: example

Suppose $V$ is a vector space with basis vectors $|0\rangle$ and $|1\rangle$, and $A$ is a linear operator from $V$ to $V$ such that $A|0\rangle = |1\rangle$ and $A|1\rangle = |0\rangle$. Give a matrix representation for $A$, with respect to the input basis $|0\rangle, |1\rangle$, and the output basis $|0\rangle, |1\rangle$. Find input and output bases which give rise to a different matrix representation of $A$.
Solution
Concepts Involved: Linear Algebra, Matrix Representation of Operators.

Identifying $|0\rangle \cong \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $|1\rangle \cong \begin{pmatrix} 0 \\ 1 \end{pmatrix}$, we have that:
$$A \cong \begin{pmatrix} a_{00} & a_{01} \\ a_{10} & a_{11} \end{pmatrix},$$
where the $j$th column holds the coordinates of $A$ applied to the $j$th input basis vector. Since $A|0\rangle = |1\rangle$ and $A|1\rangle = |0\rangle$, we read off:
$$A \cong \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$$
with respect to the input and output bases $|0\rangle, |1\rangle$. Now consider instead the basis $|\pm\rangle \equiv (|0\rangle \pm |1\rangle)/\sqrt{2}$. Using it as both the input and output basis gives a different matrix representation:
$$A|+\rangle = \frac{1}{\sqrt{2}} A(|0\rangle + |1\rangle) = \frac{1}{\sqrt{2}}(A|0\rangle + A|1\rangle) = \frac{1}{\sqrt{2}}(|1\rangle + |0\rangle) = |+\rangle$$
and:
$$A|-\rangle = \frac{1}{\sqrt{2}} A(|0\rangle - |1\rangle) = \frac{1}{\sqrt{2}}(A|0\rangle - A|1\rangle) = \frac{1}{\sqrt{2}}(|1\rangle - |0\rangle) = -|-\rangle,$$
so with respect to the input and output bases $|+\rangle, |-\rangle$:
$$A \cong \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}.$$
Remark: If we choose the input and output bases to be different, we can even represent the $A$ operator as an identity matrix. Specifically, if the input basis is chosen to be $|0\rangle, |1\rangle$ and the output basis as $|1\rangle, |0\rangle$, the matrix representation of $A$ looks like:
$$A \cong \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}.$$
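As a numerical sanity check of the representations above, here is a minimal NumPy sketch (the array names and the check itself are ours, not part of the exercise):

```python
import numpy as np

X = np.array([[0, 1], [1, 0]])           # A in the |0>, |1> input/output bases
plus = np.array([1, 1]) / np.sqrt(2)     # |+>
minus = np.array([1, -1]) / np.sqrt(2)   # |->

# Matrix elements <b_i| A |b_j> with |+>, |-> as both input and output basis.
B = np.column_stack([plus, minus])
print(B.conj().T @ X @ B)                # -> [[1, 0], [0, -1]]

# Input basis |0>, |1>, output basis |1>, |0>: entries <out_i| A |in_j>.
Out = np.array([[0, 1],
                [1, 0]])                 # columns are |1> and |0>
print(Out.conj().T @ X @ np.eye(2))      # -> identity matrix
```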
Exercise 2.3: Matrix representation for operator products
Suppose $A$ is a linear operator from vector space $V$ to vector space $W$, and $B$ is a linear operator from vector space $W$ to vector space $X$. Let $|v_i\rangle$, $|w_j\rangle$, $|x_k\rangle$ be bases for the vector spaces $V$, $W$ and $X$ respectively. Show that the matrix representation for the linear transformation $BA$ is the matrix product of the matrix representations for $B$ and $A$, with respect to the appropriate bases.
Solution
Concepts Involved: Linear Algebra, Matrix Representation of Operators.
Taking the matrix representations of $A$ and $B$ with respect to the appropriate bases $|v_i\rangle, |w_j\rangle, |x_k\rangle$ of $V$, $W$ and $X$, we have that:
$$A|v_j\rangle = \sum_i A_{ij} |w_i\rangle, \qquad B|w_i\rangle = \sum_k B_{ki} |x_k\rangle.$$
Composing the two operators then gives:
$$BA|v_j\rangle = B \sum_i A_{ij} |w_i\rangle = \sum_i A_{ij} B|w_i\rangle = \sum_k \left( \sum_i B_{ki} A_{ij} \right) |x_k\rangle = \sum_k (BA)_{kj} |x_k\rangle,$$
which shows that the matrix representation of $BA$ is indeed the matrix product of the representations of $B$ and $A$.
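The coordinate bookkeeping can also be checked numerically; below is a minimal sketch with arbitrary matrices standing in for the representations of $A$ and $B$ (the dimensions 3, 4, 2 are our own choice):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))   # representation of A : V (dim 3) -> W (dim 4)
B = rng.standard_normal((2, 4))   # representation of B : W (dim 4) -> X (dim 2)
v = rng.standard_normal(3)        # coordinates of some |v> in the |v_i> basis

# Applying A and then B agrees with applying the single matrix product BA.
assert np.allclose(B @ (A @ v), (B @ A) @ v)
```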
Exercise 2.4

Show that the identity operator on a vector space $V$ has a matrix representation which is one along the diagonal and zero everywhere else, if the matrix is taken with respect to the same input and output bases. This matrix is known as the identity matrix.
Solution
Concepts Involved: Linear Algebra, Matrix Representation of Operators.
Let $V$ be a vector space and $|v_i\rangle$ a basis of $V$. Let $I : V \to V$ be the identity operator, and let its matrix representation be taken with respect to $|v_i\rangle$ as both the input and output basis. We then have that for each $i \in \{1, \ldots, n\}$:
$$I|v_i\rangle = 1 \cdot |v_i\rangle + \sum_{j \neq i} 0 \cdot |v_j\rangle = \sum_j \delta_{ij} |v_j\rangle,$$
so the matrix representation has entries $\delta_{ij}$: one along the diagonal and zero everywhere else.
Exercise 2.5

Verify that $(\cdot, \cdot)$ just defined is an inner product on $\mathbb{C}^n$.
Solution
Concepts Involved: Linear Algebra, Inner Products.
Recall that on $\mathbb{C}^n$, $(\cdot, \cdot)$ was defined as:
$$((y_1, \ldots, y_n), (z_1, \ldots, z_n)) \equiv \sum_i y_i^* z_i = \begin{bmatrix} y_1^* & \ldots & y_n^* \end{bmatrix} \begin{bmatrix} z_1 \\ \vdots \\ z_n \end{bmatrix}.$$
Furthermore, recall the three conditions for the function $(\cdot, \cdot) : V \times V \to \mathbb{C}$ to be considered an inner product:
(1) $(\cdot, \cdot)$ is linear in the second argument;
(2) $(|v\rangle, |w\rangle) = (|w\rangle, |v\rangle)^*$;
(3) $(|v\rangle, |v\rangle) \geq 0$, with equality if and only if $|v\rangle = 0$.
(1) For scalars $\lambda, \mu$ and vectors $z, w \in \mathbb{C}^n$ we have:
$$((y_1, \ldots, y_n), \lambda(z_1, \ldots, z_n) + \mu(w_1, \ldots, w_n)) = \sum_i y_i^* (\lambda z_i + \mu w_i) = \lambda \sum_i y_i^* z_i + \mu \sum_i y_i^* w_i,$$
which is linearity in the second argument.
(2) We have:
$$((y_1, \ldots, y_n), (z_1, \ldots, z_n)) = \sum_i y_i^* z_i = \sum_i (y_i z_i^*)^* = \left( \sum_i z_i^* y_i \right)^* = ((z_1, \ldots, z_n), (y_1, \ldots, y_n))^*.$$
(3) We have $((y_1, \ldots, y_n), (y_1, \ldots, y_n)) = \sum_i y_i^* y_i = \sum_i |y_i|^2 \geq 0$, which vanishes when $y = 0$. For $y = (y_1, \ldots, y_n) \neq 0$ at least one $y_i$ (say, $y_j$) is nonzero, and hence:
$$((y_1, \ldots, y_n), (y_1, \ldots, y_n)) = \sum_i |y_i|^2 \geq |y_j|^2 > 0.$$
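These three properties can be spot-checked numerically; the sketch below uses np.vdot, which conjugates its first argument and so matches the convention above (the random test vectors are our own):

```python
import numpy as np

rng = np.random.default_rng(1)
y, z, w = (rng.standard_normal(4) + 1j * rng.standard_normal(4) for _ in range(3))
lam, mu = 2 - 1j, 0.5 + 3j

# (1) linearity in the second argument
assert np.allclose(np.vdot(y, lam * z + mu * w),
                   lam * np.vdot(y, z) + mu * np.vdot(y, w))
# (2) conjugate symmetry
assert np.isclose(np.vdot(y, z), np.conj(np.vdot(z, y)))
# (3) positivity: (y, y) is real and nonnegative
assert np.vdot(y, y).real >= 0 and abs(np.vdot(y, y).imag) < 1e-12
```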
Exercise 2.6
Show that any inner product $(\cdot, \cdot)$ is conjugate-linear in the first argument,
$$\left( \sum_i \lambda_i |w_i\rangle, |v\rangle \right) = \sum_i \lambda_i^* (|w_i\rangle, |v\rangle).$$
Solution
Concepts Involved: Linear Algebra, Inner Products
Applying properties (2) (conjugate symmetry), (1) (linearity in the second argument), and (2) (again) in succession, we have that:
$$\left( \sum_i \lambda_i |w_i\rangle, |v\rangle \right) = \left( |v\rangle, \sum_i \lambda_i |w_i\rangle \right)^* = \left( \sum_i \lambda_i (|v\rangle, |w_i\rangle) \right)^* = \sum_i \lambda_i^* (|v\rangle, |w_i\rangle)^* = \sum_i \lambda_i^* (|w_i\rangle, |v\rangle).$$
Exercise 2.7
Verify that $|w\rangle = (1, 1)$ and $|v\rangle = (1, -1)$ are orthogonal. What are the normalized forms of these vectors?
Solution
Concepts Involved: Linear Algebra, Inner Products, Orthogonality, Normalization
Recall that two vectors $|v\rangle, |w\rangle$ are orthogonal if $\langle v|w\rangle = 0$, and the norm of $|v\rangle$ is given by $\||v\rangle\| = \sqrt{\langle v|v\rangle}$. Checking orthogonality:
$$\langle w|v\rangle = 1 \cdot 1 + 1 \cdot (-1) = 0.$$
Since $\||w\rangle\| = \||v\rangle\| = \sqrt{1^2 + 1^2} = \sqrt{2}$, the normalized forms are:
$$\frac{|w\rangle}{\sqrt{2}} = \frac{1}{\sqrt{2}}(1, 1), \qquad \frac{|v\rangle}{\sqrt{2}} = \frac{1}{\sqrt{2}}(1, -1).$$
Exercise 2.8
Verify that the Gram-Schmidt procedure produces an orthonormal basis for $V$.
Solution
Concepts Involved: Linear Algebra, Linear Independence, Bases, Inner Products, Orthogonality, Normalization, Gram-Schmidt Procedure, Induction.
Recall that given $|w_1\rangle, \ldots, |w_d\rangle$ as a basis set for a vector space $V$, the Gram-Schmidt procedure constructs a basis set $|v_1\rangle, \ldots, |v_d\rangle$ by defining $|v_1\rangle \equiv |w_1\rangle / \||w_1\rangle\|$ and then defining $|v_{k+1}\rangle$ inductively for $1 \leq k \leq d - 1$ as:
$$|v_{k+1}\rangle \equiv \frac{|w_{k+1}\rangle - \sum_{i=1}^{k} \langle v_i|w_{k+1}\rangle |v_i\rangle}{\left\| |w_{k+1}\rangle - \sum_{i=1}^{k} \langle v_i|w_{k+1}\rangle |v_i\rangle \right\|}.$$
It is evident that each of the $|v_j\rangle$ has unit norm, as they are defined in normalized form. It therefore suffices to show that the $|v_1\rangle, \ldots, |v_d\rangle$ are orthogonal to each other, and that this set of vectors forms a basis of $V$. We proceed by induction. For $k = 1$, we have that:
$$|v_2\rangle = \frac{|w_2\rangle - \langle v_1|w_2\rangle |v_1\rangle}{\| |w_2\rangle - \langle v_1|w_2\rangle |v_1\rangle \|}.$$
Therefore:
$$\langle v_1|v_2\rangle = \frac{\langle v_1|w_2\rangle - \langle v_1|w_2\rangle \langle v_1|v_1\rangle}{\| |w_2\rangle - \langle v_1|w_2\rangle |v_1\rangle \|} = \frac{\langle v_1|w_2\rangle - \langle v_1|w_2\rangle}{\| |w_2\rangle - \langle v_1|w_2\rangle |v_1\rangle \|} = 0,$$
so the two vectors are orthogonal. Furthermore, they are linearly independent; if they were linearly dependent, we could write $|v_1\rangle = \lambda |v_2\rangle$ for some $\lambda \in \mathbb{C}$, but then multiplying both sides by $\langle v_1|$ we get:
$$1 = \langle v_1|v_1\rangle = \lambda \langle v_1|v_2\rangle = 0,$$
which is a contradiction. This concludes the base case. For the inductive step, let $k \geq 1$ and suppose that $|v_1\rangle, \ldots, |v_k\rangle$ are orthonormal and linearly independent. We then have that:
$$|v_{k+1}\rangle = \frac{|w_{k+1}\rangle - \sum_{i=1}^{k} \langle v_i|w_{k+1}\rangle |v_i\rangle}{\left\| |w_{k+1}\rangle - \sum_{i=1}^{k} \langle v_i|w_{k+1}\rangle |v_i\rangle \right\|},$$
so for any $j \in \{1, \ldots, k\}$:
$$\langle v_j|v_{k+1}\rangle = \frac{\langle v_j|w_{k+1}\rangle - \sum_{i=1}^{k} \langle v_i|w_{k+1}\rangle \langle v_j|v_i\rangle}{\left\| |w_{k+1}\rangle - \sum_{i=1}^{k} \langle v_i|w_{k+1}\rangle |v_i\rangle \right\|} = \frac{\langle v_j|w_{k+1}\rangle - \langle v_j|w_{k+1}\rangle}{\left\| |w_{k+1}\rangle - \sum_{i=1}^{k} \langle v_i|w_{k+1}\rangle |v_i\rangle \right\|} = 0,$$
where in the second equality we use the fact that $\langle v_j|v_i\rangle = \delta_{ij}$ for $i, j \in \{1, \ldots, k\}$ by the inductive hypothesis. We therefore find that $|v_{k+1}\rangle$ is orthogonal to all of $|v_1\rangle, \ldots, |v_k\rangle$. Furthermore, $|v_1\rangle, \ldots, |v_k\rangle, |v_{k+1}\rangle$ is linearly independent. Suppose for the sake of contradiction that this was false. Then, there would exist $\lambda_1, \ldots, \lambda_k$, not all zero, such that:
$$|v_{k+1}\rangle = \sum_{i=1}^{k} \lambda_i |v_i\rangle,$$
but then multiplying both sides by $\langle v_{k+1}|$ we have that:
$$1 = \langle v_{k+1}|v_{k+1}\rangle = \sum_{i=1}^{k} \lambda_i \langle v_{k+1}|v_i\rangle = 0$$
by orthonormality. This gives a contradiction, and hence $|v_1\rangle, \ldots, |v_k\rangle, |v_{k+1}\rangle$ are linearly independent,
finishing the inductive step. Therefore, $|v_1\rangle, \ldots, |v_d\rangle$ is an orthonormal list of vectors which is linearly independent. Since $|w_1\rangle, \ldots, |w_d\rangle$ is a basis for $V$, $V$ has dimension $d$. Hence $|v_1\rangle, \ldots, |v_d\rangle$, being a linearly independent list of $d$ vectors in $V$, is a basis of $V$. We conclude that it is an orthonormal basis of $V$, as claimed.
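The recursion above translates directly into code. The following is a minimal NumPy sketch of classical Gram-Schmidt (fine for illustration, though not the numerically stable variant one would use in practice):

```python
import numpy as np

def gram_schmidt(ws):
    """Orthonormalize linearly independent vectors |w_1>, ..., |w_d>."""
    vs = []
    for w in ws:
        # Subtract the projections <v_i|w> |v_i> onto the vectors built so far,
        # then normalize -- exactly the recursion in the solution above.
        u = w - sum(np.vdot(v, w) * v for v in vs)
        vs.append(u / np.linalg.norm(u))
    return vs

rng = np.random.default_rng(2)
basis = gram_schmidt(list(rng.standard_normal((3, 3))))
gram = np.array([[np.vdot(a, b) for b in basis] for a in basis])
assert np.allclose(gram, np.eye(3))   # orthonormal: Gram matrix is the identity
```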
Exercise 2.9

The Pauli matrices (Figure 2.2 on page 65) can be considered as operators with respect to an orthonormal basis $|0\rangle, |1\rangle$ for a two-dimensional Hilbert space. Express each of the Pauli operators in the outer product notation.
Solution
Concepts Involved: Linear Algebra, Matrix Representation of Operators, Outer Products.
Recall that if $A$ has matrix representation:
$$A \cong \begin{pmatrix} a_{00} & a_{01} \\ a_{10} & a_{11} \end{pmatrix}$$
with respect to $|0\rangle, |1\rangle$ as the input/output bases, then we can express $A$ in outer product notation as:
$$A = a_{00} |0\rangle\langle 0| + a_{01} |0\rangle\langle 1| + a_{10} |1\rangle\langle 0| + a_{11} |1\rangle\langle 1|.$$
Furthermore, recall the representation of the Pauli matrices with respect to the orthonormal basis $|0\rangle, |1\rangle$:
$$I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \qquad X = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \qquad Y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix} \qquad Z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$$
Hence:
$$I = |0\rangle\langle 0| + |1\rangle\langle 1|$$
$$X = |0\rangle\langle 1| + |1\rangle\langle 0|$$
$$Y = -i |0\rangle\langle 1| + i |1\rangle\langle 0|$$
$$Z = |0\rangle\langle 0| - |1\rangle\langle 1|$$
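These outer-product expressions are easy to confirm numerically; a minimal sketch, where np.outer(a, b.conj()) plays the role of $|a\rangle\langle b|$:

```python
import numpy as np

ket0, ket1 = np.array([1, 0]), np.array([0, 1])
op = lambda a, b: np.outer(a, b.conj())   # |a><b|

assert np.allclose(op(ket0, ket0) + op(ket1, ket1), np.eye(2))          # I
assert np.allclose(op(ket0, ket1) + op(ket1, ket0), [[0, 1], [1, 0]])   # X
assert np.allclose(-1j * op(ket0, ket1) + 1j * op(ket1, ket0),
                   [[0, -1j], [1j, 0]])                                 # Y
assert np.allclose(op(ket0, ket0) - op(ket1, ket1), [[1, 0], [0, -1]])  # Z
```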
Exercise 2.10
Suppose $|v_i\rangle$ is an orthonormal basis for an inner product space $V$. What is the matrix representation for the operator $|v_j\rangle\langle v_j|$, with respect to the $|v_i\rangle$ basis?
Solution
Concepts Involved: Linear Algebra, Matrix Representation of Operators, Outer Products.
The matrix representation of $|v_j\rangle\langle v_j|$ with respect to the $|v_i\rangle$ basis is the matrix with a 1 in the $j$th column and $j$th row (i.e. the $(j, j)$th entry of the matrix) and 0 everywhere else, since $\langle v_i| \left( |v_j\rangle\langle v_j| \right) |v_k\rangle = \delta_{ij} \delta_{jk}$.
Exercise 2.11
Find the eigenvectors, eigenvalues, and diagonal representations of the Pauli matrices X, Y and Z.
Solution
Concepts Involved: Linear Algebra, Eigenvalues, Eigenvectors, Diagonalization.
Given an operator $A$ on a vector space $V$, recall that an eigenvector $|v\rangle$ of $A$ and its corresponding eigenvalue $\lambda$ are defined by:
$$A|v\rangle = \lambda |v\rangle,$$
and that a diagonal representation of $A$ is an expression $A = \sum_i \lambda_i |i\rangle\langle i|$, where the $|i\rangle$ form an orthonormal set of eigenvectors for $A$, and $\lambda_i$ are the corresponding eigenvalues. Starting with $X$, we solve the characteristic equation for the eigenvalues:
$$\det(X - \lambda I) = 0 \implies \det \begin{pmatrix} -\lambda & 1 \\ 1 & -\lambda \end{pmatrix} = 0 \implies \lambda^2 - 1 = 0.$$
From this we obtain $\lambda_1 = 1$, $\lambda_2 = -1$. Solving for the eigenvectors, we then have that:
$$(X - \lambda_1 I)|v_1\rangle = 0 \implies \begin{pmatrix} -1 & 1 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} v_{11} \\ v_{12} \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \implies v_{11} = 1, \; v_{12} = 1$$
$$(X - \lambda_2 I)|v_2\rangle = 0 \implies \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} v_{21} \\ v_{22} \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \implies v_{21} = 1, \; v_{22} = -1$$
Normalizing, the eigenvectors are $|v_1\rangle = |+\rangle = (|0\rangle + |1\rangle)/\sqrt{2}$ and $|v_2\rangle = |-\rangle = (|0\rangle - |1\rangle)/\sqrt{2}$, giving the diagonal representation $X = |+\rangle\langle +| - |-\rangle\langle -|$.
We do the same for $Y$. Solving for the eigenvalues:
$$\det(Y - \lambda I) = 0 \implies \det \begin{pmatrix} -\lambda & -i \\ i & -\lambda \end{pmatrix} = 0 \implies \lambda^2 - 1 = 0.$$
From this we obtain $\lambda_1 = 1$, $\lambda_2 = -1$. Solving for the eigenvectors, we then have that:
$$(Y - \lambda_1 I)|v_1\rangle = 0 \implies \begin{pmatrix} -1 & -i \\ i & -1 \end{pmatrix} \begin{pmatrix} v_{11} \\ v_{12} \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \implies v_{11} = 1, \; v_{12} = i$$
$$(Y - \lambda_2 I)|v_2\rangle = 0 \implies \begin{pmatrix} 1 & -i \\ i & 1 \end{pmatrix} \begin{pmatrix} v_{21} \\ v_{22} \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \implies v_{21} = 1, \; v_{22} = -i$$
We therefore have that $|v_1\rangle = |0\rangle + i|1\rangle$, $|v_2\rangle = |0\rangle - i|1\rangle$. Normalizing by dividing by $\||v_1\rangle\| = \||v_2\rangle\| = \sqrt{2}$, we obtain:
$$|v_1\rangle = |y_+\rangle = \frac{|0\rangle + i|1\rangle}{\sqrt{2}}, \qquad |v_2\rangle = |y_-\rangle = \frac{|0\rangle - i|1\rangle}{\sqrt{2}},$$
giving the diagonal representation $Y = |y_+\rangle\langle y_+| - |y_-\rangle\langle y_-|$.
For $Z$, the process is again the same. We give the results and omit the details: the eigenvalues are $\lambda_1 = 1$ and $\lambda_2 = -1$, with normalized eigenvectors $|0\rangle$ and $|1\rangle$ respectively, giving the diagonal representation:
$$Z = |0\rangle\langle 0| - |1\rangle\langle 1|.$$
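Since the Pauli matrices are Hermitian, numpy.linalg.eigh can cross-check the eigenpairs and the diagonal representations; a minimal sketch:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

for name, P in [("X", X), ("Y", Y), ("Z", Z)]:
    vals, vecs = np.linalg.eigh(P)     # eigenvalues in ascending order: [-1, 1]
    print(name, vals)
    # Reassemble P from its diagonal representation sum_i lambda_i |i><i|.
    assert np.allclose(P, vecs @ np.diag(vals) @ vecs.conj().T)
```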
Exercise 2.12
Prove that the matrix
$$\begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}$$
is not diagonalizable.
Solution
Concepts Involved: Linear Algebra, Eigenvalues, Eigenvectors, Diagonalization.
Call the matrix $A$. Solving the characteristic equation for the eigenvalues:
$$\det(A - \lambda I) = 0 \implies \det \begin{pmatrix} 1 - \lambda & 0 \\ 1 & 1 - \lambda \end{pmatrix} = 0 \implies (1 - \lambda)^2 = 0 \implies \lambda = 1,$$
so $\lambda = 1$ is the only eigenvalue, with algebraic multiplicity two. Solving $(A - I)|v\rangle = 0$ forces $v_1 = 0$, so up to scaling the only eigenvector is $(0, 1)^T$. Since the eigenvalue 1 is degenerate and the matrix has only one linearly independent eigenvector, there is no basis of eigenvectors, and the matrix therefore cannot be diagonalized.
Exercise 2.13
If $|w\rangle$ and $|v\rangle$ are any two vectors, show that $(|w\rangle\langle v|)^\dagger = |v\rangle\langle w|$.
Solution
Concepts Involved: Linear Algebra, Adjoints.
We observe that for arbitrary vectors $|x\rangle, |y\rangle$:
$$((|w\rangle\langle v|)^\dagger |x\rangle, |y\rangle) = (|x\rangle, (|w\rangle\langle v|)|y\rangle) = (|x\rangle, \langle v|y\rangle |w\rangle) = \langle v|y\rangle \langle x|w\rangle = \langle x|w\rangle (|v\rangle, |y\rangle) = (\langle x|w\rangle^* |v\rangle, |y\rangle) = (\langle w|x\rangle |v\rangle, |y\rangle) = ((|v\rangle\langle w|)|x\rangle, |y\rangle),$$
where in the third-to-last equality we use the conjugate linearity of the inner product in the first argument (see Exercise 2.6), and in the second-to-last equality we use that $\langle a|b\rangle^* = \langle b|a\rangle$. Comparing the first and last expressions, we conclude that $(|w\rangle\langle v|)^\dagger = |v\rangle\langle w|$.
Exercise 2.14

Show that the adjoint operation is anti-linear,
$$\left( \sum_i a_i A_i \right)^\dagger = \sum_i a_i^* A_i^\dagger.$$
Solution
Concepts Involved: Linear Algebra, Adjoints.
We observe that for arbitrary vectors $|a\rangle, |b\rangle$:
$$\left( \left( \sum_i a_i A_i \right)^\dagger |a\rangle, |b\rangle \right) = \left( |a\rangle, \sum_i a_i A_i |b\rangle \right) = \sum_i a_i (|a\rangle, A_i |b\rangle) = \sum_i a_i (A_i^\dagger |a\rangle, |b\rangle) = \left( \sum_i a_i^* A_i^\dagger |a\rangle, |b\rangle \right),$$
where we invoke the definition of the adjoint in the first and third equalities, the linearity in the second
argument in the second equality, and the conjugate linearity in the first argument in the last equality. The
claim is proven by comparing the first and last expressions.
Exercise 2.15

Show that $(A^\dagger)^\dagger = A$.

Solution

Concepts Involved: Linear Algebra, Adjoints

Applying the definition of the adjoint twice (and using the conjugate symmetry of the inner product) we have that for arbitrary $|a\rangle, |b\rangle$:
$$((A^\dagger)^\dagger |a\rangle, |b\rangle) = (|a\rangle, A^\dagger |b\rangle) = (A^\dagger |b\rangle, |a\rangle)^* = (|b\rangle, A|a\rangle)^* = (A|a\rangle, |b\rangle)^{**} = (A|a\rangle, |b\rangle).$$
Comparing the first and last expressions, we conclude that $(A^\dagger)^\dagger = A$.
Exercise 2.16

Show that all projectors satisfy the equation $P^2 = P$.

Solution

Concepts Involved: Linear Algebra, Projectors.

Let $|1\rangle, \ldots, |k\rangle$ be an orthonormal basis for the subspace $W$ of $V$. Then, using the definition of the projector onto $W$, $P \equiv \sum_{i=1}^{k} |i\rangle\langle i|$, we have that:
$$P^2 = P \cdot P = \sum_{i=1}^{k} \sum_{i'=1}^{k} |i\rangle\langle i|i'\rangle\langle i'| = \sum_{i=1}^{k} \sum_{i'=1}^{k} |i\rangle \delta_{ii'} \langle i'| = \sum_{i=1}^{k} |i\rangle\langle i| = P,$$
where in the third and fourth equalities we use the orthonormality of the basis to collapse the double sum.
Exercise 2.17
Show that a normal matrix is Hermitian if and only if it has real eigenvalues.
Solution
Concepts Involved: Linear Algebra, Hermitian Operators, Normal Operators, Spectral Decomposition.
$\implies$: Let $A$ be a normal and Hermitian matrix. Then it has a diagonal representation $A = \sum_i \lambda_i |i\rangle\langle i|$, where the $|i\rangle$ form an orthonormal basis for $V$ and each $|i\rangle$ is an eigenvector of $A$ with eigenvalue $\lambda_i$. By the Hermiticity of $A$, we have that $A = A^\dagger$. Therefore:
$$A^\dagger = \left( \sum_i \lambda_i |i\rangle\langle i| \right)^\dagger = \sum_i \lambda_i^* |i\rangle\langle i| = A = \sum_i \lambda_i |i\rangle\langle i|,$$
where we use the results of Exercises 2.13 and 2.14 in the second equality. Comparing the third and last expressions, we have that $\lambda_i = \lambda_i^*$, and hence the eigenvalues are real.
$\impliedby$: Let $A$ be a normal matrix with real eigenvalues. Then $A$ has diagonal representation $A = \sum_i \lambda_i |i\rangle\langle i|$ where the $\lambda_i$ are all real. We therefore have that:
$$A^\dagger = \left( \sum_i \lambda_i |i\rangle\langle i| \right)^\dagger = \sum_i \lambda_i^* |i\rangle\langle i| = \sum_i \lambda_i |i\rangle\langle i| = A,$$
where in the third equality we use that $\lambda_i^* = \lambda_i$. We conclude that $A$ is Hermitian.
Exercise 2.18
Show that all eigenvalues of a unitary matrix have modulus 1, that is, can be written in the form $e^{i\theta}$ for some real $\theta$.
Solution

Concepts Involved: Linear Algebra, Unitary Operators, Spectral Decomposition

Let $U$ be a unitary matrix. Since $U$ is normal, it has a spectral decomposition $U = \sum_i \lambda_i |i\rangle\langle i|$ with orthonormal eigenvectors $|i\rangle$. Then:
$$I = U^\dagger U = \sum_i \lambda_i^* \lambda_i |i\rangle\langle i| = \sum_i |\lambda_i|^2 |i\rangle\langle i|,$$
from which we obtain that $|\lambda_i|^2 = 1$, and hence $|\lambda_i| = 1$, proving the claim.
Exercise 2.19: Pauli matrices: Hermitian and unitary

Show that the Pauli matrices are Hermitian and unitary.
Solution
Concepts Involved: Linear Algebra, Hermitian Matrices, Unitary Matrices
We check I, X, Y, Z in turn.
$$I^\dagger = \left( \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}^T \right)^* = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}^* = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = I, \qquad I^\dagger I = II = I.$$
$$X^\dagger = \left( \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}^T \right)^* = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}^* = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} = X, \qquad X^\dagger X = XX = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = I.$$
$$Y^\dagger = \left( \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}^T \right)^* = \begin{pmatrix} 0 & i \\ -i & 0 \end{pmatrix}^* = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix} = Y, \qquad Y^\dagger Y = YY = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = I.$$
$$Z^\dagger = \left( \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}^T \right)^* = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}^* = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} = Z, \qquad Z^\dagger Z = ZZ = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = I.$$
So each Pauli matrix is Hermitian, and unitary.
Exercise 2.20: Basis changes
Suppose $A'$ and $A''$ are matrix representations of an operator $A$ on a vector space $V$ with respect to two different orthonormal bases, $|v_i\rangle$ and $|w_i\rangle$. Then the elements of $A'$ and $A''$ are $A'_{ij} = \langle v_i|A|v_j\rangle$ and $A''_{ij} = \langle w_i|A|w_j\rangle$. Characterize the relationship between $A'$ and $A''$.
Solution
Concepts Involved: Linear Algebra, Matrix Representations of Operators, Completeness Relation

Inserting the completeness relation $I = \sum_k |v_k\rangle\langle v_k|$ twice, we have:
$$A''_{ij} = \langle w_i|A|w_j\rangle = \sum_{k,l} \langle w_i|v_k\rangle \langle v_k|A|v_l\rangle \langle v_l|w_j\rangle = \sum_{k,l} (U^\dagger)_{ik} A'_{kl} U_{lj},$$
where $U$ is the matrix with entries $U_{lj} \equiv \langle v_l|w_j\rangle$. $U$ is unitary, since $(U^\dagger U)_{ij} = \sum_l \langle w_i|v_l\rangle\langle v_l|w_j\rangle = \langle w_i|w_j\rangle = \delta_{ij}$. Hence $A'' = U^\dagger A' U$: the two representations are related by a unitary change of basis.
Exercise 2.21
Repeat the proof of the spectral decomposition in Box 2.2 for the case when M is Hermitian, simplifying
the proof wherever possible.
Solution
Concepts Involved: Linear Algebra, Hermitian Operators, Spectral Decomposition.
Note that the converse of Theorem 2.1 does not hold if we replace "normal" with "Hermitian": diagonalizability does not imply Hermiticity, a concrete example being $S = |0\rangle\langle 0| + i|1\rangle\langle 1|$. So we just prove the forwards direction.
We proceed by induction on the dimension $d$ of $V$. The $d = 1$ case is trivial, as $M$ is already diagonal in any representation. Let $\lambda$ be an eigenvalue of $M$, $P$ the projector onto the $\lambda$ eigenspace, and $Q$ the projector onto the orthogonal complement. Then $M = (P + Q)M(P + Q) = PMP + QMP + PMQ + QMQ$. Obviously $PMP = \lambda P$. Furthermore, $QMP = 0$, as $M$ takes the subspace onto which $P$ projects into itself. We claim that $PMQ = 0$ also: indeed, $(PMQ)^\dagger = Q^\dagger M^\dagger P^\dagger = QMP = 0$ (using the Hermiticity of $M$, $P$ and $Q$), and hence $PMQ = 0$. Thus $M = PMP + QMQ$. Now $QMQ$ is itself Hermitian, as $(QMQ)^\dagger = Q^\dagger M^\dagger Q^\dagger = QMQ$. By induction, $QMQ$ is diagonal with respect to some orthonormal basis for the subspace onto which $Q$ projects, and $PMP$ is already diagonal with respect to any orthonormal basis for the subspace onto which $P$ projects. It follows that $M = PMP + QMQ$ is diagonal with respect to some orthonormal basis for the total vector space.
Exercise 2.22
Prove that two eigenvectors of a Hermitian operator with different eigenvalues are necessarily orthogonal.
Solution
Concepts Involved: Linear Algebra, Eigenvalues, Eigenvectors, Hermitian Operators.
Let $A$ be a Hermitian operator, and let $|v_1\rangle, |v_2\rangle$ be two eigenvectors of $A$ with corresponding eigenvalues $\lambda_1, \lambda_2$ such that $\lambda_1 \neq \lambda_2$. We then have that:
$$\langle v_1|A|v_2\rangle = \lambda_2 \langle v_1|v_2\rangle$$
$$\langle v_1|A|v_2\rangle = \langle v_1|A^\dagger|v_2\rangle = (A|v_1\rangle)^\dagger |v_2\rangle = \lambda_1 \langle v_1|v_2\rangle,$$
where we use the Hermiticity of $A$ (and the fact, from Exercise 2.17, that its eigenvalues are real) in the second line. Subtracting the first line from the second, we have that:
$$(\lambda_1 - \lambda_2)\langle v_1|v_2\rangle = 0.$$
Since $\lambda_1 \neq \lambda_2$ by assumption, this forces $\langle v_1|v_2\rangle = 0$. Hence $|v_1\rangle, |v_2\rangle$ are orthogonal.
Exercise 2.23
Show that the eigenvalues of a projector P are all either 0 or 1.
Solution
Concepts Involved: Linear Algebra, Eigenvalues, Eigenvectors, Projectors.
Let $P$ be a projector, and $|v\rangle$ an eigenvector of $P$ with corresponding eigenvalue $\lambda$. From Exercise 2.16, we have that $P^2 = P$, and using this fact, we observe:
$$P|v\rangle = \lambda |v\rangle$$
$$P|v\rangle = P^2 |v\rangle = P(P|v\rangle) = P(\lambda |v\rangle) = \lambda P|v\rangle = \lambda^2 |v\rangle.$$
Hence $\lambda |v\rangle = \lambda^2 |v\rangle$, and since $|v\rangle$ is not the zero vector, $\lambda^2 = \lambda$; therefore either $\lambda = 0$ or $\lambda = 1$.
Exercise 2.24

Show that a positive operator is necessarily Hermitian. (Hint: Show that an arbitrary operator $A$ can be written $A = B + iC$ where $B$ and $C$ are Hermitian.)
Solution
Concepts Involved: Linear Algebra, Hermitian Operators, Positive Operators
Let $A$ be an operator. We first make the observation that we can write $A$ as:
$$A = \frac{A}{2} + \frac{A}{2} + \frac{A^\dagger}{2} - \frac{A^\dagger}{2} = \frac{A + A^\dagger}{2} + i \, \frac{A - A^\dagger}{2i}.$$
So let $B = \frac{A + A^\dagger}{2}$ and $C = \frac{A - A^\dagger}{2i}$. $B$ and $C$ are Hermitian, as:
$$B^\dagger = \left( \frac{A + A^\dagger}{2} \right)^\dagger = \frac{A^\dagger + (A^\dagger)^\dagger}{2} = \frac{A^\dagger + A}{2} = B$$
$$C^\dagger = \left( \frac{A - A^\dagger}{2i} \right)^\dagger = \frac{A^\dagger - (A^\dagger)^\dagger}{-2i} = \frac{A - A^\dagger}{2i} = C,$$
so we have proven that we can write $A = B + iC$ for Hermitian $B, C$ for any operator $A$. Now, assume that $A$ is positive. We then have that for any vector $|v\rangle$:
$$\langle v|A|v\rangle = \langle v|B|v\rangle + i \langle v|C|v\rangle \geq 0.$$
Since $B$ and $C$ are Hermitian, $\langle v|B|v\rangle$ and $\langle v|C|v\rangle$ are both real; for the sum above to be real (let alone nonnegative), we must have $\langle v|C|v\rangle = 0$ for every $|v\rangle$. For a Hermitian operator this implies $C = 0$, and hence $A = B$ is Hermitian.
Exercise 2.25

Show that for any operator $A$, $A^\dagger A$ is positive.

Solution

Concepts Involved: Linear Algebra, Adjoints, Positive Operators

Let $A$ be an operator and $|v\rangle$ an arbitrary vector. We then have that:
$$(|v\rangle, A^\dagger A |v\rangle) = ((A^\dagger)^\dagger |v\rangle, A|v\rangle) = (A|v\rangle, A|v\rangle) \geq 0,$$
where the last inequality holds by the positivity property of inner products. Hence $A^\dagger A$ is positive.
Exercise 2.26
Let $|\psi\rangle = (|0\rangle + |1\rangle)/\sqrt{2}$. Write out $|\psi\rangle^{\otimes 2}$ and $|\psi\rangle^{\otimes 3}$ explicitly, both in terms of tensor products like $|0\rangle|1\rangle$ and using the Kronecker product.
Solution
Concepts Involved: Linear Algebra, Tensor Products, Kronecker Products.
Using the definition of the tensor product, we have:
$$|\psi\rangle^{\otimes 2} = \frac{|0\rangle|0\rangle + |0\rangle|1\rangle + |1\rangle|0\rangle + |1\rangle|1\rangle}{2} \cong \frac{1}{2} \begin{pmatrix} 1 \\ 1 \\ 1 \\ 1 \end{pmatrix}$$
and:
$$|\psi\rangle^{\otimes 3} = \frac{1}{2\sqrt{2}} \left( |0\rangle|0\rangle|0\rangle + |0\rangle|0\rangle|1\rangle + |0\rangle|1\rangle|0\rangle + |0\rangle|1\rangle|1\rangle + |1\rangle|0\rangle|0\rangle + |1\rangle|0\rangle|1\rangle + |1\rangle|1\rangle|0\rangle + |1\rangle|1\rangle|1\rangle \right) \cong \frac{1}{2\sqrt{2}} \begin{pmatrix} 1 \\ 1 \\ 1 \\ 1 \\ 1 \\ 1 \\ 1 \\ 1 \end{pmatrix}.$$
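The Kronecker-product forms can be reproduced with numpy.kron; a minimal sketch:

```python
import numpy as np
from functools import reduce

psi = np.array([1, 1]) / np.sqrt(2)
psi2 = np.kron(psi, psi)
psi3 = reduce(np.kron, [psi] * 3)

assert np.allclose(psi2, np.full(4, 1 / 2))                  # four entries of 1/2
assert np.allclose(psi3, np.full(8, 1 / (2 * np.sqrt(2))))   # eight entries of 1/(2*sqrt(2))
```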
Exercise 2.27
Calculate the matrix representation of the tensor products of the Pauli operators (a) $X$ and $Z$; (b) $I$ and $X$; (c) $X$ and $I$. Is the tensor product commutative?
Solution
Concepts Involved: Linear Algebra, Tensor Products, Kronecker Products.
(a)
$$X \otimes Z = \begin{pmatrix} 0 \cdot Z & 1 \cdot Z \\ 1 \cdot Z & 0 \cdot Z \end{pmatrix} = \begin{pmatrix} 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & -1 \\ 1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \end{pmatrix}$$
(b)
$$I \otimes X = \begin{pmatrix} 1 \cdot X & 0 \cdot X \\ 0 \cdot X & 1 \cdot X \end{pmatrix} = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 \end{pmatrix}$$
(c)
$$X \otimes I = \begin{pmatrix} 0 \cdot I & 1 \cdot I \\ 1 \cdot I & 0 \cdot I \end{pmatrix} = \begin{pmatrix} 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{pmatrix}$$
Comparing (b) and (c), we conclude that the tensor product is not commutative.
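Parts (a)-(c), and the non-commutativity observation, can be confirmed with numpy.kron; a minimal sketch:

```python
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

print(np.kron(X, Z))   # part (a)
print(np.kron(I, X))   # part (b)
print(np.kron(X, I))   # part (c)

# The tensor product is not commutative:
assert not np.allclose(np.kron(I, X), np.kron(X, I))
```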
Exercise 2.28
Show that the transpose, complex conjugation and adjoint operations distribute over the tensor product,
$$(A \otimes B)^* = A^* \otimes B^*, \qquad (A \otimes B)^T = A^T \otimes B^T, \qquad (A \otimes B)^\dagger = A^\dagger \otimes B^\dagger.$$
Solution
Concepts Involved: Linear Algebra, Adjoints, Tensor Products, Kronecker Products.
For complex conjugation, conjugating each block of the Kronecker product:
$$(A \otimes B)^* = \begin{pmatrix} A_{11} B & \ldots & A_{1n} B \\ \vdots & \ddots & \vdots \\ A_{m1} B & \ldots & A_{mn} B \end{pmatrix}^* = \begin{pmatrix} A_{11}^* B^* & \ldots & A_{1n}^* B^* \\ \vdots & \ddots & \vdots \\ A_{m1}^* B^* & \ldots & A_{mn}^* B^* \end{pmatrix} = A^* \otimes B^*.$$
For the transpose, the $(i, j)$ block of the transpose is the transpose of the $(j, i)$ block:
$$(A \otimes B)^T = \begin{pmatrix} A_{11} B & \ldots & A_{1n} B \\ \vdots & \ddots & \vdots \\ A_{m1} B & \ldots & A_{mn} B \end{pmatrix}^T = \begin{pmatrix} A_{11} B^T & \ldots & A_{m1} B^T \\ \vdots & \ddots & \vdots \\ A_{1n} B^T & \ldots & A_{mn} B^T \end{pmatrix} = A^T \otimes B^T.$$
The relation for the distributivity of the Hermitian conjugate over the tensor product then follows from the former two relations:
$$(A \otimes B)^\dagger = ((A \otimes B)^T)^* = (A^T \otimes B^T)^* = (A^T)^* \otimes (B^T)^* = A^\dagger \otimes B^\dagger.$$
Exercise 2.29
Show that the tensor product of two unitary operators is unitary.
Solution
Concepts Involved: Linear Algebra, Unitary Operators, Tensor Products
Suppose $A, B$ are unitary. Then $A^\dagger A = I$ and $B^\dagger B = I$. Using the result of Exercise 2.28 (together with the mixed-product property $(A \otimes B)(C \otimes D) = AC \otimes BD$), we then have that:
$$(A \otimes B)^\dagger (A \otimes B) = (A^\dagger \otimes B^\dagger)(A \otimes B) = A^\dagger A \otimes B^\dagger B = I \otimes I = I,$$
so $A \otimes B$ is unitary.
Exercise 2.30
Show that the tensor product of two Hermitian operators is Hermitian.
Solution
Concepts Involved: Linear Algebra, Hermitian Operators, Tensor Products
Suppose $A, B$ are Hermitian. Then $A^\dagger = A$ and $B^\dagger = B$. Using the result of Exercise 2.28, we have:
$$(A \otimes B)^\dagger = A^\dagger \otimes B^\dagger = A \otimes B,$$
so $A \otimes B$ is Hermitian.
Exercise 2.31
Show that the tensor product of two positive operators is positive.
Solution
Concepts Involved: Linear Algebra, Positive Operators
Suppose $A, B$ are positive operators, so that $\langle v|A|v\rangle \geq 0$ and $\langle w|B|w\rangle \geq 0$ for all $|v\rangle, |w\rangle$. For any product vector $|v\rangle \otimes |w\rangle$ we then have:
$$\left( |v\rangle \otimes |w\rangle, \, (A \otimes B)(|v\rangle \otimes |w\rangle) \right) = \langle v|A|v\rangle \langle w|B|w\rangle \geq 0.$$
For general (possibly entangled) vectors, note that positive operators are Hermitian with nonnegative eigenvalues; writing $A = \sum_i \lambda_i |i\rangle\langle i|$ and $B = \sum_j \mu_j |j\rangle\langle j|$ with $\lambda_i, \mu_j \geq 0$, the operator $A \otimes B$ has spectral decomposition with nonnegative eigenvalues $\lambda_i \mu_j$, and is therefore positive.
Exercise 2.32
Show that the tensor product of two projectors is a projector.
Solution
Concepts Involved: Linear Algebra, Projectors
Let $P_1, P_2$ be projectors, so that $P_1^2 = P_1$ and $P_2^2 = P_2$ by Exercise 2.16. Therefore:
$$(P_1 \otimes P_2)^2 = (P_1 \otimes P_2)(P_1 \otimes P_2) = P_1^2 \otimes P_2^2 = P_1 \otimes P_2,$$
so $P_1 \otimes P_2$ is a projector.
Exercise 2.33
The Hadamard operator on one qubit may be written as
$$H = \frac{1}{\sqrt{2}} \left[ (|0\rangle + |1\rangle)\langle 0| + (|0\rangle - |1\rangle)\langle 1| \right].$$
Show explicitly that the Hadamard transform on $n$ qubits, $H^{\otimes n}$, may be written as
$$H^{\otimes n} = \frac{1}{\sqrt{2^n}} \sum_{x, y} (-1)^{x \cdot y} |x\rangle\langle y|.$$
Write out an explicit matrix representation for $H^{\otimes 2}$.
Solution
Concepts Involved: Linear algebra, Matrix Representation of Operators, Outer Products.
Looking at the form of the Hadamard operator on one qubit, we observe that:
$$H = \frac{1}{\sqrt{2}} \left( |0\rangle\langle 0| + |0\rangle\langle 1| + |1\rangle\langle 0| - |1\rangle\langle 1| \right).$$
Since the sign of the $|x\rangle\langle y|$ term is negative only when $x = y = 1$, we can write this as:
$$H = \frac{1}{\sqrt{2}} \sum_{x, y} (-1)^{x y} |x\rangle\langle y|,$$
where $x, y$ run over 0 and 1. Taking the $n$-fold tensor product of this expression, we get:
$$H^{\otimes n} = \left( \frac{1}{\sqrt{2}} \sum_{x_1, y_1} (-1)^{x_1 y_1} |x_1\rangle\langle y_1| \right) \otimes \cdots \otimes \left( \frac{1}{\sqrt{2}} \sum_{x_n, y_n} (-1)^{x_n y_n} |x_n\rangle\langle y_n| \right) = \frac{1}{\sqrt{2^n}} \sum_{x, y} (-1)^{x \cdot y} |x\rangle\langle y|,$$
where in the last equality $x = x_1 \ldots x_n$ and $y = y_1 \ldots y_n$ run over $n$-bit strings, $|x\rangle = |x_1\rangle \otimes \cdots \otimes |x_n\rangle$, and the signs combine as $\prod_i (-1)^{x_i y_i} = (-1)^{\sum_i x_i y_i} = (-1)^{x \cdot y}$.
Now explicitly writing $H^{\otimes 2}$, we have:
$$H^{\otimes 2} = \frac{1}{\sqrt{2^2}} \sum_{x, y} (-1)^{x \cdot y} |x\rangle\langle y| \cong \frac{1}{2} \begin{pmatrix} 1 & 1 & 1 & 1 \\ 1 & -1 & 1 & -1 \\ 1 & 1 & -1 & -1 \\ 1 & -1 & -1 & 1 \end{pmatrix}.$$
Note that here $x, y$ are binary strings of length 2; the sum runs through all pairwise combinations of $x, y \in \{00, 01, 10, 11\}$.
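The closed form can be checked against the explicit Kronecker power; a minimal sketch (here the indices x, y are integers whose binary digits encode the bit strings):

```python
import numpy as np
from functools import reduce

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
n = 3
Hn = reduce(np.kron, [H] * n)

# Entry (x, y) should be (-1)^{x.y} / 2^{n/2}, with x.y the bitwise dot product.
dot = lambda x, y: bin(x & y).count("1")
closed = np.array([[(-1) ** dot(x, y) for y in range(2 ** n)]
                   for x in range(2 ** n)]) / 2 ** (n / 2)
assert np.allclose(Hn, closed)
```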
Exercise 2.34
Find the square root and logarithm of the matrix
$$\begin{pmatrix} 4 & 3 \\ 3 & 4 \end{pmatrix}.$$
Solution
Concepts Involved: Linear Algebra, Spectral Decomposition, Operator Functions
We begin by diagonalizing the matrix (which we call $A$) so as to be able to apply the definition of operator functions. By inspection, $A$ is Hermitian as it is equal to its conjugate transpose, so the spectral decomposition exists. Solving for the eigenvalues, we consider the characteristic equation:
$$\det(A - \lambda I) = 0 \implies \det \begin{pmatrix} 4 - \lambda & 3 \\ 3 & 4 - \lambda \end{pmatrix} = 0 \implies (4 - \lambda)^2 - 9 = 0 \implies \lambda^2 - 8\lambda + 7 = 0.$$
Using the quadratic formula, we get $\lambda_1 = 1$, $\lambda_2 = 7$. Using this to find the eigenvectors of the matrix, we have:
$$\begin{pmatrix} 4 - 1 & 3 \\ 3 & 4 - 1 \end{pmatrix} \begin{pmatrix} v_{11} \\ v_{12} \end{pmatrix} = 0 \implies v_{11} = 1, \; v_{12} = -1$$
$$\begin{pmatrix} 4 - 7 & 3 \\ 3 & 4 - 7 \end{pmatrix} \begin{pmatrix} v_{21} \\ v_{22} \end{pmatrix} = 0 \implies v_{21} = 1, \; v_{22} = 1$$
Hence our normalized eigenvectors are:
$$|v_1\rangle = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ -1 \end{pmatrix}, \qquad |v_2\rangle = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ 1 \end{pmatrix},$$
so $A = |v_1\rangle\langle v_1| + 7 |v_2\rangle\langle v_2|$. Applying the definition of an operator function, $f(A) = \sum_i f(\lambda_i) |v_i\rangle\langle v_i|$, we obtain:
$$\sqrt{A} = \sqrt{1} \, |v_1\rangle\langle v_1| + \sqrt{7} \, |v_2\rangle\langle v_2| = \frac{1}{2} \begin{pmatrix} 1 + \sqrt{7} & \sqrt{7} - 1 \\ \sqrt{7} - 1 & 1 + \sqrt{7} \end{pmatrix}$$
$$\log(A) = \log(1) \, |v_1\rangle\langle v_1| + \log(7) \, |v_2\rangle\langle v_2| = \frac{\log 7}{2} \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}.$$
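SciPy's matrix functions provide an independent check of these answers; a minimal sketch:

```python
import numpy as np
from scipy.linalg import sqrtm, logm

A = np.array([[4.0, 3.0], [3.0, 4.0]])
r7, l7 = np.sqrt(7), np.log(7)

assert np.allclose(sqrtm(A), np.array([[1 + r7, r7 - 1],
                                       [r7 - 1, 1 + r7]]) / 2)
assert np.allclose(logm(A), np.array([[l7, l7],
                                      [l7, l7]]) / 2)
```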
Exercise 2.35: Exponential of the Pauli matrices

Let $\vec{v}$ be any real, three-dimensional unit vector and $\theta$ a real number. Prove that
$$\exp(i \theta \, \vec{v} \cdot \vec{\sigma}) = \cos(\theta) I + i \sin(\theta) \, \vec{v} \cdot \vec{\sigma},$$
where $\vec{v} \cdot \vec{\sigma} \equiv \sum_{i=1}^{3} v_i \sigma_i$.
Solution
Concepts Involved: Linear Algebra, Spectral Decomposition, Operator Functions.
Recall that $\sigma_1 \equiv X$, $\sigma_2 \equiv Y$, and $\sigma_3 \equiv Z$, so that:
$$\vec{v} \cdot \vec{\sigma} = v_1 X + v_2 Y + v_3 Z = \begin{pmatrix} v_3 & v_1 - i v_2 \\ v_1 + i v_2 & -v_3 \end{pmatrix}.$$
In order to compute the complex exponential of this matrix, we will want to find its spectral decomposition. Using the characteristic equation to find the eigenvalues, we have:
$$\det(\vec{v} \cdot \vec{\sigma} - \lambda I) = 0 \implies \det \begin{pmatrix} v_3 - \lambda & v_1 - i v_2 \\ v_1 + i v_2 & -v_3 - \lambda \end{pmatrix} = 0$$
$$\implies (v_3 - \lambda)(-v_3 - \lambda) - (v_1 - i v_2)(v_1 + i v_2) = 0$$
$$\implies \lambda^2 - v_3^2 - v_1^2 - v_2^2 = \lambda^2 - (v_1^2 + v_2^2 + v_3^2) = 0$$
$$\implies \lambda^2 - 1 = 0$$
$$\implies \lambda_1 = 1, \; \lambda_2 = -1,$$
where in the second-to-last implication we use the fact that $\vec{v}$ is a unit vector. Letting $|v_1\rangle, |v_2\rangle$ be the associated (orthonormal) eigenvectors, $\vec{v} \cdot \vec{\sigma}$ has spectral decomposition:
$$\vec{v} \cdot \vec{\sigma} = |v_1\rangle\langle v_1| - |v_2\rangle\langle v_2|.$$
Applying the definition of the operator exponential, we then have:
$$\exp(i \theta \, \vec{v} \cdot \vec{\sigma}) = e^{i\theta} |v_1\rangle\langle v_1| + e^{-i\theta} |v_2\rangle\langle v_2| = \cos(\theta) \left( |v_1\rangle\langle v_1| + |v_2\rangle\langle v_2| \right) + i \sin(\theta) \left( |v_1\rangle\langle v_1| - |v_2\rangle\langle v_2| \right) = \cos(\theta) I + i \sin(\theta) \, \vec{v} \cdot \vec{\sigma},$$
where in the last line we use the completeness relation and the spectral decomposition.
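The identity can also be verified numerically for a random unit vector; a minimal sketch using scipy.linalg.expm:

```python
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

rng = np.random.default_rng(3)
v = rng.standard_normal(3)
v /= np.linalg.norm(v)               # real, three-dimensional unit vector
theta = 0.7
vs = v[0] * X + v[1] * Y + v[2] * Z  # v . sigma

lhs = expm(1j * theta * vs)
rhs = np.cos(theta) * np.eye(2) + 1j * np.sin(theta) * vs
assert np.allclose(lhs, rhs)
```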
Exercise 2.36
Show that the Pauli matrices except for I have trace zero.
Solution
Concepts Involved: Linear Algebra, Trace.
We have that:
$$\operatorname{tr}(X) = \operatorname{tr} \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} = 0 + 0 = 0$$
$$\operatorname{tr}(Y) = \operatorname{tr} \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix} = 0 + 0 = 0$$
$$\operatorname{tr}(Z) = \operatorname{tr} \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} = 1 - 1 = 0$$
Exercise 2.37: Cyclic property of the trace
Show that the trace is cyclic, $\operatorname{tr}(AB) = \operatorname{tr}(BA)$.
Solution
Concepts Involved: Linear Algebra, Trace.
Let $A, B$ be linear operators. Then $C = AB$ has matrix representation with entries $C_{ij} = \sum_k A_{ik} B_{kj}$, and $D = BA$ has matrix representation with entries $D_{ij} = \sum_k B_{ik} A_{kj}$. We then have that:
$$\operatorname{tr}(AB) = \operatorname{tr}(C) = \sum_i C_{ii} = \sum_i \sum_k A_{ik} B_{ki} = \sum_k \sum_i B_{ki} A_{ik} = \sum_k D_{kk} = \operatorname{tr}(D) = \operatorname{tr}(BA).$$
Exercise 2.38: Linearity of the trace

Show that the trace is linear: $\operatorname{tr}(A + B) = \operatorname{tr}(A) + \operatorname{tr}(B)$ and $\operatorname{tr}(zA) = z \operatorname{tr}(A)$.
Solution
Concepts Involved: Linear Algebra, Trace.
$$\operatorname{tr}(A + B) = \sum_i (A + B)_{ii} = \sum_i A_{ii} + \sum_i B_{ii} = \operatorname{tr}(A) + \operatorname{tr}(B)$$
$$\operatorname{tr}(zA) = \sum_i (zA)_{ii} = z \sum_i A_{ii} = z \operatorname{tr}(A)$$
Exercise 2.39: The Hilbert-Schmidt inner product on operators
The set $L_V$ of linear operators on a Hilbert space $V$ is obviously a vector space: the sum of two linear operators is a linear operator, $zA$ is a linear operator if $A$ is a linear operator and $z$ is a complex number, and there is a zero element $0$. An important additional result is that the vector space $L_V$ can be given a natural inner product structure, turning it into a Hilbert space.
(1) Show that the function $(\cdot, \cdot)$ on $L_V \times L_V$ defined by
$$(A, B) \equiv \operatorname{tr}(A^\dagger B)$$
is an inner product function. This inner product is known as the Hilbert-Schmidt or trace inner product.
(2) If $V$ has $d$ dimensions, show that $L_V$ has dimension $d^2$.
(3) Find an orthonormal basis of Hermitian matrices for the Hilbert space $L_V$.
Solution
Concepts Involved: Linear Algebra, Trace, Inner Products, Hermitian Operators, Bases
(1) We show that $(\cdot, \cdot)$ satisfies the three properties of an inner product. Showing that it is linear in the second argument, we have that:
$$\left( A, \sum_i \lambda_i B_i \right) = \operatorname{tr}\left( A^\dagger \sum_i \lambda_i B_i \right) = \sum_i \lambda_i \operatorname{tr}(A^\dagger B_i) = \sum_i \lambda_i (A, B_i),$$
where in the second-to-last equality we use the result of Exercise 2.38. To see that it is conjugate-symmetric, we have that:
$$(A, B) = \operatorname{tr}(A^\dagger B) = \operatorname{tr}\left( (B^\dagger A)^\dagger \right) = \operatorname{tr}(B^\dagger A)^* = (B, A)^*,$$
using that $\operatorname{tr}(M^\dagger) = \operatorname{tr}(M)^*$. Finally, for positivity, writing out the entries gives:
$$(A, A) = \operatorname{tr}(A^\dagger A) = \sum_{i,j} A_{ji}^* A_{ji} = \sum_{i,j} |A_{ji}|^2 \geq 0,$$
with equality if and only if every entry of $A$ vanishes, i.e. $A = 0$.
(2) Suppose $V$ has $d$ dimensions. Then the elements of $L_V$, which are linear operators $A : V \to V$, have representations as $d \times d$ matrices. There are $d^2$ linearly independent such matrices (take the matrices with 1 in one of the $d^2$ entries and 0 elsewhere), so $L_V$ has $d^2$ linearly independent vectors and hence dimension $d^2$.
(3) As discussed in the previous part of the question, one possible basis for this vector space would be $|v_i\rangle\langle v_j|$, where the $|v_k\rangle$ form an orthonormal basis of $V$ and $i, j \in \{1, \ldots, d\}$. These of course are just the matrices with 1 in one entry and 0 elsewhere. It is easy to see that this is a basis, as for any $A \in L_V$ we can write $A = \sum_{ij} \lambda_{ij} |v_i\rangle\langle v_j|$. We can verify that these are orthonormal; suppose $|v_{i_1}\rangle\langle v_{j_1}| \neq |v_{i_2}\rangle\langle v_{j_2}|$. Then, we have that:
$$\left( |v_{i_1}\rangle\langle v_{j_1}|, |v_{i_2}\rangle\langle v_{j_2}| \right) = \operatorname{tr}\left( (|v_{i_1}\rangle\langle v_{j_1}|)^\dagger |v_{i_2}\rangle\langle v_{j_2}| \right) = \operatorname{tr}\left( |v_{j_1}\rangle\langle v_{i_1}|v_{i_2}\rangle\langle v_{j_2}| \right).$$
If $|v_{i_1}\rangle \neq |v_{i_2}\rangle$, then the above expression reduces to $\operatorname{tr}(0) = 0$. If $|v_{i_1}\rangle = |v_{i_2}\rangle$, then it follows that $|v_{j_1}\rangle \neq |v_{j_2}\rangle$ (else this would contradict $|v_{i_1}\rangle\langle v_{j_1}| \neq |v_{i_2}\rangle\langle v_{j_2}|$), and in this case we have that:
$$\left( |v_{i_1}\rangle\langle v_{j_1}|, |v_{i_2}\rangle\langle v_{j_2}| \right) = \operatorname{tr}\left( |v_{j_1}\rangle\langle v_{i_1}|v_{i_2}\rangle\langle v_{j_2}| \right) = \operatorname{tr}\left( |v_{j_1}\rangle\langle v_{j_2}| \right) = 0.$$
So we therefore have that the inner product of two non-identical elements in the basis is zero. Furthermore, we have that:
$$\left( |v_{i_1}\rangle\langle v_{j_1}|, |v_{i_1}\rangle\langle v_{j_1}| \right) = \operatorname{tr}\left( |v_{j_1}\rangle\langle v_{i_1}|v_{i_1}\rangle\langle v_{j_1}| \right) = \operatorname{tr}\left( |v_{j_1}\rangle\langle v_{j_1}| \right) = 1,$$
so we confirm that this basis is orthonormal. However, this basis is evidently not Hermitian, since for $i \neq j$, $(|v_i\rangle\langle v_j|)^\dagger = |v_j\rangle\langle v_i| \neq |v_i\rangle\langle v_j|$. To fix this, we can modify our basis slightly. We keep the diagonal entries $|v_i\rangle\langle v_i|$ as is (as these are indeed Hermitian!), but for the off-diagonals, we replace every pair of basis vectors $|v_i\rangle\langle v_j|, |v_j\rangle\langle v_i|$ (with $i \neq j$) with the Hermitian pair:
$$\frac{|v_i\rangle\langle v_j| + |v_j\rangle\langle v_i|}{\sqrt{2}}, \qquad i \, \frac{|v_i\rangle\langle v_j| - |v_j\rangle\langle v_i|}{\sqrt{2}}.$$
It now suffices to show that these new vectors (plus the diagonals) form a basis and are orthonormal. To see that these form a basis, observe that:
$$\frac{1}{\sqrt{2}} \cdot \frac{|v_i\rangle\langle v_j| + |v_j\rangle\langle v_i|}{\sqrt{2}} - \frac{i}{\sqrt{2}} \cdot i \, \frac{|v_i\rangle\langle v_j| - |v_j\rangle\langle v_i|}{\sqrt{2}} = |v_i\rangle\langle v_j|$$
$$\frac{1}{\sqrt{2}} \cdot \frac{|v_i\rangle\langle v_j| + |v_j\rangle\langle v_i|}{\sqrt{2}} + \frac{i}{\sqrt{2}} \cdot i \, \frac{|v_i\rangle\langle v_j| - |v_j\rangle\langle v_i|}{\sqrt{2}} = |v_j\rangle\langle v_i|$$
and since we know that the $|v_i\rangle\langle v_j|$ for all $i, j \in \{1, \ldots, d\}$ form a basis, this newly defined set of vectors must be a basis as well. Furthermore, since the new basis vectors are constructed from orthogonal $|v_i\rangle\langle v_j|$ with disjoint index pairs, the newly defined vectors are orthogonal to each other whenever $\{i_1, j_1\} \neq \{i_2, j_2\}$. The only things
left to check is that for any choice of $i \neq j$, the two vectors
$$\frac{|v_i\rangle\langle v_j| + |v_j\rangle\langle v_i|}{\sqrt{2}}, \qquad i \, \frac{|v_i\rangle\langle v_j| - |v_j\rangle\langle v_i|}{\sqrt{2}}$$
are orthogonal, and that these vectors are normalized. Checking the orthogonality, we have:
$$\left( \frac{|v_i\rangle\langle v_j| + |v_j\rangle\langle v_i|}{\sqrt{2}}, \; i \, \frac{|v_i\rangle\langle v_j| - |v_j\rangle\langle v_i|}{\sqrt{2}} \right) = \operatorname{tr}\left( \frac{|v_i\rangle\langle v_j| + |v_j\rangle\langle v_i|}{\sqrt{2}} \; i \, \frac{|v_i\rangle\langle v_j| - |v_j\rangle\langle v_i|}{\sqrt{2}} \right) = \frac{i}{2} \operatorname{tr}\left( |v_j\rangle\langle v_j| - |v_i\rangle\langle v_i| \right) = 0.$$
Checking the normalization, with $D \equiv |v_i\rangle\langle v_j| - |v_j\rangle\langle v_i|$ (so that $D^\dagger = -D$ and $D^\dagger D = |v_i\rangle\langle v_i| + |v_j\rangle\langle v_j|$):
$$\left( \frac{|v_i\rangle\langle v_j| + |v_j\rangle\langle v_i|}{\sqrt{2}}, \; \frac{|v_i\rangle\langle v_j| + |v_j\rangle\langle v_i|}{\sqrt{2}} \right) = \frac{1}{2} \operatorname{tr}\left( |v_i\rangle\langle v_i| + |v_j\rangle\langle v_j| \right) = 1$$
$$\left( i \, \frac{D}{\sqrt{2}}, \; i \, \frac{D}{\sqrt{2}} \right) = \frac{(-i)(i)}{2} \operatorname{tr}\left( D^\dagger D \right) = \frac{1}{2} \operatorname{tr}\left( |v_i\rangle\langle v_i| + |v_j\rangle\langle v_j| \right) = 1.$$
Counting $d$ diagonal matrices and $d^2 - d$ off-diagonal combinations, we have $d^2$ orthonormal Hermitian matrices, which therefore form an orthonormal basis of Hermitian matrices for $L_V$.
Exercise 2.40: Commutation relations for the Pauli matrices

Verify the commutation relations
$$[X, Y] = 2iZ; \qquad [Y, Z] = 2iX; \qquad [Z, X] = 2iY.$$
There is an elegant way of writing this using $\epsilon_{jkl}$, the antisymmetric tensor on three indices, for which $\epsilon_{jkl} = 0$ except for $\epsilon_{123} = \epsilon_{231} = \epsilon_{312} = 1$, and $\epsilon_{321} = \epsilon_{213} = \epsilon_{132} = -1$:
$$[\sigma_j, \sigma_k] = 2i \sum_{l=1}^{3} \epsilon_{jkl} \sigma_l.$$
Solution
Concepts Involved: Linear Algebra, Commutators.
We verify the proposed relations via computation in the computational basis:
$$[X, Y] = XY - YX = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix} - \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix} - \begin{pmatrix} -i & 0 \\ 0 & i \end{pmatrix} = 2iZ$$
$$[Y, Z] = YZ - ZY = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} - \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}\begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix} = \begin{pmatrix} 0 & i \\ i & 0 \end{pmatrix} - \begin{pmatrix} 0 & -i \\ -i & 0 \end{pmatrix} = 2iX$$
$$[Z, X] = ZX - XZ = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} - \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix} - \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} = 2iY$$
Exercise 2.41: Anticommutation relations for the Pauli matrices

Verify the anticommutation relations
$$\{\sigma_i, \sigma_j\} = 0,$$
where $i \neq j$ are both chosen from the set $\{1, 2, 3\}$. Also verify that ($i = 0, 1, 2, 3$)
$$\sigma_i^2 = I.$$
Solution
Concepts Involved: Linear Algebra, Anticommutators.
We again verify the proposed relations via computation in the computational basis:
$$\{X, Y\} = XY + YX = \begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix} + \begin{pmatrix} -i & 0 \\ 0 & i \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}$$
$$\{Y, Z\} = YZ + ZY = \begin{pmatrix} 0 & i \\ i & 0 \end{pmatrix} + \begin{pmatrix} 0 & -i \\ -i & 0 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}$$
$$\{Z, X\} = ZX + XZ = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix} + \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix},$$
where the individual products were computed in Exercise 2.40.
This proves the first claim, as $\{A, B\} = AB + BA = BA + AB = \{B, A\}$ and the other three relations are equivalent to the ones already proven. Verifying the second claim, we have:
$$I^2 = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$
$$X^2 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$
$$Y^2 = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}\begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$
$$Z^2 = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$
Remark: Note that we can write this result concisely as $\{\sigma_j, \sigma_k\} = 2\delta_{jk} I$ for $j, k \in \{1, 2, 3\}$.
Exercise 2.42
Verify that
$$AB = \frac{[A, B] + \{A, B\}}{2}.$$
Solution
Concepts Involved: Linear Algebra, Commutators, Anticommutators.

Expanding the definitions of the commutator and anticommutator:
$$\frac{[A, B] + \{A, B\}}{2} = \frac{(AB - BA) + (AB + BA)}{2} = \frac{2AB}{2} = AB.$$
Exercise 2.43
Show that for j, k = 1, 2, 3,
$$\sigma_j \sigma_k = \delta_{jk} I + i \sum_{l=1}^{3} \epsilon_{jkl} \sigma_l.$$
Solution
Concepts Involved: Linear Algebra, Commutators, Anticommutators.
Applying the results of Exercises 2.40, 2.41, and 2.42, we have:
$$\sigma_j \sigma_k = \frac{[\sigma_j, \sigma_k] + \{\sigma_j, \sigma_k\}}{2} = \frac{2i \sum_{l=1}^{3} \epsilon_{jkl} \sigma_l + 2\delta_{jk} I}{2} = \delta_{jk} I + i \sum_{l=1}^{3} \epsilon_{jkl} \sigma_l.$$
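All nine $(j, k)$ cases of this product formula can be checked in one loop; a minimal NumPy sketch, with the antisymmetric tensor built by hand:

```python
import numpy as np

sig = [np.array([[0, 1], [1, 0]], dtype=complex),     # sigma_1 = X
       np.array([[0, -1j], [1j, 0]]),                 # sigma_2 = Y
       np.array([[1, 0], [0, -1]], dtype=complex)]    # sigma_3 = Z

eps = np.zeros((3, 3, 3))
for j, k, l in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[j, k, l], eps[k, j, l] = 1, -1                # antisymmetric tensor

for j in range(3):
    for k in range(3):
        rhs = (j == k) * np.eye(2) + 1j * sum(eps[j, k, l] * sig[l]
                                              for l in range(3))
        assert np.allclose(sig[j] @ sig[k], rhs)
```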
Exercise 2.44

Suppose $[A, B] = 0$, $\{A, B\} = 0$, and $A$ is invertible. Show that $B$ must be $0$.

Solution

Concepts Involved: Linear Algebra, Commutators, Anticommutators.

Writing out the two given conditions:
$$[A, B] = AB - BA = 0, \qquad \{A, B\} = AB + BA = 0.$$
Adding the two equations gives:
$$2AB = 0 \implies AB = 0.$$
Multiplying both sides on the left by $A^{-1}$ (which exists since $A$ is invertible):
$$A^{-1} A B = A^{-1} 0 \implies IB = 0 \implies B = 0.$$
Exercise 2.45

Show that $[A, B]^\dagger = [B^\dagger, A^\dagger]$.

Solution

Concepts Involved: Linear Algebra, Commutators, Adjoints.

Expanding the commutator and using that the adjoint reverses products:
$$[A, B]^\dagger = (AB - BA)^\dagger = B^\dagger A^\dagger - A^\dagger B^\dagger = [B^\dagger, A^\dagger].$$
Exercise 2.46

Show that $[A, B] = -[B, A]$.

Solution

Concepts Involved: Linear Algebra, Commutators

Directly from the definition:
$$[A, B] = AB - BA = -(BA - AB) = -[B, A].$$
Exercise 2.47

Suppose $A$ and $B$ are Hermitian operators. Show that $i[A, B]$ is Hermitian.

Solution

Concepts Involved: Linear Algebra, Commutators, Hermitian Operators

Suppose $A, B$ are Hermitian. Using the results of Exercises 2.45 and 2.46, we have:
$$(i[A, B])^\dagger = -i [A, B]^\dagger = -i [B^\dagger, A^\dagger] = -i [B, A] = i [A, B],$$
so $i[A, B]$ is Hermitian.
Exercise 2.48
What is the polar decomposition of a positive matrix $P$? Of a unitary matrix $U$? Of a Hermitian matrix $H$?
Solution
Concepts Involved: Linear Algebra, Polar Decomposition, Positive Operators, Unitary Operators, Hermitian Operators
If $P$ is a positive matrix, then no calculation is required: $P = IP = PI$ is the polar decomposition (as $I$ is unitary and $P$ is positive). If $U$ is a unitary matrix, then $J = \sqrt{U^\dagger U} = \sqrt{I} = I$ and $K = \sqrt{U U^\dagger} = \sqrt{I} = I$, so the polar decomposition is $U = UI = IU$ (where $U$ is unitary and $I$ is positive). If $H$ is Hermitian, with spectral decomposition $H = \sum_i \lambda_i |i\rangle\langle i|$, we then have that:
$$J = \sqrt{H^\dagger H} = \sqrt{H^2} = \sqrt{\sum_i \lambda_i^2 |i\rangle\langle i|} = \sum_i |\lambda_i| \, |i\rangle\langle i|,$$
and $K = \sqrt{H H^\dagger} = \sum_i |\lambda_i| \, |i\rangle\langle i|$ in the same way. Taking $U = \sum_i \operatorname{sgn}(\lambda_i) |i\rangle\langle i|$ (with the convention $\operatorname{sgn}(0) \equiv 1$), which is unitary, the polar decomposition is:
$$H = U \sum_i |\lambda_i| \, |i\rangle\langle i| = \left( \sum_i |\lambda_i| \, |i\rangle\langle i| \right) U.$$
Exercise 2.49
Express the polar decomposition of a normal matrix in the outer product representation.
Solution
Concepts Involved: Linear Algebra, Polar Decomposition, Outer Products
Let $A$ be a normal matrix. Then $A$ has spectral decomposition $A = \sum_i \lambda_i |i\rangle\langle i|$. Therefore, we have that:
$$A^\dagger A = \sum_i \sum_{i'} \lambda_i^* \lambda_{i'} |i\rangle\langle i|i'\rangle\langle i'| = \sum_i \sum_{i'} \lambda_i^* \lambda_{i'} \delta_{ii'} |i\rangle\langle i'| = \sum_i |\lambda_i|^2 |i\rangle\langle i|,$$
so $J = \sqrt{A^\dagger A} = \sum_i |\lambda_i| \, |i\rangle\langle i|$, and $K = \sqrt{A A^\dagger} = \sum_i |\lambda_i| \, |i\rangle\langle i|$ identically. Writing each eigenvalue in polar form as $\lambda_i = |\lambda_i| e^{i\theta_i}$ (with $e^{i\theta_i} \equiv 1$ whenever $\lambda_i = 0$), the operator $U = \sum_i e^{i\theta_i} |i\rangle\langle i|$ is unitary, and we have the polar decomposition in the outer product representation:
$$A = UJ = KU = \left( \sum_i e^{i\theta_i} |i\rangle\langle i| \right) \left( \sum_i |\lambda_i| \, |i\rangle\langle i| \right) = \sum_i |\lambda_i| e^{i\theta_i} |i\rangle\langle i|.$$
Exercise 2.50
Find the left and right polar decompositions of the matrix
$$\begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}.$$
Solution
Concepts Involved: Linear Algebra, Polar Decomposition.

Let $A = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}$. We start with the left polar decomposition $A = UJ$, and hence find $J = \sqrt{A^\dagger A}$. In order to do this, we find the spectral decomposition of
$$A^\dagger A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix}.$$
Solving the characteristic equation:
$$\det(A^\dagger A - \lambda I) = 0 \implies \det \begin{pmatrix} 2 - \lambda & 1 \\ 1 & 1 - \lambda \end{pmatrix} = 0 \implies \lambda^2 - 3\lambda + 1 = 0 \implies \lambda_1 = \frac{3 + \sqrt{5}}{2}, \; \lambda_2 = \frac{3 - \sqrt{5}}{2}.$$
Solving for the eigenvectors, we have:
$$\begin{pmatrix} 2 - \frac{3+\sqrt{5}}{2} & 1 \\ 1 & 1 - \frac{3+\sqrt{5}}{2} \end{pmatrix} |v_1\rangle = 0 \implies |v_1\rangle = \begin{pmatrix} 1 + \sqrt{5} \\ 2 \end{pmatrix}$$
$$\begin{pmatrix} 2 - \frac{3-\sqrt{5}}{2} & 1 \\ 1 & 1 - \frac{3-\sqrt{5}}{2} \end{pmatrix} |v_2\rangle = 0 \implies |v_2\rangle = \begin{pmatrix} 1 - \sqrt{5} \\ 2 \end{pmatrix}$$
Normalizing, we get:
$$|v_1\rangle = \frac{1}{\sqrt{10 + 2\sqrt{5}}} \begin{pmatrix} 1 + \sqrt{5} \\ 2 \end{pmatrix}, \qquad |v_2\rangle = \frac{1}{\sqrt{10 - 2\sqrt{5}}} \begin{pmatrix} 1 - \sqrt{5} \\ 2 \end{pmatrix}.$$
Then:
$$J = \sqrt{\lambda_1} \, |v_1\rangle\langle v_1| + \sqrt{\lambda_2} \, |v_2\rangle\langle v_2| = \frac{1}{\sqrt{5}} \begin{pmatrix} 3 & 1 \\ 1 & 2 \end{pmatrix}.$$
The last equality is not completely trivial, but the algebra is tedious, so we invite the reader to use a symbolic calculator, as we have. We make the observation that:
$$A = UJ \implies U = AJ^{-1}.$$
So calculating $J^{-1}$, we have:
$$J^{-1} = \frac{1}{\sqrt{\lambda_1}} |v_1\rangle\langle v_1| + \frac{1}{\sqrt{\lambda_2}} |v_2\rangle\langle v_2| = \frac{1}{\sqrt{5}} \begin{pmatrix} 2 & -1 \\ -1 & 3 \end{pmatrix},$$
where we again have used the help of a symbolic calculator. Calculating $U$, we then have that:
$$U = AJ^{-1} = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix} \frac{1}{\sqrt{5}} \begin{pmatrix} 2 & -1 \\ -1 & 3 \end{pmatrix} = \frac{1}{\sqrt{5}} \begin{pmatrix} 2 & -1 \\ 1 & 2 \end{pmatrix}.$$
Hence the left polar decomposition of $A$ is given by:
$$A = UJ = \frac{1}{\sqrt{5}} \begin{pmatrix} 2 & -1 \\ 1 & 2 \end{pmatrix} \cdot \frac{1}{\sqrt{5}} \begin{pmatrix} 3 & 1 \\ 1 & 2 \end{pmatrix}.$$
We next solve for the right polar decomposition. We could repeat the procedure by solving for the spectral decomposition of $A A^\dagger$, but we take a shortcut; since the $K$ that satisfies:
$$A = KU$$
is unique, and $U$ is unitary, we can simply multiply both sides of the above equation on the right by $U^{-1} = U^\dagger$ to obtain $K$. Hence:
$$K = A U^\dagger = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix} \frac{1}{\sqrt{5}} \begin{pmatrix} 2 & 1 \\ -1 & 2 \end{pmatrix} = \frac{1}{\sqrt{5}} \begin{pmatrix} 2 & 1 \\ 1 & 3 \end{pmatrix}.$$
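scipy.linalg.polar computes both decompositions directly and agrees with the matrices found above; a minimal sketch (note scipy's side="right" corresponds to $A = UJ$, which the book calls the left polar decomposition):

```python
import numpy as np
from scipy.linalg import polar

A = np.array([[1.0, 0.0], [1.0, 1.0]])
U, J = polar(A, side="right")   # A = U J, with J positive
_, K = polar(A, side="left")    # A = K U, with K positive
s5 = np.sqrt(5)

assert np.allclose(U, np.array([[2, -1], [1, 2]]) / s5)
assert np.allclose(J, np.array([[3, 1], [1, 2]]) / s5)
assert np.allclose(K, np.array([[2, 1], [1, 3]]) / s5)
```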
Exercise 2.51
Verify that the Hadamard gate H is unitary.
Solution
Concepts Involved: Linear Algebra, Unitary Operators
We observe that:
$$H^\dagger H = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} \cdot \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = I,$$
so $H$ is unitary.

Remark: The above calculation also shows that $H$ is Hermitian ($H^\dagger = H$) and self-inverse ($H^2 = I$).
Exercise 2.52
Verify that $H^2 = I$.