Jacobi

The document discusses Jacobi's Method for diagonalizing symmetric matrices, focusing on transforming a matrix A into a diagonal matrix D through iterative applications of rotation matrices. It also introduces Householder Reduction as a more efficient method for larger symmetric matrices, resulting in a tridiagonal matrix that retains the same eigenvalues. The process involves constructing orthogonal matrices to achieve a Hessenberg form, which simplifies the computation of eigenvalues.


48. Jacobi's Method for Diagonalizing a Symmetric Matrix


Let $A$ be symmetric and $a_{pq}$ its largest (in absolute value) off-diagonal entry.
With $A_0 = A$, we want to find a matrix $U$ of the form
$$
U = \begin{pmatrix}
\cos\theta & \sin\theta & 0 & 0 \\
-\sin\theta & \cos\theta & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{pmatrix}
$$
(rows and columns indexed $p, q, i, j$; in the full $n \times n$ matrix, $U$ agrees with the identity outside rows and columns $p$ and $q$)

such that the matrix
$$
A_1 = U^T A U
$$
has $a^{(1)}_{pq} = 0$. Note that $U^T U = I$ and that $A_1$ is symmetric.


Choose $\theta$ such that $-\frac{\pi}{4} < \theta \le \frac{\pi}{4}$ and
$$
\tan 2\theta = \frac{2\sin\theta\cos\theta}{\cos^2\theta - \sin^2\theta} = \frac{2a_{pq}}{a_{qq} - a_{pp}}.
$$
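As a concrete check of this choice of $\theta$, here is a short numpy sketch (the function name `jacobi_angle` and the example matrix are mine, not from the notes): it computes $\theta$ from the formula above, handles the degenerate case $a_{qq} = a_{pp}$ by taking $\theta = \pm\pi/4$, builds the rotation $U$, and confirms that the targeted entry of $A_1 = U^T A U$ vanishes.

```python
import numpy as np

def jacobi_angle(A, p, q):
    """theta with tan(2*theta) = 2*a_pq / (a_qq - a_pp) and
    |theta| <= pi/4; the case a_qq == a_pp (tan 2*theta undefined)
    is sent to theta = pi/4 with the sign of a_pq."""
    if A[q, q] != A[p, p]:
        return 0.5 * np.arctan(2.0 * A[p, q] / (A[q, q] - A[p, p]))
    return np.copysign(np.pi / 4, A[p, q])

# One rotation on a 3x3 symmetric matrix; p, q picked by hand.
A = np.array([[4.0, 2.0, 0.5],
              [2.0, 1.0, 1.0],
              [0.5, 1.0, 3.0]])
p, q = 0, 1
theta = jacobi_angle(A, p, q)
c, s = np.cos(theta), np.sin(theta)
U = np.eye(3)
U[p, p], U[p, q], U[q, p], U[q, q] = c, s, -s, c
A1 = U.T @ A @ U
print(abs(A1[p, q]) < 1e-12)  # True: the targeted entry is annihilated
```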

Then, using the abbreviations $c = \cos\theta$ and $s = \sin\theta$, we have
$$
A_1 = \begin{pmatrix}
c^2 a_{pp} - 2cs\,a_{pq} + s^2 a_{qq} & cs(a_{pp} - a_{qq}) + (c^2 - s^2)a_{pq} & c\,a_{pi} - s\,a_{qi} & c\,a_{pj} - s\,a_{qj} \\
cs(a_{pp} - a_{qq}) + (c^2 - s^2)a_{pq} & s^2 a_{pp} + 2cs\,a_{pq} + c^2 a_{qq} & s\,a_{pi} + c\,a_{qi} & s\,a_{pj} + c\,a_{qj} \\
c\,a_{ip} - s\,a_{iq} & s\,a_{ip} + c\,a_{iq} & a_{ii} & a_{ij} \\
c\,a_{jp} - s\,a_{jq} & s\,a_{jp} + c\,a_{jq} & a_{ji} & a_{jj}
\end{pmatrix}
$$
(rows and columns indexed $p, q, i, j$).

By the choice of $\theta$ we have $a^{(1)}_{pq} = 0$. We may iterate the process, forming a sequence
of matrices $A_0, A_1, A_2, \ldots$ until the off-diagonal entries are sufficiently small. The
limit of this sequence is a diagonal matrix $D = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$.
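The full iteration described above can be sketched in numpy as follows (a minimal classical Jacobi loop using the largest-off-diagonal pivot of these notes; the function name, tolerance, and iteration cap are my choices):

```python
import numpy as np

def jacobi_eigenvalues(A, tol=1e-12, max_rotations=1000):
    """Repeatedly zero the largest off-diagonal entry until all
    off-diagonal entries fall below tol; the diagonal then holds
    approximations of the eigenvalues."""
    A = A.astype(float).copy()
    n = A.shape[0]
    for _ in range(max_rotations):
        # locate the largest off-diagonal entry in absolute value
        off = np.abs(A - np.diag(np.diag(A)))
        p, q = np.unravel_index(np.argmax(off), off.shape)
        if off[p, q] < tol:
            break
        if A[q, q] != A[p, p]:
            theta = 0.5 * np.arctan(2.0 * A[p, q] / (A[q, q] - A[p, p]))
        else:
            theta = np.copysign(np.pi / 4, A[p, q])
        c, s = np.cos(theta), np.sin(theta)
        U = np.eye(n)
        U[p, p], U[p, q], U[q, p], U[q, q] = c, s, -s, c
        A = U.T @ A @ U        # a_pq is now zero; A stays symmetric
    return np.sort(np.diag(A))

A = np.array([[4.0, 2.0, 0.5],
              [2.0, 1.0, 1.0],
              [0.5, 1.0, 3.0]])
print(np.allclose(jacobi_eigenvalues(A),
                  np.linalg.eigvalsh(A)))  # True: matches reference solver
```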

49. Householder Reduction


For larger symmetric matrices, say bigger than 10 × 10, there is a more efficient
method for finding the eigenvalues. This method is somewhat more complicated
than the Jacobi algorithm, but it is widely used. It is based on the following result.

Householder Reduction Theorem. For any square matrix $A$ there exists an
orthogonal matrix $U$ such that $B = U^T A U$ has $b_{ij} = 0$ for $j \le i - 2$, something
like
$$
U^T A U = \begin{pmatrix}
* & * & * & * & * \\
* & * & * & * & * \\
0 & * & * & * & * \\
0 & 0 & * & * & * \\
0 & 0 & 0 & * & *
\end{pmatrix}.
$$
This is called Hessenberg form. If $A$ is symmetric, then $B$ is tridiagonal and
symmetric.
Outline of proof. We will define $n-2$ orthogonal matrices $P_1, \ldots, P_{n-2}$. Each of
these will have the special form
$$
P_i = I - 2 w_i w_i^T
$$
for a suitably chosen vector $w_i$ of length 1. Let
$$
A_1 = A, \quad A_2 = P_1 A_1 P_1^T, \quad \ldots, \quad A_{n-1} = P_{n-2} A_{n-2} P_{n-2}^T.
$$
Since $P_i^T = P_i^{-1}$, these matrices are all similar and hence have the same eigenvalues.
Moreover,
$$
A_{n-1} = P_{n-2} \cdots P_1 \, A \, P_1^T \cdots P_{n-2}^T = U A U^T,
$$
where $U = P_{n-2} \cdots P_1$ is orthogonal.

If we choose the $w_i$'s correctly, we can get $B = A_{n-1}$ to have zeros below the
subdiagonal. $\square$
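The proof's building block can be verified numerically: for any unit vector $w$, the reflector $P = I - 2 w w^T$ is symmetric and orthogonal (indeed involutory, $P^2 = I$), which is exactly why $P_i^T = P_i^{-1}$ above. A small numpy check (the random seed and dimension are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.standard_normal(5)
w /= np.linalg.norm(w)                 # normalize to length 1
P = np.eye(5) - 2.0 * np.outer(w, w)   # Householder reflector

print(np.allclose(P, P.T))             # True: symmetric
print(np.allclose(P @ P, np.eye(5)))   # True: orthogonal and P^2 = I
```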
If $A$ is symmetric, then after the Householder reduction we have the tridiagonal
matrix
$$
A_{n-1} = \begin{pmatrix}
a_1 & b_1 & & & 0 \\
b_1 & a_2 & b_2 & & \\
& \ddots & \ddots & \ddots & \\
& & b_{n-2} & a_{n-1} & b_{n-1} \\
0 & & & b_{n-1} & a_n
\end{pmatrix}.
$$
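The whole construction can be sketched in numpy (a minimal implementation of the $n-2$ reflector sweeps from the proof outline; the helper name and the `copysign` sign choice, used to avoid cancellation when forming the reflector, are my own): it reduces a random symmetric matrix to the tridiagonal form above and confirms the eigenvalues are unchanged.

```python
import numpy as np

def householder_tridiagonal(A):
    """Reduce a symmetric matrix to tridiagonal form by n-2
    Householder similarity transforms A -> P A P."""
    A = A.astype(float).copy()
    n = A.shape[0]
    for k in range(n - 2):
        x = A[k + 1:, k]
        # reflect x onto a multiple of e_1; copysign avoids cancellation
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        norm_v = np.linalg.norm(v)
        if norm_v == 0.0:          # column already zero below the subdiagonal
            continue
        w = np.zeros(n)
        w[k + 1:] = v / norm_v     # the unit vector w_k of the proof
        P = np.eye(n) - 2.0 * np.outer(w, w)
        A = P @ A @ P.T            # P is symmetric, so P.T == P
    return A

rng = np.random.default_rng(2)
M = rng.standard_normal((5, 5))
S = M + M.T                         # a random symmetric matrix
T = householder_tridiagonal(S)

print(np.allclose(np.tril(T, -2), 0))  # True: zeros below the subdiagonal
print(np.allclose(np.triu(T, 2), 0))   # True: zeros above, so tridiagonal
print(np.allclose(np.linalg.eigvalsh(T),
                  np.linalg.eigvalsh(S)))  # True: same eigenvalues
```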
