Lecture 3: INNER PRODUCT SPACES
Ioana Luca
UPB - Dept. Metode si Modele Matematice (Dept. of Mathematical Methods and Models)
2011-2012
Contents
1 Inner product spaces
2 Orthonormal bases
3 The orthogonal complement
1. Inner product spaces
Definition. Let V be a real vector space. A function $f : V \times V \to \mathbb{R}$ is called an inner product on V if it has the following properties (axioms of the inner product):

a) $f(u + v, w) = f(u, w) + f(v, w)$
b) $f(\lambda u, v) = \lambda f(u, v)$
c) $f(u, v) = f(v, u)$
d) $f(v, v) \ge 0$; $f(v, v) = 0 \iff v = 0$

for all $u, v, w \in V$ and for all $\lambda \in \mathbb{R}$. A vector space equipped with an inner product is an inner product space; a finite-dimensional inner product space is a Euclidean vector space.
1. Inner product spaces
Remark. V may be a complex vector space, but then axiom c) reads
$f(u, v) = \overline{f(v, u)}$.
In this case V is a (complex) inner product space, or unitary vector space; if, moreover, V is finite-dimensional, it is called a complex Euclidean vector space.

Remark. Notations: $f(u, v) \equiv u \cdot v \equiv \langle u, v \rangle$. With the dot notation the axioms read:

a) $(u + v) \cdot w = u \cdot w + v \cdot w$
b) $(\lambda u) \cdot v = \lambda (u \cdot v)$
c) $u \cdot v = v \cdot u$ for V over $\mathbb{R}$, and $u \cdot v = \overline{v \cdot u}$ for V over $\mathbb{C}$
d) $v \cdot v \ge 0$; $v \cdot v = 0 \iff v = 0$
1. Inner product spaces
Examples

$\mathbb{R}^n$: $x \equiv (x_1, \dots, x_n)$, $y \equiv (y_1, \dots, y_n)$, $x \cdot y \equiv x_1 y_1 + \dots + x_n y_n$

$\mathbb{C}^n$: $x \equiv (x_1, \dots, x_n)$, $y \equiv (y_1, \dots, y_n)$, $x \cdot y \equiv x_1 \overline{y_1} + \dots + x_n \overline{y_n}$

$M_{m,n}(\mathbb{R}) \simeq \mathbb{R}^{mn}$: $A \cdot B \equiv \operatorname{tr}(AB^T) = \sum_{i=1}^{m} \sum_{j=1}^{n} A_{ij} B_{ij}$; in $M_{m,1}(\mathbb{R}) \simeq \mathbb{R}^{m}$: $u \cdot v \equiv \operatorname{tr}(uv^T) = \sum_{i=1}^{m} u_i v_i$

$M_{m,n}(\mathbb{C})$: $A \cdot B \equiv \operatorname{tr}(A\overline{B}^T) = \sum_{i=1}^{m} \sum_{j=1}^{n} A_{ij} \overline{B_{ij}}$

$C([a,b])$, the continuous, real-valued functions on $[a,b]$: $\langle f, g \rangle \equiv \int_a^b f(x) g(x)\, dx$

$C([a,b])$, the continuous, complex-valued functions on $[a,b]$: $\langle f, g \rangle \equiv \int_a^b f(x) \overline{g(x)}\, dx$
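The matrix example can be checked numerically; a small sketch (using NumPy, with arbitrarily chosen matrices) confirming that the trace formula $\operatorname{tr}(AB^T)$ coincides with the entrywise sum $\sum_{i,j} A_{ij} B_{ij}$:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((3, 4))

# Frobenius inner product, defined via the trace ...
ip_trace = np.trace(A @ B.T)
# ... and via the entrywise sum of products
ip_sum = np.sum(A * B)

assert np.isclose(ip_trace, ip_sum)
```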
1. Inner product spaces
$\mathbb{R}_n[X]$: $P \equiv p_0 + p_1 X + \dots + p_n X^n$, $Q \equiv q_0 + q_1 X + \dots + q_n X^n$, $P \cdot Q \equiv p_0 q_0 + p_1 q_1 + \dots + p_n q_n$

$\mathbb{R}[X]$: $P \cdot Q \equiv \int_{-1}^{1} P(x) Q(x)\, dx$

$V_3$, the space of free vectors: $u \cdot v \equiv \|u\|\,\|v\| \cos\theta$, $\theta \in [0, \pi]$, if $u \neq 0$ and $v \neq 0$; $u \cdot v \equiv 0$ if $u = 0$ or $v = 0$
1. Inner product spaces
$\ell^2(\mathbb{C})$, the space of square-summable complex sequences:
$\ell^2(\mathbb{C}) \equiv \{ (x_n)_{n \in \mathbb{N}} \mid x_n \in \mathbb{C}, \ \sum_{n=0}^{\infty} |x_n|^2 < \infty \}$;
for $x \equiv (x_n)_{n \in \mathbb{N}}$, $y \equiv (y_n)_{n \in \mathbb{N}}$: $x \cdot y \equiv \sum_{n=0}^{\infty} x_n \overline{y_n}$

Proposition (Properties of an inner product)
1) $u \cdot (\lambda v) = \lambda (u \cdot v)$ in the real case ($u \cdot (\lambda v) = \overline{\lambda}\,(u \cdot v)$ in the complex case), and $0 \cdot v = 0$
2) $u \cdot v = 0$ for all $v \in V \implies u = 0$
3) if $\{e_1, \dots, e_n\}$ is a basis of V, then $u \cdot e_i = 0$ for $i = 1, \dots, n \implies u = 0$
1. Inner product spaces
Definition
1) $\|v\| \equiv \sqrt{v \cdot v}$, the norm (induced by the inner product) of $v$
2) if $\|v\| = 1$, $v$ is called a versor or unit vector

Proposition
1) $\|v\| \ge 0$, and $\|v\| = 0 \iff v = 0$
2) $\|\lambda v\| = |\lambda|\,\|v\|$

Examples
$\mathbb{R}^n$: $\|x\| = \sqrt{x_1^2 + \dots + x_n^2}$
$\mathbb{C}^n$: $\|x\| = \sqrt{|x_1|^2 + \dots + |x_n|^2}$
1. Inner product spaces
$M_{m,n}(\mathbb{R})$: $\|A\| = \sqrt{\operatorname{tr}(AA^T)} = \sqrt{\sum_{i=1}^{m} \sum_{j=1}^{n} A_{ij}^2}$

$M_{m,n}(\mathbb{C})$: $\|A\| = \sqrt{\operatorname{tr}(A\overline{A}^T)} = \sqrt{\sum_{i,j} |A_{ij}|^2}$

$C([a,b])$, the continuous, real-valued functions on $[a,b]$: $\|f\| = \left( \int_a^b f(x)^2\, dx \right)^{1/2}$

$C([a,b])$, the continuous, complex-valued functions on $[a,b]$: $\|f\| = \left( \int_a^b |f(x)|^2\, dx \right)^{1/2}$

$V_3$: $\|v\| = |v|$; $i$, $j$, $k$ are unit vectors.

If $v \neq 0$, then $\dfrac{1}{\|v\|}\, v$ is a unit vector.
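As a quick numerical illustration (a NumPy sketch with an arbitrarily chosen matrix), the Frobenius norm via the trace formula agrees with the entrywise formula, and dividing a nonzero vector by its norm yields a unit vector:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))

# ||A|| via the trace formula and via the entrywise sum of squares
norm_trace = np.sqrt(np.trace(A @ A.T))
norm_sum = np.sqrt(np.sum(A**2))
assert np.isclose(norm_trace, norm_sum)

# normalizing a nonzero vector gives a unit vector
v = np.array([3.0, 0.0, 4.0])
e = v / np.linalg.norm(v)
assert np.isclose(np.linalg.norm(e), 1.0)
```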
1. Inner product spaces
Proposition (Cauchy-Buniakovsky-Schwarz inequality)
$|u \cdot v| \le \|u\|\,\|v\|$, for all $u, v \in V$;
equality holds if and only if $u$ and $v$ are linearly dependent vectors.

Exercise. Prove this proposition by expanding $\big\| \|u\|^2 v - (v \cdot u)\, u \big\|^2 \ge 0$.

Examples

$\mathbb{R}^n$: $\left( \sum_{i=1}^{n} x_i y_i \right)^2 \le \left( \sum_{i=1}^{n} x_i^2 \right) \left( \sum_{i=1}^{n} y_i^2 \right)$

$C([a,b])$: $\left( \int_a^b f(x) g(x)\, dx \right)^2 \le \left( \int_a^b f(x)^2\, dx \right) \left( \int_a^b g(x)^2\, dx \right)$

$\mathbb{C}^n$: $\left| \sum_{i=1}^{n} x_i \overline{y_i} \right|^2 \le \left( \sum_{i=1}^{n} |x_i|^2 \right) \left( \sum_{i=1}^{n} |y_i|^2 \right)$
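A numerical spot-check of the inequality (a NumPy sketch: random vectors, plus the equality case for linearly dependent vectors):

```python
import numpy as np

rng = np.random.default_rng(2)
u = rng.standard_normal(5)
v = rng.standard_normal(5)

# |u . v| <= ||u|| ||v||
assert abs(u @ v) <= np.linalg.norm(u) * np.linalg.norm(v)

# equality holds when u and v are linearly dependent
w = -2.5 * u
assert np.isclose(abs(u @ w), np.linalg.norm(u) * np.linalg.norm(w))
```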
1. Inner product spaces
Definition
1) The angle between the non-zero vectors $u, v$: the number $\theta \in [0, \pi]$ with $\cos\theta = \dfrac{u \cdot v}{\|u\|\,\|v\|}$.
2) If $u \cdot v = 0$, the vectors $u, v$ are orthogonal.

Examples. In $\mathbb{R}^n$ the vectors of the standard basis are mutually orthogonal: $e_i \cdot e_j = 0$, $i \neq j$. In $V_3$: $i \cdot j = 0$, $i \cdot k = 0$, $j \cdot k = 0$.

Proposition
1) Triangle inequality: $\|u + v\| \le \|u\| + \|v\|$
2) Pythagoras' theorem: $u \cdot v = 0 \implies \|u - v\|^2 = \|u\|^2 + \|v\|^2$
3) Parallelogram law: $\|u + v\|^2 + \|u - v\|^2 = 2 \left( \|u\|^2 + \|v\|^2 \right)$
1. Inner product spaces
Proposition. Let $v_1, \dots, v_n$ be given in $V$ over $\mathbb{R}$ and denote $v_{ij} \equiv v_i \cdot v_j$. Then,
1) the Gram matrix $(v_{ij})$ is symmetric: $v_{ij} = v_{ji}$
2) $\{v_1, \dots, v_n\}$ is a linearly independent set $\iff \det(v_{ij}) \neq 0$.

Example. If $v_1, \dots, v_n$ are mutually orthogonal and $v_1, \dots, v_n \neq 0$, then $v_1, \dots, v_n$ are linearly independent: the Gram matrix $(v_i \cdot v_j)$ is
$\begin{pmatrix} v_1 \cdot v_1 & 0 & \dots & 0 \\ 0 & v_2 \cdot v_2 & \dots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \dots & v_n \cdot v_n \end{pmatrix}$,
with $v_1 \cdot v_1 \neq 0, \dots, v_n \cdot v_n \neq 0$.

Exercise. Note that $A^T A$ is the Gram matrix corresponding to the columns of $A \in M_{m,n}(\mathbb{R})$, and deduce that, if the columns of A are linearly independent, then $A^T A$ is invertible.
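A small numerical sketch (NumPy, with an example matrix chosen for illustration): for a matrix with linearly independent columns, the Gram matrix $A^T A$ is symmetric and invertible:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 4.0]])  # linearly independent columns

G = A.T @ A  # Gram matrix of the columns of A

assert np.allclose(G, G.T)                     # symmetric
assert abs(np.linalg.det(G)) > 1e-9            # non-zero Gram determinant
```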
2. Orthonormal bases
Definition. Let S be a non-empty subset of an inner product space V.
1) S is orthogonal if $u \cdot v = 0$ for all $u, v \in S$, $u \neq v$.
2) S is orthonormal if it is orthogonal and $\|v\| = 1$ for any $v \in S$.
3) An orthonormal basis of V is a basis which is an orthonormal set.

Examples. The trigonometric system
$1, \sin x, \cos x, \sin 2x, \cos 2x, \dots, \sin nx, \cos nx, \dots$
is an orthogonal set in $C([0, 2\pi])$, and
$\dfrac{1}{\sqrt{2\pi}}, \dfrac{\sin x}{\sqrt{\pi}}, \dfrac{\cos x}{\sqrt{\pi}}, \dots, \dfrac{\sin nx}{\sqrt{\pi}}, \dfrac{\cos nx}{\sqrt{\pi}}, \dots$
is orthonormal.
2. Orthonormal bases
The canonical basis of $\mathbb{R}^n$ is an orthonormal basis. The basis $\{i, j, k\}$ of $V_3$ is orthonormal.

If $\{e_1, \dots, e_n, \dots\}$ is an orthonormal set, then $e_i \cdot e_j = \delta_{ij}$.

Exercise. Show that $Q \in M_n(\mathbb{R})$ is orthogonal, i.e., $Q^{-1} = Q^T$ ($Q \in M_n(\mathbb{C})$ is unitary, i.e., $Q^{-1} = \overline{Q}^T$), if and only if its columns/rows are orthonormal.

Proposition
1) If S is orthogonal and $0 \notin S$, S is linearly independent.
2) Any orthonormal set is linearly independent.
3) If $\{e_1, \dots, e_n\}$ is an orthonormal basis of V over $\mathbb{R}$ (over $\mathbb{C}$) and $f_j = \sum_{i=1}^{n} C_{ij} e_i$, $j = 1, \dots, n$, then $\{f_1, \dots, f_n\}$ is an orthonormal basis $\iff$ the matrix $(C_{ij})$ is orthogonal (unitary).
2. Orthonormal bases
Proposition. If $\{e_1, \dots, e_n\}$ is an orthonormal basis of V over $\mathbb{R}$, and $u, v \in V$ have the representations
$u = \sum_{i=1}^{n} u_i e_i$, $v = \sum_{i=1}^{n} v_i e_i$,
then the components of $v$ with respect to this basis, the inner product $u \cdot v$, and the norm of $v$ are given by
$v_i = v \cdot e_i$, $u \cdot v = \sum_{i=1}^{n} u_i v_i$, $\|v\| = \left( \sum_{i=1}^{n} v_i^2 \right)^{1/2}$,
respectively.

Exercise. Restate and prove the preceding proposition for the case V over $\mathbb{C}$.
2. Orthonormal bases
Example. The components of $v = (1, 2, 1) \in \mathbb{R}^3$ with respect to the orthonormal basis
$e_1 = \left( -\tfrac{1}{3}, \tfrac{2}{3}, -\tfrac{2}{3} \right)$, $e_2 = \left( \tfrac{2}{3}, \tfrac{2}{3}, \tfrac{1}{3} \right)$, $e_3 = \left( -\tfrac{2}{3}, \tfrac{1}{3}, \tfrac{2}{3} \right)$
are as follows:
$v_1 = v \cdot e_1 = \tfrac{1}{3}$, $v_2 = v \cdot e_2 = \tfrac{7}{3}$, $v_3 = v \cdot e_3 = \tfrac{2}{3}$.

Proposition (Gram-Schmidt orthonormalization process). If $\{f_1, \dots, f_n, \dots\} \subset V$ is a linearly independent set, there exists an orthonormal set $\{e_1, \dots, e_n, \dots\} \subset V$ such that
$\mathrm{Sp}[e_1, \dots, e_n] = \mathrm{Sp}[f_1, \dots, f_n]$, for each $n = 1, 2, \dots$
It is deduced as follows:
2. Orthonormal bases
Step 1. One determines the orthogonal set $\{d_1, d_2, \dots, d_n, \dots\}$ of non-zero vectors:
$d_1 \equiv f_1$,
$d_2 \equiv f_2 + \alpha d_1$, where $\alpha$ satisfies $d_2 \cdot d_1 = 0$,
$d_3 \equiv f_3 + \beta_1 d_1 + \beta_2 d_2$, where $\beta_1, \beta_2$ satisfy $d_3 \cdot d_1 = 0$, $d_3 \cdot d_2 = 0$,
$\vdots$
so that $\mathrm{Sp}[d_1, \dots, d_n] = \mathrm{Sp}[f_1, \dots, f_n]$.

Step 2. One normalizes the vectors $d_1, d_2, \dots, d_n, \dots$:
$e_1 \equiv \dfrac{1}{\|d_1\|}\, d_1$, $e_2 \equiv \dfrac{1}{\|d_2\|}\, d_2$, $\dots$, $e_n \equiv \dfrac{1}{\|d_n\|}\, d_n$, $\dots$
2. Orthonormal bases
Example. In $\mathbb{R}^3$, starting from the basis $\{f_1, f_2, f_3\}$, where
$f_1 \equiv (1, 2, 2)$, $f_2 \equiv (1, 0, 1)$, $f_3 \equiv (5, 3, -7)$,
we find an orthonormal basis by using the Gram-Schmidt algorithm.

Step 1. $d_1 \equiv f_1$, $d_2 \equiv f_2 + \alpha d_1$, $d_3 \equiv f_3 + \beta_1 d_1 + \beta_2 d_2$, where $\alpha, \beta_1, \beta_2$ are such that $d_2 \cdot d_1 = 0$, $d_3 \cdot d_1 = 0$, $d_3 \cdot d_2 = 0$:
$\alpha = -\dfrac{f_2 \cdot d_1}{d_1 \cdot d_1} = -\dfrac{1}{3}$, $\beta_1 = -\dfrac{f_3 \cdot d_1}{d_1 \cdot d_1} = \dfrac{1}{3}$, $\beta_2 = -\dfrac{f_3 \cdot d_2}{d_2 \cdot d_2} = 1$.
Therefore,
$d_2 = f_2 - \tfrac{1}{3} d_1 = \left( \tfrac{2}{3}, -\tfrac{2}{3}, \tfrac{1}{3} \right)$, $d_3 = f_3 + \tfrac{1}{3} d_1 + d_2 = (6, 3, -6)$.

Step 2. The orthonormal basis consists of the vectors
$e_1 = \dfrac{d_1}{\|d_1\|} = \left( \tfrac{1}{3}, \tfrac{2}{3}, \tfrac{2}{3} \right)$, $e_2 = \dfrac{d_2}{\|d_2\|} = \left( \tfrac{2}{3}, -\tfrac{2}{3}, \tfrac{1}{3} \right)$, $e_3 = \dfrac{d_3}{\|d_3\|} = \left( \tfrac{2}{3}, \tfrac{1}{3}, -\tfrac{2}{3} \right)$.
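The two steps can be sketched in code; a minimal NumPy implementation of the algorithm (the function name and the sample input basis are assumptions for illustration):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a linearly independent list of vectors.

    Step 1: from each f, subtract its components along the previously
            built d's, so the new d is orthogonal to all of them.
    Step 2: normalize each d.
    """
    ds = []
    for f in vectors:
        d = np.array(f, dtype=float)
        for prev in ds:
            d -= (d @ prev) / (prev @ prev) * prev  # enforce d . prev = 0
        ds.append(d)
    return [d / np.linalg.norm(d) for d in ds]

# sample linearly independent set in R^3
e1, e2, e3 = gram_schmidt([(1, 2, 2), (1, 0, 1), (5, 3, -7)])

assert np.allclose(e1, [1/3, 2/3, 2/3])      # d1 = f1, normalized
assert np.isclose(np.linalg.norm(e2), 1.0)   # unit vectors
assert np.isclose(e1 @ e2, 0.0) and np.isclose(e2 @ e3, 0.0)
```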
2. Orthonormal bases
Proposition (The QR decomposition). Suppose the columns of $A \in M_{m,n}(\mathbb{R})$ are linearly independent. Then $A = QR$, where the columns of $Q \in M_{m,n}(\mathbb{R})$ are orthonormal, and $R \in M_n(\mathbb{R})$ is an invertible upper triangular matrix.

Proof. Identify $M_{m,1}(\mathbb{R}) \simeq \mathbb{R}^m$ and consider the columns of A:
$v_1 \equiv (A_{11}, A_{21}, \dots, A_{m1})$, $v_2 \equiv (A_{12}, A_{22}, \dots, A_{m2})$, $\dots$, $v_n \equiv (A_{1n}, A_{2n}, \dots, A_{mn})$.
Since $v_1, \dots, v_n$ are linearly independent, by using the Gram-Schmidt algorithm one obtains an orthonormal set $\{e_1, \dots, e_n\}$ such that
$\mathrm{Sp}[e_1, \dots, e_k] = \mathrm{Sp}[v_1, \dots, v_k]$, $k = 1, \dots, n$.
In particular this shows that $e_{k+1}, \dots, e_n$ are orthogonal to $v_1, \dots, v_k$. Thus, $v_1, \dots, v_n \in \mathrm{Sp}[e_1, \dots, e_n]$ have the following representations with respect to the orthonormal basis $\{e_1, \dots, e_n\}$ of $\mathrm{Sp}[e_1, \dots, e_n]$:
2. Orthonormal bases
$v_1 = \sum_{i=1}^{n} (v_1 \cdot e_i) e_i = (v_1 \cdot e_1) e_1$,
$v_2 = \sum_{i=1}^{n} (v_2 \cdot e_i) e_i = (v_2 \cdot e_1) e_1 + (v_2 \cdot e_2) e_2$,
$\vdots$
$v_n = \sum_{i=1}^{n} (v_n \cdot e_i) e_i = (v_n \cdot e_1) e_1 + \dots + (v_n \cdot e_n) e_n$.

This implies $A = QR$, where
$Q = \begin{pmatrix} e_1 & \dots & e_n \end{pmatrix}$, $R = \begin{pmatrix} v_1 \cdot e_1 & v_2 \cdot e_1 & \dots & v_n \cdot e_1 \\ 0 & v_2 \cdot e_2 & \dots & v_n \cdot e_2 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \dots & v_n \cdot e_n \end{pmatrix}.$

Moreover,
2. Orthonormal bases
$v_1 \cdot e_1 = \dfrac{d_1 \cdot d_1}{\|d_1\|} = \|d_1\| \neq 0$, $v_2 \cdot e_2 = \dfrac{(d_2 - \alpha d_1) \cdot d_2}{\|d_2\|} = \|d_2\| \neq 0$, $\dots$,
proving that the matrix R is invertible.

Example
$A \equiv \begin{pmatrix} 1 & 2 \\ 0 & 1 \\ 1 & 4 \end{pmatrix}$
The columns $v_1 \equiv (1, 0, 1)$, $v_2 \equiv (2, 1, 4)$ are linearly independent, and hence the factorization $A = QR$ does exist. We start from $\{v_1, v_2\}$ and use the Gram-Schmidt algorithm:
$d_1 \equiv v_1$, $d_2 \equiv v_2 + \alpha d_1$, with $\alpha$ such that $d_2 \cdot d_1 = 0$.
2. Orthonormal bases
This gives $\alpha = -\dfrac{v_2 \cdot d_1}{d_1 \cdot d_1} = -3$, implying
$d_1 = (1, 0, 1)$, $d_2 = (-1, 1, 1)$,
$e_1 = \dfrac{1}{\|d_1\|}\, d_1 = \dfrac{1}{\sqrt{2}} (1, 0, 1)$, $e_2 = \dfrac{1}{\|d_2\|}\, d_2 = \dfrac{1}{\sqrt{3}} (-1, 1, 1)$.

Thus, the matrices Q and R are given by
$Q = \begin{pmatrix} \tfrac{1}{\sqrt{2}} & -\tfrac{1}{\sqrt{3}} \\ 0 & \tfrac{1}{\sqrt{3}} \\ \tfrac{1}{\sqrt{2}} & \tfrac{1}{\sqrt{3}} \end{pmatrix}$, $R = \begin{pmatrix} v_1 \cdot e_1 & v_2 \cdot e_1 \\ 0 & v_2 \cdot e_2 \end{pmatrix} = \begin{pmatrix} \sqrt{2} & 3\sqrt{2} \\ 0 & \sqrt{3} \end{pmatrix}.$
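The factorization can be checked numerically; a sketch using `numpy.linalg.qr` (which may differ from a hand computation by sign flips of the columns of Q, so we verify the defining properties rather than individual entries):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 4.0]])

Q, R = np.linalg.qr(A)  # reduced QR: Q is 3x2, R is 2x2

assert np.allclose(Q.T @ Q, np.eye(2))   # orthonormal columns
assert np.allclose(np.triu(R), R)        # R upper triangular
assert np.allclose(Q @ R, A)             # A = QR
```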
3. The orthogonal complement
Definition. Let U be a vector subspace of an inner product space V. The subset $U^{\perp}$ of V consisting of all vectors which are orthogonal to any vector in U,
$U^{\perp} \equiv \{ v \in V \mid v \cdot u = 0, \ \forall u \in U \}$,
is called the orthogonal complement of the subspace U.

Example. In $\mathbb{R}^2$ we consider the vector subspace
$U \equiv \mathrm{Sp}[(2, 1)] = \{ \lambda (2, 1) \mid \lambda \in \mathbb{R} \}$.
The orthogonal complement of U is the set
$U^{\perp} = \{ (x, y) \in \mathbb{R}^2 \mid (x, y) \cdot \lambda (2, 1) = 0, \ \forall \lambda \in \mathbb{R} \} = \{ (x, y) \in \mathbb{R}^2 \mid y = -2x \} = \{ x (1, -2) \mid x \in \mathbb{R} \} = \mathrm{Sp}[(1, -2)]$.
(Figure: the line U spanned by (2, 1) and the perpendicular line $U^{\perp}$ spanned by (1, -2).)
3. The orthogonal complement
Proposition
1) $U^{\perp}$ is a vector subspace of V.
2) $U \cap U^{\perp} = \{0\}$.
3) If U is finite-dimensional and $\{e_1, \dots, e_k\}$ is a basis of U, then $v \in U^{\perp} \iff v \cdot e_i = 0$, $i = 1, \dots, k$.
4) If U is finite-dimensional, then $V = U \oplus U^{\perp}$.

Remark. Consequence of 4): any vector $v \in V$ has the unique decomposition
$v = u + u^{\perp}$, $u \in U$, $u^{\perp} \in U^{\perp}$;
$u$ is called the orthogonal projection of $v$ onto U.
3. The orthogonal complement
Remark (Rule for finding the orthogonal projection u). If $\{e_1, \dots, e_k\}$ is a basis of U, then u is known if its coordinates $u_1, \dots, u_k$ with respect to this basis are known. We have
$v = u_1 e_1 + \dots + u_k e_k + u^{\perp}$,
and taking the inner product of v with $e_1, \dots, e_k$ we obtain
$(e_1 \cdot e_1) u_1 + (e_2 \cdot e_1) u_2 + \dots + (e_k \cdot e_1) u_k = v \cdot e_1$,
$\vdots$   (1)
$(e_1 \cdot e_k) u_1 + (e_2 \cdot e_k) u_2 + \dots + (e_k \cdot e_k) u_k = v \cdot e_k$.

The Gram determinant corresponding to $\{e_1, \dots, e_k\}$ is non-zero, so this system has a unique solution $u_1, \dots, u_k$. If $\{e_1, \dots, e_k\}$ is an orthonormal basis of U, (1) gives
$u = \sum_{i=1}^{k} (v \cdot e_i) e_i.$   (2)
3. The orthogonal complement
Example. In $\mathbb{R}^2$: if, e.g., $U \equiv \mathrm{Sp}[(2, 1)]$ and $v = (5, 6)$, an orthonormal basis of U is $\{e\}$, $e \equiv \left( \tfrac{2}{\sqrt{5}}, \tfrac{1}{\sqrt{5}} \right)$. Thus, using (2), the orthogonal projection u of v onto U emerges as
$u = (v \cdot e)\, e = \left( \tfrac{32}{5}, \tfrac{16}{5} \right)$.
Since $u^{\perp} = v - u$, the decomposition of v along U and $U^{\perp}$ is
$v = u + u^{\perp} = \left( \tfrac{32}{5}, \tfrac{16}{5} \right) + \left( -\tfrac{7}{5}, \tfrac{14}{5} \right)$.
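A numerical sketch (NumPy) of this projection, which is nothing more than formula (2) for a one-dimensional subspace:

```python
import numpy as np

v = np.array([5.0, 6.0])
e = np.array([2.0, 1.0]) / np.sqrt(5.0)  # orthonormal basis of U

u = (v @ e) * e          # orthogonal projection of v onto U
u_perp = v - u           # component in the orthogonal complement

assert np.allclose(u, [32/5, 16/5])
assert np.allclose(u_perp, [-7/5, 14/5])
assert np.isclose(u @ u_perp, 0.0)   # the two parts are orthogonal
```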
3. The orthogonal complement
Proposition. If u is the orthogonal projection of $v \in V$ onto U, then
$\|v - u\| \le \|v - u'\|$, $\forall u' \in U$;
moreover, if for some $u' \in U$ we have $\|v - u\| = \|v - u'\|$, then $u' = u$. In other words,
$\min_{u' \in U} \|v - u'\| = \|v - u\|$,   (3)
and this minimum is attained only by the orthogonal projection of v onto U.

Remark. $\|v - u'\| = d(v, u')$. Reformulation of (3): among all the elements of U, the closest one to $v \in V$ is the orthogonal projection u of v onto U. Moreover, u is the unique element of U closest to v; it is also called the element in U of best approximation for v.
3. The orthogonal complement
Example. In the vector space $C([0, 2\pi])$ with the inner product
$\langle f, g \rangle \equiv \int_0^{2\pi} f(x) g(x)\, dx$
we consider the subspace $U \equiv \mathrm{Sp}[e_0, e_1, e_2, \dots, e_{2n-1}, e_{2n}]$, where
$e_0 \equiv \dfrac{1}{\sqrt{2\pi}}$, $e_1 \equiv \dfrac{\sin x}{\sqrt{\pi}}$, $e_2 \equiv \dfrac{\cos x}{\sqrt{\pi}}$, $\dots$, $e_{2n-1} \equiv \dfrac{\sin nx}{\sqrt{\pi}}$, $e_{2n} \equiv \dfrac{\cos nx}{\sqrt{\pi}}$.
Since $\{e_0, e_1, e_2, \dots, e_{2n-1}, e_{2n}\}$ is an orthonormal basis of U, the element in U of best approximation for $f \in C([0, 2\pi])$ can be deduced according to (2). One obtains the trigonometric polynomial
$P(x) = c_0 + c_1 \sin x + c_2 \cos x + \dots + c_{2n-1} \sin nx + c_{2n} \cos nx$,   (4)
where
$c_0 = \dfrac{1}{2\pi} \int_0^{2\pi} f(x)\, dx$, $c_{2k-1} = \dfrac{1}{\pi} \int_0^{2\pi} f(x) \sin kx\, dx$, $c_{2k} = \dfrac{1}{\pi} \int_0^{2\pi} f(x) \cos kx\, dx$, $k = 1, \dots, n$;
$c_0, \dots, c_{2n}$ are the Fourier coefficients of f. In $C([a, b])$ the norm
$\|f - g\| = \left( \int_a^b (f(x) - g(x))^2\, dx \right)^{1/2}$
is called the quadratic error. So, we have obtained that, among all trigonometric polynomials of degree at most n, P(x) given by (4) is the trigonometric polynomial whose quadratic error with respect to f is minimal.
Exercise. Find $\min_{a, b} \int_0^1 \left( e^x - (a + bx) \right)^2 dx$.
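The Fourier coefficients can be computed numerically; a sketch (NumPy, midpoint-rule quadrature, applied to the sample function f(x) = x, which is an assumption chosen for illustration) checking the coefficient formulas against the exact values $c_0 = \pi$, $c_1 = -2$, $c_2 = 0$:

```python
import numpy as np

# midpoint-rule quadrature on [0, 2*pi]
N = 200000
h = 2.0 * np.pi / N
x = (np.arange(N) + 0.5) * h   # midpoints of N subintervals

f = x  # sample function f(x) = x

c0 = np.sum(f) * h / (2.0 * np.pi)
c1 = np.sum(f * np.sin(x)) * h / np.pi   # coefficient of sin x
c2 = np.sum(f * np.cos(x)) * h / np.pi   # coefficient of cos x

# exact values for f(x) = x on [0, 2*pi]
assert np.isclose(c0, np.pi, atol=1e-6)
assert np.isclose(c1, -2.0, atol=1e-6)
assert np.isclose(c2, 0.0, atol=1e-6)
```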
3. The orthogonal complement
Example (The least squares method). Let y be a physical quantity with the following dependence on the variables $x_1, \dots, x_k$:
$y = c_1 x_1 + \dots + c_k x_k$.
The parameters $c_1, \dots, c_k$ must be adjusted to best fit the data set

y    | x_1    x_2    ...  x_k
-----+------------------------
y_1  | x_11   x_12   ...  x_1k
y_2  | x_21   x_22   ...  x_2k
...  | ...
y_n  | x_n1   x_n2   ...  x_nk

Usually, $n > k$, and the overdetermined system
$y_1 = c_1 x_{11} + \dots + c_k x_{1k}$
$\vdots$   (5)
$y_n = c_1 x_{n1} + \dots + c_k x_{nk}$
is inconsistent (incompatible). Thus, the goal is to find $c_1, \dots, c_k$
3. The orthogonal complement
which render as small as possible the sum of squares of errors between the left- and right-hand sides of these equations:
$\sum_{i=1}^{n} \left( y_i - (c_1 x_{i1} + \dots + c_k x_{ik}) \right)^2$;
$(c_1, \dots, c_k)$ is called a pseudo-solution (in the sense of the least squares) of the linear system (5). Defining the vectors $y, e_1, \dots, e_k$ of $\mathbb{R}^n$ by
$y \equiv (y_1, \dots, y_n)$, $e_1 \equiv (x_{11}, \dots, x_{n1})$, $\dots$, $e_k \equiv (x_{1k}, \dots, x_{nk})$,
the least squares requirement can be restated as: find $c_1, \dots, c_k$ which minimize $\|y - (c_1 e_1 + \dots + c_k e_k)\|$. Equivalently: find $u \in \mathrm{Sp}[e_1, \dots, e_k] \equiv U$, such that
$\|y - u\| = \min_{u' \in U} \|y - u'\|$.
Clearly, u is the orthogonal projection of y onto U. If $e_1, \dots, e_k$ are linearly independent (which is most likely to happen, due to
3. The orthogonal complement
measurement errors), $\{e_1, \dots, e_k\}$ is a basis of U, and hence $(c_1, \dots, c_k)$ represents the (unique) solution of the system
$(e_1 \cdot e_1) c_1 + (e_2 \cdot e_1) c_2 + \dots + (e_k \cdot e_1) c_k = y \cdot e_1$,
$\vdots$   (6)
$(e_1 \cdot e_k) c_1 + (e_2 \cdot e_k) c_2 + \dots + (e_k \cdot e_k) c_k = y \cdot e_k$.
For the case k = 1, i.e., the model function is $y = cx$ (a straight line), one registers the data $(x_1, y_1), \dots, (x_n, y_n)$. With $e \equiv (x_1, \dots, x_n)$, the preceding system reduces to $(e \cdot e)\, c = y \cdot e$, so that
$c = \dfrac{\sum_{i=1}^{n} x_i y_i}{\sum_{i=1}^{n} x_i^2}.$
(Figure: the data points and the fitted line $y = cx$.)
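A numerical sketch of the k = 1 case (NumPy, with made-up data points, which are an assumption for illustration): the pseudo-solution from the normal equation coincides with the one returned by NumPy's general least-squares solver:

```python
import numpy as np

# made-up data (x_i, y_i), roughly following y = 2x
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

# normal-equation formula: c = sum(x_i y_i) / sum(x_i^2)
c = np.sum(x * y) / np.sum(x * x)

# same pseudo-solution via the general least-squares solver
c_lstsq = np.linalg.lstsq(x.reshape(-1, 1), y, rcond=None)[0][0]

assert np.isclose(c, c_lstsq)
```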
3. The orthogonal complement
Remark. In matrix form the system (5) emerges as
$Ac = y$,   (5')
while (6) reads as
$A^T A\, c = A^T y$.   (6')
We have shown that, if the columns of A are linearly independent, there is a unique pseudo-solution of (5'), which is the (unique) solution of (6'). The system (6') is called the normal system associated with (5').

Exercise. Find the pseudo-solution of the linear system
$c_1 - c_2 + 3c_3 = 1$
$2c_1 + 3c_2 = 3$
$c_2 + c_3 = 4$
$2c_1 + 3c_2 + 2c_3 = 2$
$c_1 + 4c_2 + 3c_3 = 1$