
Math 21b: Linear Algebra Spring 2016

Homework 12: Orthogonality


This homework is due on Monday, February 29, respectively on Tuesday, March 1, 2016.
   
1 a) Find the angle between ~v = [1, 1, −1, −1, 0]T and w~ = [0, 1, 0, 1, 0]T .
   

 1 
 
 2 
 

b) What is the length of the vector  .. ? Remark: This is known
 

 . 
 
24
 

to be only vector of the form [1, . . . , n] with n > 1 and integer


length.
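The length in part b) can be checked numerically. The following sketch computes it and also tests the remark for n up to 1000 (a bounded check, not a proof), using the formula 1² + · · · + n² = n(n + 1)(2n + 1)/6:

```python
import math

# Problem 1b: length of the vector (1, 2, ..., 24).
squares = sum(k * k for k in range(1, 25))   # 4900
length = math.isqrt(squares)                 # 70, since 4900 = 70^2
print(length)  # 70

# Check the remark for n up to 1000: only n = 24 gives an integer length,
# i.e. a square pyramidal number 1^2 + ... + n^2 that is a perfect square.
hits = [n for n in range(2, 1001)
        if (s := n * (n + 1) * (2 * n + 1) // 6) == math.isqrt(s) ** 2]
print(hits)  # [24]
```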
 
2 A vector ~x = [x1, . . . , xn]T encodes data (x1, . . . , xn). The expecta-
tion of x is the average m = (x1 + · · · + xn)/n. The vector
X = [x1 − m, . . . , xn − m]T is the centered form of x. Its expectation is
zero. If X, Y are the centered versions of x, y, then (X · Y )/n
is called the covariance of X and Y , (X · X)/n the variance
of X, and |X|/√n the standard deviation. The cosine of the angle
between X and Y is the correlation coefficient; it is the same
for x, y as for X, Y . Yes, vectors are random variables. We
work with ~x = [1, 2, 3, 4, 5]T and ~y = [2, 3, 5, 7, 11]T .

a) Find the expectations of ~x, ~y .


b) Find the correlation coefficient between ~x, ~y .
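A short Python sketch (plain standard library) that follows the definitions above for parts a) and b):

```python
import math

x = [1, 2, 3, 4, 5]
y = [2, 3, 5, 7, 11]
n = len(x)

# a) expectations (averages)
mx = sum(x) / n   # 3.0
my = sum(y) / n   # 5.6

# centered versions X, Y
X = [xi - mx for xi in x]
Y = [yi - my for yi in y]

# b) correlation coefficient = cosine of the angle between X and Y
corr = sum(a * b for a, b in zip(X, Y)) / (math.hypot(*X) * math.hypot(*Y))
print(mx, my, round(corr, 4))  # 3.0 5.6 0.9723
```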

3 If ~x, ~y are two vectors, we get data points (x1, y1), (x2, y2), . . . , (xn, yn)
in the plane. The line y = ax + b is called the best linear fit.
We have b = E[y] − aE[x], where a = Cov[X, Y ]/Var[X] = (X · Y )/(X · X). Draw the
5 data points from problem 2 and find the best fit y = ax + b.
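The fit can be computed directly from the formulas above; this sketch recomputes the centered vectors for the data of problem 2:

```python
x = [1, 2, 3, 4, 5]
y = [2, 3, 5, 7, 11]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
X = [xi - mx for xi in x]
Y = [yi - my for yi in y]

# slope a = (X . Y)/(X . X) = Cov[X, Y]/Var[X]; intercept b = E[y] - a E[x]
a = sum(p * q for p, q in zip(X, Y)) / sum(p * p for p in X)
b = my - a * mx
print(round(a, 4), round(b, 4))  # 2.2 -1.0
```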
4 An orthogonal basis in R^n for which every vector has all entries equal to
−1 or 1 is called a Walsh basis. The corresponding matrix is a
Walsh matrix. Check that the columns of

        [ 1  1  1  1 ]
W = 1/2 [ 1 −1  1 −1 ]
        [ 1  1 −1 −1 ]
        [ 1 −1 −1  1 ]

form an orthonormal basis. Verify that one can get from W an
8 × 8 matrix encoding an orthonormal basis in R^8 by scaling

A = [ W  W ]
    [ W −W ]

in the right way.
(Joseph Walsh graduated from Harvard in 1916 and taught here from
1935 to 1966. Here is an open problem: nobody knows whether there
is an orthogonal basis of vectors with entries −1, 1 in R^668. The
corresponding matrices are called Hadamard matrices. The
Walsh matrices above allow one to construct examples for n = 2^m.)
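Both checks can be done numerically. The sketch below verifies that the columns of W are orthonormal, and that scaling the block matrix A by 1/√2 (which turns out to be the right scaling) again gives orthonormal columns:

```python
import math

# The 4x4 Walsh matrix W (sign pattern scaled by 1/2)
W = [[0.5 * s for s in row] for row in
     [[1,  1,  1,  1],
      [1, -1,  1, -1],
      [1,  1, -1, -1],
      [1, -1, -1,  1]]]

def col(M, j):
    return [row[j] for row in M]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def is_orthonormal(M):
    # Columns pairwise orthogonal with length 1 (up to rounding error).
    k = len(M[0])
    return all(abs(dot(col(M, i), col(M, j)) - (1.0 if i == j else 0.0)) < 1e-12
               for i in range(k) for j in range(k))

print(is_orthonormal(W))  # True

# Block matrix A = [[W, W], [W, -W]], scaled by 1/sqrt(2).
A = [row + row for row in W] + [row + [-s for s in row] for row in W]
A = [[s / math.sqrt(2) for s in row] for row in A]
print(is_orthonormal(A))  # True
```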
5 Use an orthogonal basis of the plane x + y + z = 0 to find the matrix of
the projection onto that plane. The formula for that projection
matrix is given below.
Orthogonality

Two vectors are orthogonal if

~v · w~ = v1w1 + · · · + vnwn = 0.

The length of a vector is |~v| = √(~v · ~v). The vector
~v/|~v| is called a unit vector. The Cauchy-Schwarz
inequality |~v · w~| ≤ |~v||w~| allows one to define the angle α
by cos(α) = (~v · w~)/(|~v| · |w~|). The number cos(α) is
called the correlation coefficient. If it is positive,
the vectors are positively correlated; if it is negative,
they are negatively correlated. Orthogonal vectors
are uncorrelated. A basis is an orthonormal basis if all
vectors are perpendicular and have length 1. If they are
just orthogonal, they form an orthogonal basis. If we have
an orthonormal basis of V , and Q is the matrix containing
the basis vectors as column vectors, then the projection
onto the space V is given by the matrix P = QQ^T , where
Q^T is the transpose matrix.
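As an illustration of the formula P = QQ^T, the sketch below projects onto the plane x + y + z = 0 from problem 5. The particular orthonormal basis u1, u2 is an assumed choice, not one prescribed by the problem:

```python
import math

# An orthonormal basis of the plane x + y + z = 0 (assumed choice):
u1 = [1 / math.sqrt(2), -1 / math.sqrt(2), 0.0]
u2 = [1 / math.sqrt(6), 1 / math.sqrt(6), -2 / math.sqrt(6)]

# P = Q Q^T with Q = [u1 u2], entry-wise P[i][j] = u1[i]*u1[j] + u2[i]*u2[j]
P = [[u1[i] * u1[j] + u2[i] * u2[j] for j in range(3)] for i in range(3)]
print([[round(e, 4) for e in row] for row in P])
# [[0.6667, -0.3333, -0.3333], [-0.3333, 0.6667, -0.3333], [-0.3333, -0.3333, 0.6667]]
```

The result agrees with P = I − (1/3)J, where J is the all-ones matrix, since projecting onto the plane removes the component along its unit normal (1, 1, 1)/√3.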
