U2 Ot 2

The document discusses least-squares regression, highlighting its application to inexact experimental data and the inadequacy of polynomial interpolation for such cases. It introduces weighted least squares estimation, emphasizing the importance of assigning different weights to measurements based on their reliability. Additionally, it touches on recursive least squares estimation, focusing on minimizing the sum of variances of estimation errors.
Least-Squares Regression

• Let us suppose that the given data is inexact and has substantial error, right from the source where it was obtained.
• Experimental data is usually scattered and is a good example of inexact data. Polynomial interpolation is inappropriate in such cases.
• To understand this, let us look at the following graphical representation of some scattered data:

Fig. 1(a) Scattered data: the data shows an increasing trend, i.e., higher values of y are associated with higher values of x.
Fig. 1(b) A polynomial fit oscillating beyond the range of the data: an eighth-order interpolating polynomial passes through the data exactly, but it oscillates because of the scatter in the data and also strays well beyond the range suggested by the data.
Fig. 1(c) An approximate fit for the data: a more appropriate approach is to find a function, as shown in the figure, that fits the shape or general trend of the data.
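The oscillation described in Fig. 1(b) is easy to reproduce numerically. A minimal sketch in Python, where the data values are illustrative (not taken from the figures):

```python
import numpy as np

# Nine scattered points with an increasing trend (illustrative values)
x = np.arange(9.0)                       # 0, 1, ..., 8
rng = np.random.default_rng(0)
y = x + rng.normal(scale=0.5, size=9)    # trend plus noise

# Degree-8 interpolating polynomial: passes through every point exactly
p8 = np.polyfit(x, y, 8)
resid8 = np.abs(np.polyval(p8, x) - y).max()   # essentially zero

# Degree-1 least-squares fit: does not interpolate, but follows the trend
p1 = np.polyfit(x, y, 1)
resid1 = np.abs(np.polyval(p1, x) - y).max()   # nonzero residual

# Evaluating the degree-8 polynomial even slightly beyond the data
# range illustrates the wild behavior seen in Fig. 1(b)
print(np.polyval(p8, 8.5))
```

The interpolating polynomial reproduces the noise exactly, while the straight line accepts some residual in exchange for capturing the general trend.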
Least-Squares Method:
Least-squares fit of a straight line
Least-squares fit of a parabola
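Both fits named above can be obtained from the normal equations A^T A c = A^T y, where the columns of A are powers of x. A minimal sketch (the sample data is made up for illustration):

```python
import numpy as np

# Illustrative data with a roughly quadratic trend
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 6.2, 11.0, 17.8, 27.1])

def least_squares_fit(x, y, degree):
    """Fit y ~ a0 + a1*x + ... by solving the normal equations."""
    A = np.vander(x, degree + 1, increasing=True)  # columns: 1, x, x^2, ...
    return np.linalg.solve(A.T @ A, A.T @ y)

line = least_squares_fit(x, y, 1)       # straight line: a0 + a1*x
parabola = least_squares_fit(x, y, 2)   # parabola: a0 + a1*x + a2*x^2
```

For low degrees this matches `np.polyfit` (which returns coefficients highest-degree first); for high degrees, solving the normal equations directly is poorly conditioned and a QR or SVD based solver is preferred.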
WEIGHTED LEAST SQUARES ESTIMATION

• In some applications we have more confidence in certain measurements than in others. We therefore need to generalize the results of the previous section to obtain weighted least squares estimation.

• For example, suppose we have several measurements of the resistance of an unmarked resistor. Some of the
measurements were taken with an expensive multimeter with low noise, but other measurements were taken
with a cheap multimeter by a tired student late at night.

• We have more confidence in the first set of measurements, so we should somehow place more emphasis on
those measurements than on the others.

• However, even though the second set of measurements is less reliable, it seems that we could get at least some
information from them.
• To put the problem in mathematical terms, suppose x is a constant but unknown n-element
vector, and y is a k-element noisy measurement vector.

• We assume that each element of y is a linear combination of the elements of x with the addition of
some measurement noise, and the variance of the measurement noise may be different for each
element of y:

y = Hx + v

where H is a known k × n matrix and v is the measurement noise vector, with variance σ_i^2 for the i-th element.

We assume that the noise for each measurement is zero-mean and independent. The measurement covariance matrix
is therefore diagonal:

R = E(v v^T) = diag(σ_1^2, …, σ_k^2)
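Under these assumptions the standard weighted least-squares estimate is x̂ = (H^T R^-1 H)^-1 H^T R^-1 y, which weights each measurement by the inverse of its noise variance. A minimal sketch for the resistor example, with made-up numbers:

```python
import numpy as np

# Resistor example: x is the true resistance (n = 1), and each
# measurement is y_i = x + v_i, so H is a column of ones.
true_R = 100.0
H = np.ones((6, 1))

# First three readings from a good multimeter (low noise variance),
# last three from a cheap one (high variance) -- illustrative values
sigma2 = np.array([0.01, 0.01, 0.01, 4.0, 4.0, 4.0])

rng = np.random.default_rng(1)
y = true_R + rng.normal(scale=np.sqrt(sigma2))

# Weighted least squares: xhat = (H^T R^-1 H)^-1 H^T R^-1 y
Rinv = np.diag(1.0 / sigma2)
xhat = np.linalg.solve(H.T @ Rinv @ H, H.T @ Rinv @ y)
print(xhat)   # close to 100, dominated by the low-noise readings
```

Note how the cheap-multimeter readings still contribute, but with weight 1/4 each versus 100 each for the accurate readings, matching the intuition that unreliable measurements carry some information.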
RECURSIVE LEAST SQUARES ESTIMATION

A linear recursive estimator can be written in the form

x̂_k = x̂_(k−1) + K_k ( y_k − H_k x̂_(k−1) )

where x̂_k is the estimate after processing measurement y_k and K_k is a gain matrix to be determined.

The optimality criterion that we choose to minimize is the sum of the variances of the estimation errors at time k:

J_k = E[(x_1 − x̂_1)^2] + … + E[(x_n − x̂_n)^2] = Tr(P_k)

Since both expected values (the means of the estimation errors) are zero, the variances equal the expected squared errors, and the cost reduces to the trace of the estimation-error covariance P_k = E[(x − x̂_k)(x − x̂_k)^T].
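For a scalar constant measured directly (H_k = 1), the recursive estimator takes a particularly simple form. A minimal numerical sketch, with all values illustrative:

```python
import numpy as np

# Recursive least-squares estimation of a scalar constant x from
# noisy measurements y_k = x + v_k (illustrative setup)
true_x = 100.0
sigma2 = 1.0     # measurement noise variance
xhat = 0.0       # initial estimate
P = 1e6          # initial error variance (little confidence in xhat)

rng = np.random.default_rng(2)
for _ in range(200):
    y = true_x + rng.normal(scale=np.sqrt(sigma2))
    K = P / (P + sigma2)          # gain for this measurement
    xhat = xhat + K * (y - xhat)  # update estimate with the residual
    P = (1 - K) * P               # error variance shrinks each step
print(xhat, P)
```

Each new measurement refines the previous estimate without reprocessing the old data, and the error variance P decreases monotonically as measurements accumulate.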
