

UNIT – 2
Adaptive Optimization Techniques
Least squares estimation: Estimation of a constant, Weighted least squares
estimation, Recursive least squares estimation, Maximum likelihood estimation
(MLE).

BOOKS
TEXT BOOKS:

1) Singiresu S. Rao, "Engineering Optimization: Theory and Practice", Third Edition, Wiley-Interscience, John Wiley & Sons, Inc., 1996.

2) Dan Simon, "Optimal State Estimation: Kalman, H-Infinity, and Nonlinear Approaches", John Wiley & Sons, 2006.

3) Xin-She Yang, "Nature-Inspired Optimization Algorithms", Elsevier, 1st Edition, 2014.

4) Fred Glover and Gary A. Kochenberger, "Introduction to Metaheuristics".

5) Frederick S. Hillier and Gerald J. Lieberman, "Introduction to Operations Research".

REFERENCE BOOKS:

1) John L. Crassidis and John L. Junkins, "Optimal Estimation of Dynamic Systems", CRC Press, 2004.

2) Hamdy A. Taha, "Operations Research: An Introduction".

3) Jorge Nocedal and Stephen J. Wright, "Numerical Optimization".

4) Stephen Boyd and Lieven Vandenberghe, "Convex Optimization".

5) Giuseppe Bonaccorso, "Machine Learning Algorithms", 2017.

Least Squares Estimation
• The least squares method is the process of finding the best-fitting curve, or line of best fit, for a set of data points by minimizing the sum of the squares of the offsets (residuals) of the points from the curve.
• During the process of finding the relation between two variables, the trend of outcomes is estimated quantitatively.
• This process is termed regression analysis.
• The method of curve fitting is an approach to regression analysis.
• Fitting equations that approximate the curves to the given raw data in this way is the method of least squares.
Least Squares Estimation cont….
• It is quite obvious that the fitting of curves for a particular data set is not always unique.
• Thus, it is required to find the curve having minimal deviation from all the measured data points.
• This is known as the best-fitting curve and is found by using the least-squares method.
Least Square Method Definition
• The least-squares method is a crucial statistical method that is used to find a regression line, or best-fit line, for a given pattern of data.
• This method is described by an equation with specific parameters.
• The method of least squares is widely used in evaluation and regression.
• In regression analysis, this method is a standard approach for the approximate solution of sets of equations having more equations than unknowns.
• The method of least squares defines the solution as the one that minimizes the sum of squares of the deviations, or errors, in the result of each equation.
• The sum of squares of errors formula helps to quantify the variation in the observed data, as illustrated in the short sketch below.
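
As a small illustration (a sketch with made-up numbers, using NumPy), the sum of squared errors is computed directly from the residuals:

    import numpy as np

    # Observed values and the values predicted by some fitted model
    # (illustrative numbers only).
    y_observed = np.array([2.1, 3.9, 6.2, 8.0, 9.8])
    y_fitted = np.array([2.0, 4.0, 6.0, 8.0, 10.0])

    # Residuals: observed minus fitted values.
    residuals = y_observed - y_fitted

    # Sum of squared errors, the quantity that least squares minimizes.
    sse = np.sum(residuals ** 2)
    print(sse)  # 0.10: a small SSE indicates a close fit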
Least Square Method Definition cont….
• The least-squares method is often applied in data fitting.
• The best-fit result minimizes the sum of squared errors, or residuals, which are the differences between the observed (experimental) values and the corresponding fitted values given by the model.
• There are two basic categories of least-squares problems:
◦ Ordinary or linear least squares
◦ Nonlinear least squares
• These depend on the linearity or nonlinearity of the residuals.
• The linear problems are often seen in regression analysis in statistics.
• On the other hand, nonlinear problems are generally solved by an iterative refinement method in which the model is approximated by a linear one at each iteration, as sketched below.
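
As a rough sketch of the two categories (the data, model, and starting guess below are illustrative assumptions, using NumPy and SciPy): the linear problem has a closed-form solution, while the nonlinear one is solved by iterative re-linearization, e.g. with scipy.optimize.curve_fit:

    import numpy as np
    from scipy.optimize import curve_fit

    # Illustrative data (assumed, not from any real experiment).
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

    # Linear least squares: residuals are linear in the parameters
    # (slope m, intercept c), so a closed-form solution exists.
    A = np.column_stack([x, np.ones_like(x)])
    (m, c), *_ = np.linalg.lstsq(A, y, rcond=None)

    # Nonlinear least squares: residuals are nonlinear in the parameters,
    # so the solver re-linearizes the model around the current estimate
    # at each iteration, starting from the initial guess p0.
    def model(x, a, b):
        return a * np.exp(b * x)

    (a, b), _ = curve_fit(model, x, y, p0=(1.0, 0.5))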
Least Square Method Graph
• In linear regression, the line of best fit is a straight line, as shown in the following diagram:

[Figure: scatter plot of data points with the fitted straight line of best fit]
Least Square Method Graph cont….
• The deviations of the given data points from the line are minimized by reducing the residuals, or offsets, of each point from the line.
• Vertical offsets are generally used in common practice for line, polynomial, surface, and hyperplane fitting problems, while perpendicular offsets are utilized in more specialized applications.
Estimation of a Constant
Least Square Method Estimation of a Constant:
• In this section, we will determine how to estimate a constant on the basis of several noisy measurements of that constant.
• For example, suppose we have a resistor but we do not know its resistance. We take several measurements of its resistance using a multimeter, but the measurements are noisy because we have a cheap multimeter.
• We want to estimate the resistance on the basis of our noisy measurements.
• In this case, we want to estimate a constant scalar but, in general, we may want to estimate a constant vector.
• To put the problem in mathematical terms, suppose x is a constant but unknown n-element vector, and y is a k-element noisy measurement vector.
Least Square Method Estimation of a Constant: cont….
• How can we find the “best” estimate x̂ of x?
• Let us assume that each element of the measurement vector y is a linear combination of the elements of x, with the addition of some measurement noise:

y_1 = H_11 x_1 + … + H_1n x_n + v_1
⋮
y_k = H_k1 x_1 + … + H_kn x_n + v_k
• This set of equations can be put into matrix form as

y = Hx + v

where v is the k-element vector of measurement noise terms.
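
As a minimal sketch of this measurement model for the resistor example (NumPy, with an assumed true resistance and noise level; here n = 1 and every measurement observes the constant directly, so H is a column of ones):

    import numpy as np

    rng = np.random.default_rng(seed=0)

    x_true = 100.0                          # assumed true resistance in ohms
    k = 5                                   # number of multimeter readings

    H = np.ones((k, 1))                     # each measurement is 1·x plus noise
    v = rng.normal(0.0, 2.0, size=(k, 1))   # assumed noise, std dev 2 ohms
    y = H * x_true + v                      # the model y = Hx + v

The estimate derived on the following slides reduces to the sample mean of y for this choice of H.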
Least Square Method Estimation of a Constant: cont….

• The measurement residual is given by

ε_y = y - Hx̂
• The most probable value of the vector x is the vector x̂ that minimizes the sum of squares between the observed values y and the vector Hx̂.
• So we will try to compute the x̂ that minimizes the cost function J, where J is given by

J = ε_y^T ε_y
  = (y - Hx̂)^T (y - Hx̂)
  = y^T y - x̂^T H^T y - y^T Hx̂ + x̂^T H^T Hx̂
• In order to minimize J with respect to x̂, we compute its partial derivative and set it equal to zero:

∂J/∂x̂ = -2y^T H + 2x̂^T H^T H = 0
Least Square Method Estimation of a Constant: cont….

• Solving this equation for x̂ results in

H^T y = H^T H x̂
x̂ = (H^T H)^{-1} H^T y

• Therefore, the least squares estimate is computed using

x̂ = H_L y,   H_L = (H^T H)^{-1} H^T
Least Square Method Estimation of a Constant: cont….

• where H_L, the left pseudoinverse of H, exists if k ≥ n and H is full rank.
• This means that the number of measurements k is greater than or equal to the number of variables n that we are trying to estimate, and the measurements are linearly independent.
• In order to prove that we have found a minimum rather than some other type of stationary point of J, we need to prove that the second derivative of J is positive semidefinite (see the next example problem).
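
As a hedged numerical sketch (NumPy, reusing the hypothetical resistor simulation above), the estimate x̂ = (H^T H)^{-1} H^T y can be computed by solving the normal equations and cross-checked against NumPy's built-in solver; with H a column of ones it reduces to the sample mean of the measurements:

    import numpy as np

    rng = np.random.default_rng(seed=0)

    # Noisy measurements of a constant (resistor example): y = Hx + v.
    x_true = 100.0                              # assumed true resistance (ohms)
    k = 5                                       # number of measurements
    H = np.ones((k, 1))                         # each row observes the constant
    y = H * x_true + rng.normal(0.0, 2.0, size=(k, 1))

    # Batch least squares: solve the normal equations H^T H x_hat = H^T y
    # rather than explicitly forming the matrix inverse.
    x_hat = np.linalg.solve(H.T @ H, H.T @ y)

    # Cross-check with NumPy's least squares solver.
    x_check, *_ = np.linalg.lstsq(H, y, rcond=None)

    # With H a column of ones, both reduce to the sample mean of y.
    assert np.allclose(x_hat, x_check)
    assert np.allclose(x_hat, y.mean())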
Example – 1
