3.4 LDA

Linear Discriminant Analysis (LDA) is a dimensionality reduction technique used for supervised classification problems. LDA creates a new axis that maximizes separation between classes by maximizing between-class variance and minimizing within-class variance.


Linear Discriminant Analysis (also known as Normal Discriminant Analysis or Discriminant Function Analysis)
• Linear Discriminant Analysis (LDA) is one of the commonly used dimensionality reduction techniques in machine learning for solving two-class and multi-class classification problems.
• It is also known as Normal Discriminant Analysis (NDA) or Discriminant Function Analysis (DFA).
• It is used to project features from a higher-dimensional space into a lower-dimensional space in order to reduce resource and computational costs.
• It is applicable to classification problems with two or more classes.
• LDA is one of the most popular dimensionality reduction techniques used for supervised classification problems in machine learning.
• Suppose we have two classes with multiple features and need to separate them efficiently.
• If we classify them using a single feature, the classes may overlap.

• Solution: increase the number of features used.


Example: classifying two different classes whose data points lie in a 2-dimensional plane

• It may be impossible to draw a straight line in the 2-D plane that separates these data points efficiently.

• Solution: LDA (Linear Discriminant Analysis) is used, which reduces the 2-D graph to a 1-D graph in order to maximize the separability between the two classes.

• That is, LDA uses both axes (X and Y) to create a new axis and projects the data onto that new axis, as in the sketch below.
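As a rough illustration of this 2-D to 1-D projection, the following sketch uses scikit-learn's LinearDiscriminantAnalysis on synthetic, made-up data (the class locations and sample counts here are assumptions chosen for illustration, not values from the original):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic 2-D data for two classes (illustrative values only)
rng = np.random.default_rng(0)
X_c1 = rng.normal(loc=[0.0, 0.0], scale=0.8, size=(50, 2))  # class c1
X_c2 = rng.normal(loc=[2.5, 2.0], scale=0.8, size=(50, 2))  # class c2
X = np.vstack([X_c1, X_c2])
y = np.array([0] * 50 + [1] * 50)

# Fit LDA and project the 2-D points onto the single new axis (1-D)
lda = LinearDiscriminantAnalysis(n_components=1)
X_1d = lda.fit_transform(X, y)

print("shape before projection:", X.shape)     # (100, 2)
print("shape after projection :", X_1d.shape)  # (100, 1)
print("direction of the new axis:", lda.scalings_.ravel())
```

With only two classes, LDA can produce at most one discriminant component, which is why n_components=1 is the natural choice here.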
How LDA works?

• LDA is used as a dimensionality reduction technique in machine learning, with which we can easily transform a 2-D or 3-D graph into a 1-dimensional plane.

• LDA uses the X-Y axes to create a new axis, separating the classes with a straight line and projecting the data onto that new axis.

• Hence, we can maximize the separation between these classes while reducing the 2-D plane to 1-D.

• To create the new axis, LDA uses the following criteria:

1. It maximizes the distance between the means of the two classes.
2. It minimizes the variance within each individual class.

• That is, the new axis increases the separation between the data points of the two classes once they are plotted onto it.

• Suppose we have two classes and d-dimensional examples x1, x2, ..., xn, where n1 samples come from class c1 and n2 samples come from class c2; a minimal computation of this new axis is sketched after this list.
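A minimal from-scratch sketch of these two criteria for the two-class case, assuming NumPy and hypothetical arrays X1 and X2 holding the n1 and n2 samples of c1 and c2: the direction w that maximizes the distance between the projected means relative to the within-class variance is proportional to Sw^{-1}(m1 - m2).

```python
import numpy as np

def fisher_lda_direction(X1, X2):
    """Two-class Fisher LDA direction (a sketch, not a full classifier).

    X1 : (n1, d) array of samples from class c1
    X2 : (n2, d) array of samples from class c2
    Returns the unit vector w onto which the data is projected.
    """
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)   # class means
    # Within-class scatter matrix Sw = S1 + S2 (criterion 2: keep this small)
    S1 = (X1 - m1).T @ (X1 - m1)
    S2 = (X2 - m2).T @ (X2 - m2)
    Sw = S1 + S2
    # Criterion 1: push the projected means apart -> w proportional to Sw^{-1}(m1 - m2)
    w = np.linalg.solve(Sw, m1 - m2)
    return w / np.linalg.norm(w)

# Illustrative 2-D data (made-up values)
rng = np.random.default_rng(1)
X1 = rng.normal([0.0, 0.0], 0.8, size=(40, 2))
X2 = rng.normal([2.5, 2.0], 0.8, size=(40, 2))
w = fisher_lda_direction(X1, X2)
print("projection direction w:", w)
print("first three class-1 projections:", (X1 @ w)[:3])
```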
Why LDA?

• LDA handles multi-class classification problems well when the classes are well separated.

• LDA can also be used in data pre-processing to reduce the number of features, just like PCA, which reduces the computing cost significantly; a short example of this use follows this list.

• LDA is also used in face-detection algorithms to extract useful features from different faces.
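As a sketch of the pre-processing use, assuming scikit-learn and its bundled Iris dataset (a choice made here purely for illustration), LDA can be placed in front of a simple classifier exactly where PCA would otherwise go:

```python
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

X, y = load_iris(return_X_y=True)  # 4 features, 3 classes

# LDA reduces the 4 features to at most (n_classes - 1) = 2 components,
# and a simple classifier is then trained on the reduced data.
model = make_pipeline(
    LinearDiscriminantAnalysis(n_components=2),
    LogisticRegression(max_iter=1000),
)
scores = cross_val_score(model, X, y, cv=5)
print("mean cross-validated accuracy with LDA pre-processing:", scores.mean())
```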
Drawbacks

• LDA also fails in some cases, for example when the means of the class distributions are shared (the class means coincide). In this case, LDA cannot create a new axis that makes both classes linearly separable.
• To overcome such problems, non-linear discriminant analysis is used in machine learning.
Extensions to LDA:

1. Quadratic Discriminant Analysis (QDA): each class uses its own estimate of variance (or covariance when there are multiple input variables); a short example follows this list.
2. Flexible Discriminant Analysis (FDA): non-linear combinations of the inputs, such as splines, are used.
3. Regularized Discriminant Analysis (RDA): introduces regularization into the estimate of the variance (actually covariance), moderating the influence of different variables on LDA.
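For instance, QDA is available alongside LDA in scikit-learn; the sketch below uses made-up 2-D data in which the two classes have clearly different covariances, which is the situation QDA is designed for:

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

# Made-up 2-D data: the two classes have different covariance structures.
rng = np.random.default_rng(2)
X1 = rng.normal([0.0, 0.0], [0.5, 2.0], size=(60, 2))  # class 0: stretched along y
X2 = rng.normal([3.0, 0.0], [2.0, 0.5], size=(60, 2))  # class 1: stretched along x
X = np.vstack([X1, X2])
y = np.array([0] * 60 + [1] * 60)

# QDA fits a separate covariance matrix per class, giving a quadratic boundary.
qda = QuadraticDiscriminantAnalysis()
qda.fit(X, y)
print("training accuracy:", qda.score(X, y))
```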
Real-world Applications of LDA

Face Recognition
Medical: classifying a patient's disease on the basis of various parameters of patient health and the medical treatment given.
Customer Identification: identifying the group of customers who are most likely to purchase a specific product, for example in a shopping mall.
Predictions: answering questions such as "Will you buy this product?"
Learning: robots are trained to learn and talk in order to simulate human work.
