Introduction to Linear Discriminants for Classification in Machine Learning
            CONTENTS..!
• What are Linear Discriminants?
• Linear Discriminant Analysis (LDA)
• Linear Discriminants for Classification
• Applications of LDA in Machine Learning
• Summary & Conclusion
What are Linear Discriminants?
• Linear discriminants are mathematical functions that separate different classes in a dataset using a linear combination of features.
• The goal is to project data points so that separation between classes is maximized while each class remains compact.
Key Characteristics:
• Maximizes class separability.
• Applies a linear transformation of the feature space.
• Useful for both dimensionality reduction and classification tasks.
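The "linear combination of features" above can be sketched as a scoring function g(x) = w·x + b whose sign decides the class. This is a minimal illustration; the weights and bias here are hand-picked for demonstration, not learned (LDA's way of learning them comes later in the deck).

```python
import numpy as np

def linear_discriminant(x, w, b):
    """Score a sample with the linear discriminant g(x) = w.x + b.

    A positive score assigns x to class 1, a negative score to class 0.
    """
    return float(np.dot(w, x) + b)

# Hand-picked illustrative weights and bias (not learned):
w = np.array([1.0, -1.0])
b = 0.0

print(linear_discriminant(np.array([2.0, 0.5]), w, b))  # 1.5 -> class 1
print(linear_discriminant(np.array([0.5, 2.0]), w, b))  # -1.5 -> class 0
```

The decision boundary g(x) = 0 is a line (in 2-D) or hyperplane (in higher dimensions); everything LDA does is choose w well.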
Linear Discriminant Analysis (LDA)
• Linear Discriminant Analysis (LDA) is a statistical method used for:
  1. Dimensionality reduction and feature extraction.
  2. Classification, particularly for datasets where classes are well-separated.
• Objective: find a linear combination of features that best separates multiple classes.
• Mathematical concept:
  • Compute the class means, the within-class scatter matrix, and the between-class scatter matrix.
  • Derive a linear transformation that projects the data into a lower-dimensional space while maximizing class separability.
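In standard notation (a sketch: μ_c is the mean of class c, μ the overall mean, N_c the number of samples in class c), the scatter matrices and the objective LDA optimizes can be written as:

```latex
S_W = \sum_{c}\sum_{x \in c} (x - \mu_c)(x - \mu_c)^{\top},
\qquad
S_B = \sum_{c} N_c\,(\mu_c - \mu)(\mu_c - \mu)^{\top},
\qquad
J(w) = \frac{w^{\top} S_B\, w}{w^{\top} S_W\, w}
```

The directions w maximizing the Rayleigh quotient J(w) are the leading eigenvectors of S_W^{-1} S_B, which is why the next slide's steps end in an eigendecomposition.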
  HOW LDA WORKS..!
• Step 1: Calculate the mean of each class in the dataset.
• Step 2: Compute scatter matrices:
   •   Within-class scatter matrix
   •   Between-class scatter matrix
• Step 3: Compute eigenvalues and eigenvectors to
 determine the transformation that maximizes separability.
• Step 4: Select the eigenvectors corresponding to the largest
 eigenvalues to form the new feature space.
• Step 5: Project the original dataset into the new feature
 space and classify based on the transformed data.
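The five steps above can be sketched in NumPy. This is a minimal illustration on a small hand-made two-class dataset (the numbers are hypothetical), not a production implementation; `pinv` is used so the sketch also works when S_W is singular.

```python
import numpy as np

def lda_fit(X, y, n_components=1):
    """Fisher LDA: return a (d x n_components) projection matrix.

    Mirrors the slide: class means, within-/between-class scatter,
    eigendecomposition of pinv(S_W) @ S_B, keep the top eigenvectors.
    """
    classes = np.unique(y)
    d = X.shape[1]
    mu = X.mean(axis=0)                       # overall mean
    S_W = np.zeros((d, d))
    S_B = np.zeros((d, d))
    for c in classes:                         # Steps 1-2: means and scatter
        Xc = X[y == c]
        mu_c = Xc.mean(axis=0)                # Step 1: class mean
        S_W += (Xc - mu_c).T @ (Xc - mu_c)    # within-class scatter
        diff = (mu_c - mu).reshape(-1, 1)
        S_B += Xc.shape[0] * diff @ diff.T    # between-class scatter
    # Step 3: eigenvalues/eigenvectors of pinv(S_W) @ S_B
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(S_W) @ S_B)
    order = np.argsort(eigvals.real)[::-1]    # Step 4: largest eigenvalues
    return eigvecs[:, order[:n_components]].real

# Step 5: project a toy two-class dataset onto the discriminant direction.
X = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 3.0],
              [6.0, 5.0], [7.0, 8.0], [8.0, 8.0]])
y = np.array([0, 0, 0, 1, 1, 1])
W = lda_fit(X, y)
z = (X @ W).ravel()
print(z)  # the two classes are well separated along this 1-D axis
```

In the projected space, a new sample can be classified by, for example, its nearest projected class mean.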
EXAMPLE…!
[Worked example: the original slides showed figures computing, in order, the class means, the covariance / within-class scatter matrix, the between-class scatter matrix, and the resulting eigenvalues and eigenvectors. The figures are not reproduced here.]
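In place of the example figures, here is a small numeric sketch of the same intermediate quantities on a hypothetical two-class, two-feature dataset (the numbers are illustrative, not taken from the original slides):

```python
import numpy as np

# Hypothetical dataset: 5 samples per class, 2 features each.
X0 = np.array([[4.0, 2.0], [2.0, 4.0], [2.0, 3.0], [3.0, 6.0], [4.0, 4.0]])
X1 = np.array([[9.0, 10.0], [6.0, 8.0], [9.0, 5.0], [8.0, 7.0], [10.0, 8.0]])

m0, m1 = X0.mean(axis=0), X1.mean(axis=0)    # class means
m = np.vstack([X0, X1]).mean(axis=0)         # overall mean

# Within-class scatter: sum of per-class scatter matrices.
S_W = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)

# Between-class scatter: weighted outer products of mean differences.
S_B = (len(X0) * np.outer(m0 - m, m0 - m)
       + len(X1) * np.outer(m1 - m, m1 - m))

# Eigenvalues/eigenvectors of S_W^{-1} S_B give the discriminant directions.
eigvals, eigvecs = np.linalg.eig(np.linalg.inv(S_W) @ S_B)
print("class means:", m0, m1)
print("eigenvalues:", np.round(eigvals.real, 3))
```

Note that with only two classes, S_B has rank 1, so one eigenvalue is (numerically) zero and a single discriminant direction carries all the separation.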
     ADVANTAGES..!
• Simple and Interpretable: LDA creates a
 clear linear boundary, making it easy to
 understand.
• Effective with Smaller Datasets: Particularly
 suitable for problems with smaller datasets and
 clear class separation.
• Dimensionality Reduction: Can reduce the
 number of features while preserving the most
 important information for classification.
Applications of LDA in Machine Learning
• Face Recognition: LDA helps identify individuals by analyzing key facial features in an image dataset.
• Medical Diagnostics: LDA can differentiate between healthy and diseased cells, aiding in diagnosis (e.g., cancer detection).
• Text Classification: used to categorize documents into predefined classes such as spam or non-spam.
• Speech Recognition: LDA can assist in classifying spoken words or sentences by transforming audio features.
      LIMITATIONS..!
• Not Suitable for Non-Linear Data: LDA
 performs poorly when the data is non-linearly
 separable. For complex data, other techniques
 like Support Vector Machines (SVM) or Neural
 Networks might be more effective.
      CONCLUSION..!
• Linear Discriminants are useful tools for classification and
  dimensionality reduction.
• Linear Discriminant Analysis (LDA) is a powerful method for
  finding a linear separation between classes, assuming that the data
  meets the normality and equal covariance assumptions.
• LDA is widely used in fields such as face recognition, medical
  diagnostics, and text classification.
• Understanding the assumptions and limitations of LDA is crucial
  for its successful application.