
Unit 1.7 Linear Discriminants

The document discusses Linear Discriminant Analysis (LDA) as a supervised machine learning technique for multi-class classification, emphasizing its role in dimensionality reduction and optimization of models. It also explains the Perceptron, a binary classification neural network introduced by Frank Rosenblatt, detailing its structure, components, and functioning. Additionally, it outlines the differences between Single Layer and Multi Layer Perceptrons, highlighting their respective capabilities in handling linear and complex data patterns.


12-05-2025

UNIT - I LECTURE - 06
What is meant by Linear Discriminant?
• Linear discriminant analysis (LDA) is an approach used in supervised
machine learning to solve multi-class classification problems.
• LDA separates multiple classes with multiple features through
dimensionality reduction of the data.
• This technique is important in data science because it helps optimize
machine learning models.
• LDA reduces the dimensionality of the data while retaining the features
most significant for classification, which makes it well suited to
supervised learning problems.
5/12/2025 Department of CSE (SB-ET) 1

How does LDA work?


• LDA works by finding directions in the feature space that best separate the
classes.
• It does this by maximizing the difference between the class means while
minimizing the spread within each class.
• The reduced representation can then be fed to classification algorithms such as:
• Decision trees
• Random forests
• Support vector machines (SVM)
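The objective above is commonly formalized as Fisher's criterion, maximizing J(w) = (wᵀ S_B w)/(wᵀ S_W w), where S_B and S_W are the between-class and within-class scatter matrices (standard formulation, not stated on the slides). A minimal sketch using scikit-learn's LinearDiscriminantAnalysis; the Iris dataset and n_components=2 are illustrative assumptions:

```python
# A minimal sketch of LDA used both for dimensionality reduction and
# as a classifier, via scikit-learn. Dataset choice is illustrative.
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)            # 150 samples, 4 features, 3 classes
lda = LinearDiscriminantAnalysis(n_components=2)

# Project the data onto the 2 directions that best separate the classes
X_reduced = lda.fit_transform(X, y)
print(X_reduced.shape)                       # (150, 2)

# The fitted model can also classify directly
print(lda.score(X, y))
```

With 3 classes, LDA can produce at most 2 discriminant directions (at most "number of classes − 1"), which is why n_components=2 is the natural choice here.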


Example: SVM Algorithm


Pros and Cons


Pros:
• Simple and computationally efficient
• Handles high-dimensional data
• Handles multicollinearity
Cons:
• Performance degrades when classes share similar mean distributions
• Not suitable for unlabeled data (it is a supervised method)


What is a Perceptron?

• A Perceptron is a type of neural network that performs binary classification: it
maps input features to an output decision, usually classifying data into one of two
categories, such as 0 or 1.
• The Perceptron was introduced by Frank Rosenblatt in 1957.
• He proposed a Perceptron learning rule based on the original McCulloch–Pitts (MCP) neuron.
• A Perceptron is an algorithm for supervised learning of binary classifiers.
• The algorithm lets the neuron learn by processing the elements of the training set
one at a time.


Structure of a Neuron


How does the Perceptron work?


The perceptron works in three simple steps:
• First, each input x is multiplied by its weight w.
• Next, all the weighted values are added together; the result is called
the weighted sum.
• Finally, the weighted sum is passed through a suitable activation
function.
For example:
a unit step activation function.
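The three steps above can be sketched directly in code; the weights, bias, and threshold below are illustrative assumptions, not values from the slides:

```python
# Sketch of the three perceptron steps: multiply inputs by weights,
# sum them, then apply a unit step activation.
def unit_step(z, threshold=0.0):
    # Heaviside step: 1 if the weighted sum reaches the threshold, else 0
    return 1 if z >= threshold else 0

def perceptron_output(inputs, weights, bias=0.0):
    # Steps 1 and 2: weighted sum of the inputs
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Step 3: activation function
    return unit_step(z)

# With these hand-picked weights the unit behaves like a logical OR
print(perceptron_output([1, 0], [0.6, 0.6], bias=-0.5))  # 1
print(perceptron_output([0, 0], [0.6, 0.6], bias=-0.5))  # 0
```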


Components of a Perceptron


A Perceptron is composed of key components that work together to process information and make predictions.
• Input Features: The perceptron takes multiple input features, each representing a characteristic of the input data.
• Weights: Each input feature is assigned a weight that determines its influence on the output. These weights are
adjusted during training to find the optimal values.
• Summation Function: The perceptron calculates the weighted sum of its inputs, combining them with their
respective weights.
• Activation Function: The weighted sum is passed through the Heaviside step function, comparing it to a threshold
to produce a binary output (0 or 1).
• Output: The final output is determined by the activation function, often used for binary classification tasks.
• Bias: The bias term helps the perceptron make adjustments independent of the input, improving its flexibility in
learning.
• Learning Algorithm: The perceptron adjusts its weights and bias using a learning algorithm, such as the Perceptron
Learning Rule, to minimize prediction errors.
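The learning-algorithm component can be sketched as the classic Perceptron Learning Rule, w ← w + η(target − prediction)·x, applied one example at a time; the logical-AND dataset and learning rate here are illustrative assumptions:

```python
# Illustrative sketch of the Perceptron Learning Rule: adjust weights
# and bias in proportion to the prediction error, one example at a time.
def train_perceptron(data, lr=0.1, epochs=20):
    n = len(data[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, target in data:
            # forward pass: weighted sum + bias, then unit step
            z = sum(xi * wi for xi, wi in zip(x, w)) + b
            pred = 1 if z >= 0 else 0
            err = target - pred
            # update rule: w <- w + lr * err * x, b <- b + lr * err
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Logical AND: linearly separable, so the rule converges
and_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(and_data)
print(w, b)
```

Because AND is linearly separable, the perceptron convergence theorem guarantees the rule finds a separating line in a finite number of updates.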


How does the Perceptron work? (contd.)


• A perceptron consists of a single layer of Threshold Logic Units (TLU), with each TLU fully connected to all
input nodes.

• Activation Function: commonly the Heaviside step function, which outputs 1 when the
weighted sum reaches the threshold and 0 otherwise.


Example 1: Classifying whether a given fruit is an apple or not

• Let’s take a simple example of classifying whether a given fruit is an apple or not, based on two
input attributes: its weight (in grams) and its color (on a scale of 0 to 1, where 1 means red).
• The perceptron receives these inputs, multiplies them by their weights, adds a bias, and applies the
activation function to decide whether the fruit is an apple or not.
• Input 1 (Weight): 150 grams
• Input 2 (Color): 0.9 (since the fruit is mostly red)
• Weights: [0.5, 1.0]
• Bias: 1.5
The perceptron’s weighted sum would be:
(150 × 0.5) + (0.9 × 1.0) + 1.5 = 75 + 0.9 + 1.5 = 77.4
Let’s assume the activation function uses a threshold of 75. Since 77.4 > 75, the perceptron classifies the
fruit as an apple (output = 1).
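The same calculation as code, using the numbers from the example (note the weighted sum evaluates to 77.4):

```python
# The apple-or-not example: weighted sum plus bias, then a step
# activation with threshold 75. All numbers come from the example above.
inputs = [150, 0.9]     # weight in grams, color score
weights = [0.5, 1.0]
bias = 1.5
threshold = 75

z = sum(x * w for x, w in zip(inputs, weights)) + bias
print(z)                          # 77.4

output = 1 if z > threshold else 0
print(output)                     # 1 -> classified as an apple
```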


Types of Perceptron

There are two types of Perceptron:


• Single Layer Perceptron
• Multi Layer Perceptron


Single Layer Perceptron


• A single-layer perceptron is limited to learning linearly separable patterns.
• It is effective for tasks where the data can be divided into distinct categories by a straight line
(or, more generally, a hyperplane).
• While powerful in its simplicity, it struggles with more complex problems where the relationship
between inputs and outputs is non-linear.


Multi Layer Perceptron


• A multi-layer perceptron has enhanced processing capability: it consists of two or more
layers, including one or more hidden layers, and can model more complex, non-linear
patterns and relationships in the data.
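The classic illustration of this difference is XOR: it is not linearly separable, so no single-layer perceptron can compute it, but two hidden units (an OR unit and an AND unit) plus an output unit can. The weights below are chosen by hand for illustration, not learned:

```python
# Hand-built two-layer perceptron computing XOR. A single line cannot
# separate XOR's classes, but "OR and not AND" does.
def step(z):
    # unit step activation
    return 1 if z >= 0 else 0

def xor_mlp(x1, x2):
    h_or  = step(x1 + x2 - 0.5)      # hidden unit 1: logical OR
    h_and = step(x1 + x2 - 1.5)      # hidden unit 2: logical AND
    return step(h_or - h_and - 0.5)  # output: OR and not AND = XOR

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, '->', xor_mlp(a, b))  # 0, 1, 1, 0
```

In practice the hidden-layer weights would be learned (e.g. by backpropagation) rather than set by hand; the point here is only that adding a hidden layer makes the non-linear mapping representable at all.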
