
Perceptron

A perceptron is the foundational building block of neural networks and one of the earliest models
in machine learning. Introduced by Frank Rosenblatt in 1957, it is a type of artificial neuron
that performs binary classification by mapping input features to an output
decision, typically 0 or 1.

 It’s the simplest form of a neural network.


 It’s particularly effective for linearly separable data.

Key components of a perceptron:

1. Inputs (x₁, x₂, ..., xₙ)


 What they are: Features/variables from your data (e.g., pixel values, temperature readings).
 Role: Raw data fed into the perceptron for processing.

2. Weights (w₁, w₂, ..., wₙ)


 What they are: Numerical values representing each input's importance.

 Role: Adjustable parameters that scale inputs during learning. Initially random, updated via
training to minimize errors.

3. Bias (b)

 What it is: A constant offset (like the y-intercept in linear equations).

 Role: Shifts the decision boundary away from the origin (0,0).

 Why needed: Without bias, the perceptron could only model lines passing through (0,0),
severely limiting its capability.

4. Summing Function (Σ)


 What it does: Computes the weighted sum z = w₁x₁ + w₂x₂ + ... + wₙxₙ + b.

 Purpose: Combines the weighted inputs and the bias into a single value z for classification.

5. Activation Function (σ)


 What it does: Converts the weighted sum (z) into an output (typically binary): 1 if z > 0 and 0 otherwise (the unit-step function).
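The threshold just described can be sketched as a tiny Python function (the z > 0 convention is an assumption; some texts fire at z ≥ 0):

```python
def step(z):
    """Unit-step activation: output 1 when the weighted sum is positive."""
    return 1 if z > 0 else 0

step(0.7)   # -> 1
step(-0.3)  # -> 0
```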

How Components Work Together

 Inputs are multiplied by their weights.

 Products are summed with the bias to compute z.

 The activation function maps z to a final output (0 or 1).
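The three steps above can be sketched in Python; the weights and bias in the example are illustrative values chosen so the unit happens to compute AND, not values given in the text:

```python
def perceptron_output(inputs, weights, bias):
    # 1-2. Multiply each input by its weight, sum the products, add the bias.
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    # 3. The step activation maps z to a binary output.
    return 1 if z > 0 else 0

# Illustrative parameters (an assumption): w = [1.0, 1.0], b = -1.5
print(perceptron_output([1, 1], [1.0, 1.0], -1.5))  # 1
print(perceptron_output([1, 0], [1.0, 1.0], -1.5))  # 0
```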


Problem: Train a Perceptron to Implement the AND Gate

The perceptron's output is σ(z), where z = w₁x₁ + w₂x₂ + b.
Tasks:
1. Compute the perceptron’s output for all 4 input combinations.

2. Identify which inputs (if any) are misclassified.

3. Update the weights and bias using the Perceptron Learning Rule: wᵢ ← wᵢ + η(y − ŷ)xᵢ and b ← b + η(y − ŷ), where η is the learning rate, y the target, and ŷ the perceptron's output.
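A minimal training loop for these tasks might look like the following sketch; zero initial weights, a learning rate of 0.1, and the z > 0 threshold are assumptions, since the problem statement does not fix them:

```python
def train_perceptron(samples, lr=0.1, epochs=20):
    """Train a 2-input perceptron with the perceptron learning rule.

    `samples` is a list of ((x1, x2), target) pairs. Zero initial
    weights/bias and lr=0.1 are assumptions, not given in the problem.
    """
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        converged = True
        for (x1, x2), y in samples:
            y_hat = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
            error = y - y_hat
            if error != 0:
                # Misclassified: nudge each parameter toward the target.
                w1 += lr * error * x1
                w2 += lr * error * x2
                b += lr * error
                converged = False
        if converged:  # a full pass with no errors: training is done
            break
    return w1, w2, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = train_perceptron(AND)
# After training, all four AND rows classify correctly.
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop terminates with a correct classifier.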
Example 2: Train a perceptron on the OR logic function step-by-step
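A step-by-step trace for OR could be sketched like this; zero initial parameters and a learning rate of 1 are assumptions, chosen so every update stays an integer and is easy to follow by hand:

```python
# Step-by-step perceptron training on the OR truth table.
# Assumptions (not fixed by the text): zero initial weights/bias, lr = 1.
OR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w1, w2, b, lr = 0, 0, 0, 1

for epoch in range(1, 11):
    errors = 0
    for (x1, x2), y in OR:
        y_hat = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
        if y_hat != y:                      # misclassified: apply the rule
            w1 += lr * (y - y_hat) * x1
            w2 += lr * (y - y_hat) * x2
            b += lr * (y - y_hat)
            errors += 1
        print(f"epoch {epoch}: ({x1},{x2}) -> {y_hat} (target {y}), "
              f"w1={w1}, w2={w2}, b={b}")
    if errors == 0:                         # clean pass: stop training
        break
```

With these assumptions the trace settles on w1 = 1, w2 = 1, b = 0, which classifies all four OR rows correctly.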
