
Artificial Neural Networks (ANN)

&
Biological Neural Networks (BNN)

Bridging Biology & Artificial Intelligence

By Sukumar
What is a Neural Network?

Neural networks are machine learning models that mimic the complex functions of the human brain. These models consist of interconnected nodes, or neurons, that process data, learn patterns, and enable tasks such as pattern recognition and decision-making.
Types of Neural Networks
• Perceptron
• Feedforward Neural Network (FNN)
• Radial Basis Function Network (RBFN)
• Multilayer Perceptron (MLP)
• Convolutional Neural Network (CNN)
• Recurrent Neural Network (RNN)
• Long Short-Term Memory (LSTM)
• Gated Recurrent Unit (GRU)
• Modular Neural Network (MNN)
• Self-Organizing Map (SOM)
• Generative Adversarial Network (GAN)
• Autoencoder
Biological Neural Networks (BNN)
A biological neural network (BNN) is a complex structure composed of neurons, synapses,
dendrites, cell bodies, and axons. These networks are fundamental to the functioning of the
nervous system, enabling the processing and transmission of information through electrical and
chemical signals.
• Soma (Cell Body) → The neuron’s core;
integrates incoming signals and generates an
output if the combined input exceeds the firing threshold.
• Dendrites → Branch-like structures that receive
incoming signals from other neurons.
• Axon → A long projection that carries the
electrical signal (action potential) away from the
soma.
• Synapse → The junction between two neurons
(axon terminal of one, dendrite of another);
where chemical/electrical signals are
transmitted.
• Nucleus → Inside the soma; contains genetic
material and controls neuron function.

Think of it like an electrical circuit: dendrites = inputs, soma = processor, axon = wire, synapses = outputs.
Biological Neural Network Flow
• Dendrites (Input Reception)
Receive chemical/electrical signals from other neurons.
• Soma (Integration/Processing)
Collects and sums all incoming signals (excitatory + inhibitory).
• Threshold Check (Decision Point)
If total input > threshold → neuron fires an action potential.
If not → no signal sent.
• Axon (Signal Transmission)
Action potential travels rapidly down the axon (boosted by myelin sheath).
• Axon Terminals (Output Release)
Neurotransmitters released into the synaptic gap.
• Synapse (Signal Transfer)
Neurotransmitters bind to receptors of the next neuron’s dendrites.
• Next Neuron Activated → Process Repeats
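The flow above behaves like a simple threshold unit, and can be sketched in a few lines of Python. The weights and threshold below are illustrative values, not biological measurements:

```python
# Toy model of the dendrite -> soma -> threshold -> axon flow.
# Positive weights play the role of excitatory synapses,
# negative weights the role of inhibitory ones.

def neuron_fires(inputs, weights, threshold):
    # Soma: sum all weighted incoming signals.
    total = sum(i * w for i, w in zip(inputs, weights))
    # Threshold check: fire an "action potential" only if exceeded.
    return total > threshold

# Two excitatory inputs and one inhibitory input.
print(neuron_fires([1.0, 1.0, 1.0], [0.6, 0.5, -0.3], threshold=0.5))  # True
print(neuron_fires([1.0, 0.0, 1.0], [0.6, 0.5, -0.3], threshold=0.5))  # False
```

If the neuron fires, its output would become one of the inputs to the next neuron, repeating the cycle.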
Key Characteristics
• Complexity and Adaptability: BNNs are highly complex and adaptable systems
capable of processing information in parallel. Their plasticity allows them to learn
and adapt over time.
• Parallel Processing: BNNs process information in a massively parallel and distributed
manner, whereas artificial neural networks (ANNs) update their layers in synchronous,
sequential steps.
• Fault Tolerance: BNNs are robust and fault-tolerant, meaning they can continue to
function even when some neurons are damaged.
Artificial Neural Networks (ANN)
Artificial Neural Networks (ANNs) are computer systems designed to mimic how the human brain processes information. Just as the brain uses neurons to process data and make decisions, ANNs use artificial neurons to analyze data, identify patterns, and make predictions.
Key Components of an ANN
• Input Layer: This is where the network receives information.
• Hidden Layers: These layers process the data received from the input layer. The more hidden layers there are, the more complex the patterns the network can learn and understand.
• Output Layer: This is where the final decision or prediction is made.
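These three layers can be sketched as a minimal forward pass in plain Python. The layer sizes, random weights, and sigmoid activation here are illustrative choices, not a prescribed architecture:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    # Each node sums its weighted inputs plus a bias,
    # then applies the activation function.
    return [sigmoid(sum(i * w for i, w in zip(inputs, ws)) + b)
            for ws, b in zip(weights, biases)]

random.seed(0)
x = [0.5, 0.9]                                    # input layer: raw data
w_hidden = [[random.uniform(-1, 1) for _ in x] for _ in range(3)]
h = layer(x, w_hidden, [0.0, 0.0, 0.0])           # hidden layer: 3 nodes
w_out = [[random.uniform(-1, 1) for _ in h]]
y = layer(h, w_out, [0.0])                        # output layer: prediction
print(y)
```

Adding more hidden layers simply means feeding the output of one `layer` call into the next.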
Working of Artificial Neural Networks
ANNs learn patterns in data through a process called training. During training, the network adjusts itself to improve its accuracy by comparing its predictions with the actual results.
Backpropagation is the process used to adjust the weights between neurons: when the network makes a mistake, the weights are updated to reduce the error and improve the next prediction.
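The update step can be illustrated with a single linear neuron trained by gradient descent (the simplest case of the weight-adjustment idea above; the training data and learning rate here are made up for the example):

```python
# Train one linear neuron y = w*x + b to fit the rule y = 2x,
# using squared error and gradient descent.
w, b = 0.0, 0.0
lr = 0.1  # learning rate
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

for epoch in range(200):
    for x, target in data:
        pred = w * x + b
        error = pred - target      # compare prediction with the actual result
        w -= lr * error * x        # update the weight to reduce the error
        b -= lr * error            # update the bias the same way
print(round(w, 2), round(b, 2))    # converges close to w = 2.0, b = 0.0
```

In a real multi-layer network, backpropagation applies this same "nudge each weight against its error gradient" idea to every layer, using the chain rule to pass the error backwards.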
Key Characteristics
• Adaptive Learning – they self-tune weights through exposure to data, mimicking
how humans learn from mistakes.
• Non-linearity – can capture complex, nonlinear relationships that traditional
statistical models miss.
• Parallelism – computations happen across many nodes simultaneously, echoing
brain-like distributed processing.
• Fault Tolerance – minor damage (such as losing a few weights or adding noise) doesn’t
break the network; resilience is built in.
Artificial neurons vs Biological neurons

Aspect: Structure
• Biological Neurons – Dendrites receive signals from other neurons; the cell body (soma) processes the signals; the axon transmits processed signals to other neurons.
• Artificial Neurons – Input nodes receive data and pass it on to the next layer; hidden-layer nodes process and transform the data; output nodes produce the final result after processing.

Aspect: Connections
• Biological Neurons – Synapses: links between neurons that transmit signals.
• Artificial Neurons – Weights: connections between neurons that control the influence of one neuron on another.

Aspect: Learning Mechanism
• Biological Neurons – Synaptic plasticity: changes in synaptic strength based on activity over time.
• Artificial Neurons – Backpropagation: adjusts the weights based on errors in predictions to improve future performance.
Artificial neurons vs Biological neurons (continued)

Aspect: Activation
• Biological Neurons (BNN) – Neurons fire when signals are strong enough to reach a threshold.
• Artificial Neurons (ANN) – An activation function maps input to output, deciding whether the neuron should fire based on the processed data.

Aspect: Processing
• BNN – Asynchronous, massively parallel, event-driven.
• ANN – Synchronous, layered feedforward/backward updates.

Aspect: Adaptability
• BNN – Learns continuously, rewires after damage, integrates emotions/context.
• ANN – Learns on batch data; brittle when out-of-distribution.

Aspect: Data Requirements
• BNN – Learns from few examples (a child recognizes a dog after seeing 2–3).
• ANN – Needs thousands to millions of samples.

Aspect: Energy Efficiency
• BNN – ~20 W to run an entire brain.
• ANN – Requires kilowatts or more on GPUs/TPUs for deep nets.
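The activation contrast in the comparison above can be shown directly: a biological neuron is closer to an all-or-nothing step, while artificial neurons usually use a smooth function like the sigmoid so that backpropagation can compute gradients (both functions below are standard textbook forms, shown for illustration):

```python
import math

def step(x, threshold=0.0):
    # Biological-style all-or-nothing firing at a threshold.
    return 1 if x > threshold else 0

def sigmoid(x):
    # Smooth, differentiable alternative used in ANNs,
    # mapping any input to a value between 0 and 1.
    return 1.0 / (1.0 + math.exp(-x))

print(step(0.3), round(sigmoid(0.3), 3))
print(step(-0.3), round(sigmoid(-0.3), 3))
```

The step function is either fully on or fully off, whereas the sigmoid reports *how strongly* the neuron is activated, which is what gradient-based learning needs.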
Thank You