
Artificial Neural Networks (ANN)

Dr. Md. Golam Rabiul Alam


Associate Professor, CSE
Artificial Neural Network (ANN / NN)
An artificial neural network is a computational and learning model inspired by biological neural networks. It consists of many interconnected processing elements (neurons) that receive inputs and produce outputs through their activation functions.
Human Brain
Neuron
Neural Network
Neural Network
Training in Neural Networks

Forward Propagation
Backward Propagation
Forward Propagation
Example

[Network diagram: inputs x1 and x2 feed hidden neurons h1, h2, h3 through weights w1-w6; the hidden neurons feed the output neuron through weights w7, w8, w9.]

h1 = x1*w1 + x2*w4
h2 = x1*w2 + x2*w5
h3 = x1*w3 + x2*w6
sum = sigmoid(h1)*w7 + sigmoid(h2)*w8 + sigmoid(h3)*w9
calculated = sigmoid(sum)
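As a rough sketch of this forward pass (the function and variable names are mine, not from the lecture):

import math

def sigmoid(x):
    # Logistic activation: squashes any real number into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def forward(x1, x2, w):
    # w holds the nine weights w1..w9 from the diagram
    h1 = x1 * w['w1'] + x2 * w['w4']
    h2 = x1 * w['w2'] + x2 * w['w5']
    h3 = x1 * w['w3'] + x2 * w['w6']
    s = sigmoid(h1) * w['w7'] + sigmoid(h2) * w['w8'] + sigmoid(h3) * w['w9']
    return (h1, h2, h3), s, sigmoid(s)   # hidden sums, output sum, calculated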
Example
Example
h1 = x1*w1 + x2*w4
h2 = x1*w2 + x2*w5
h3 = x1*w3 + x2*w6

With inputs x1 = 1 and x2 = 1, the hidden-layer weighted sums work out to h1 = 1.0, h2 = 1.3 and h3 = 0.8. Apply the activation function to each:

Sigmoid(1.0) = 0.73105857863
Sigmoid(1.3) = 0.78583498304
Sigmoid(0.8) = 0.68997448112
Example
h1 = x1*w1 + x2*w4
h2 = x1*w2 + x2*w5
h3 = x1*w3 + x2*w6

Sigmoid(1.0) = 0.73105857863
Sigmoid(1.3) = 0.78583498304
Sigmoid(0.8) = 0.68997448112

sum = Sigmoid(h1)*w7 + Sigmoid(h2)*w8 + Sigmoid(h3)*w9
calculated = Sigmoid(sum)

With w7 = 0.3, w8 = 0.5 and w9 = 0.9:

sum = 0.73 * 0.3 + 0.79 * 0.5 + 0.69 * 0.9 = 1.235
calculated = Sigmoid(1.235) = 0.7746924929149283
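A quick numeric check, continuing the sketch above. The slides do not list the input-to-hidden weights w1-w6, so the values below are one assumed choice that reproduces h1 = 1.0, h2 = 1.3, h3 = 0.8 for x1 = x2 = 1; w7-w9 are the 0.3, 0.5, 0.9 used in the sum:

# w1..w6 are assumed (any values with w1+w4 = 1.0, w2+w5 = 1.3, w3+w6 = 0.8 work here);
# w7..w9 come from the slides.
weights = {'w1': 0.8, 'w2': 0.4, 'w3': 0.3,
           'w4': 0.2, 'w5': 0.9, 'w6': 0.5,
           'w7': 0.3, 'w8': 0.5, 'w9': 0.9}
(h1, h2, h3), out_sum, calculated = forward(1, 1, weights)
print(h1, h2, h3)   # 1.0 1.3 0.8
print(out_sum)      # ~1.2332 (the slide gets 1.235 by rounding the hidden outputs to 0.73, 0.79, 0.69)
print(calculated)   # ~0.774 (the slide's 0.7747 is the sigmoid of the rounded 1.235)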


Backward Propagation
Example

Target = 0
Calculated = 0.77
Error = Target - Calculated = -0.77
Example
The derivative of the sigmoid, also known as sigmoid prime, gives the rate of change (the "slope") of the activation function at the output sum: S'(x) = S(x) * (1 - S(x)).

sum = sigmoid(h1)*w7 + sigmoid(h2)*w8 + sigmoid(h3)*w9
calculated = sigmoid(sum)
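A small check of sigmoid prime against the numbers used on the next slides, continuing the earlier sketch:

def sigmoid_prime(x):
    # Derivative (slope) of the logistic sigmoid: S(x) * (1 - S(x))
    s = sigmoid(x)
    return s * (1.0 - s)

print(sigmoid_prime(1.235))                                    # ~0.1745
print([round(sigmoid_prime(h), 4) for h in (1.0, 1.3, 0.8)])   # [0.1966, 0.1683, 0.2139]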
Example

sum = s(h1)*w7 + s(h2)*w8 + s(h3)*w9
calculated = s(sum)

Delta output sum = S'(sum) * (output sum margin of error)

Delta output sum = S'(1.235) * (-0.77)
Delta output sum = -0.13439890643886018

Delta weights = delta output sum / hidden layer results

Delta weights = -0.1344 / [0.73105, 0.78583, 0.68997]
Delta weights = [-0.1838, -0.1710, -0.1948]
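These deltas can be reproduced in a few lines, continuing the sketch above and following the slides' simplified rule of dividing the output delta by the hidden-layer results:

target = 0.0
output = 0.77                                                  # "calculated", rounded as on the slide
delta_output_sum = sigmoid_prime(1.235) * (target - output)    # ~ -0.1344

hidden_results = [0.73105857863, 0.78583498304, 0.68997448112]
delta_w_out = [delta_output_sum / h for h in hidden_results]   # deltas for w7, w8, w9
print([round(d, 4) for d in delta_w_out])                      # [-0.1838, -0.171, -0.1948]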
Example

old w7 = 0.3
old w8 = 0.5
old w9 = 0.9

w_new = w_old + Delta weights

new w7 = 0.3 + (-0.1838) = 0.1162
new w8 = 0.5 + (-0.1710) = 0.329
new w9 = 0.9 + (-0.1948) = 0.7052
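And the corresponding update of the hidden-to-output weights, continuing the snippet above:

old_w_out = [0.3, 0.5, 0.9]                       # w7, w8, w9
new_w_out = [w + d for w, d in zip(old_w_out, delta_w_out)]
print([round(w, 4) for w in new_w_out])           # [0.1162, 0.329, 0.7052]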
Example
Example
Delta hidden sum = delta output sum / hidden-to-output weights * S'(hidden sum)
Delta hidden sum = -0.1344 / [0.3, 0.5, 0.9] * S'([1.0, 1.3, 0.8])
Delta hidden sum = [-0.448, -0.2688, -0.1493] * [0.1966, 0.1683, 0.2139]
Delta hidden sum = [-0.088, -0.0452, -0.0319]
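Continuing the sketch, the hidden-layer deltas follow the slides' formula directly:

w_out = [0.3, 0.5, 0.9]                           # old hidden-to-output weights
hidden_sums = [1.0, 1.3, 0.8]
delta_hidden_sum = [delta_output_sum / w * sigmoid_prime(h)
                    for w, h in zip(w_out, hidden_sums)]
print([round(d, 4) for d in delta_hidden_sum])    # [-0.0881, -0.0452, -0.0319]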
Example
Example
input 1 = 1 (x1)
input 2 = 1 (x2)

Delta weights = delta hidden sum / input data
Delta weights = [-0.088, -0.0452, -0.0319] / [1, 1]
Delta weights = [-0.088, -0.0452, -0.0319, -0.088, -0.0452, -0.0319]

w_new = w_old + Delta weights
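Finally, the input-to-hidden update, again following the slides' formula. The slides do not show the starting values of w1-w6, so the ones below are the same assumed values used in the forward-pass check:

inputs = [1.0, 1.0]                               # x1, x2
# One delta per weight, ordered w1, w2, w3 (from x1) then w4, w5, w6 (from x2)
delta_w_in = [d / x for x in inputs for d in delta_hidden_sum]
print([round(d, 4) for d in delta_w_in])          # [-0.0881, -0.0452, -0.0319, -0.0881, -0.0452, -0.0319]

old_w_in = [0.8, 0.4, 0.3, 0.2, 0.9, 0.5]         # w1..w6 (assumed, not from the slides)
new_w_in = [w + d for w, d in zip(old_w_in, delta_w_in)]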

Example

w_new = w_old + Delta weights


Example
ANN with bias
Step-by-step ANN
Step-by-step ANN
Step-by-step ANN
Step-by-step ANN
Step-by-step ANN
Step-by-step ANN
Step-by-step ANN
Step-by-step ANN
Step-by-step ANN
Step-by-step ANN
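The "ANN with bias" and "Step-by-step ANN" slides are worked through in figures. As a hedged summary (a sketch, not the lecture's own code), one complete training step for a 2-3-1 network with bias terms looks roughly like this, using the standard gradient-descent update that multiplies by the activations instead of the simplified division used in the numeric example above:

import numpy as np

def train_step(x, target, W1, b1, W2, b2, lr=0.5):
    # Forward pass: each neuron adds a bias to its weighted sum
    h_out = 1 / (1 + np.exp(-(W1 @ x + b1)))      # hidden activations
    o_out = 1 / (1 + np.exp(-(W2 @ h_out + b2)))  # network output

    # Backward pass: gradients of the squared error through the sigmoids
    delta_o = (o_out - target) * o_out * (1 - o_out)
    delta_h = (W2.T @ delta_o) * h_out * (1 - h_out)

    # Gradient-descent updates for weights and biases (in place)
    W2 -= lr * np.outer(delta_o, h_out)
    b2 -= lr * delta_o
    W1 -= lr * np.outer(delta_h, x)
    b1 -= lr * delta_h
    return o_out

# Example: drive the output toward 0 for input (1, 1), starting from random weights
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)
for _ in range(1000):
    out = train_step(np.array([1.0, 1.0]), np.array([0.0]), W1, b1, W2, b2)
print(out)   # close to 0 after training

Repeating such a step over many examples is what the step-by-step slides illustrate.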
References
https://adeshpande3.github.io/adeshpande3.github.io/A-Beginner%27s-Guide-To-Understanding-Convolutional-Neural-Networks/

https://youtu.be/GlcnxUlrtek

https://pythonprogramming.net/cnn-tensorflow-convolutional-nerual-network-machine-learning-tutorial/

https://github.com/walsvid/GoogLeNet-TensorFlow

https://github.com/taki0112/ResNet-Tensorflow

https://github.com/huyng/tensorflow-vgg

https://github.com/kratzert/finetune_alexnet_with_tensorflow

https://dialogflow.com/

https://www.youtube.com/watch?v=a8JwTqByefU

https://www.simplilearn.com/incredible-machine-learning-applications-article

https://www.youtube.com/watch?v=hPKJBXkyTKM&vl=en
