Recurrent Neural Network (RNN) Tutorial: Types, Examples, LSTM and More
Lesson 14 of 18 | By Avijeet Biswal
Last updated on Aug 11, 2022
Table of Contents
What Is a Neural Network?
What Is a Recurrent Neural Network (RNN)?
Why Recurrent Neural Networks?
How Do Recurrent Neural Networks Work?
Feed-Forward Neural Networks vs Recurrent Neural Networks
Neural networks are among the most popular machine learning algorithms and often outperform
other algorithms in both accuracy and speed. It is therefore important to have an in-depth
understanding of what a neural network is, how it is made up, and what its reach and limitations are.
What Is a Neural Network?
A neural network consists of different layers connected to each other, modeled loosely on the
structure and function of the human brain. It learns from huge volumes of data and uses complex
algorithms to train itself.
Here is an example of how neural networks can identify a dog's breed based on its features.
The image pixels of two different breeds of dogs are fed to the input layer of the neural network.
The image pixels are then processed in the hidden layers for feature extraction.
The output layer produces the result, identifying whether the image shows a German Shepherd or a Labrador.
Such networks do not need to memorize past outputs.
Several neural networks can help solve different business problems. Let's look at a few of them.
* Feed-Forward Neural Network: Used for general regression and classification problems.
* Convolutional Neural Network: Used for object detection and image classification.
* Deep Belief Network: Used in healthcare sectors for cancer detection.
* RNN: Used for speech recognition, voice recognition, time series prediction, and natural language processing.
Read More: What is Neural Network: Overview, Applications, and Advantages
What Is a Recurrent Neural Network (RNN)?
An RNN works on the principle of saving the output of a particular layer and feeding it back to the
input in order to predict the output of the layer.
Below is how you can convert a Feed-Forward Neural Network into a Recurrent Neural Network:
Fig: Simple Recurrent Neural Network
The nodes in different layers of the neural network are compressed to form a single layer of
recurrent neural networks. A, B, and C are the parameters of the network.
Fig: Fully connected Recurrent Neural Network (A, B, and C are the parameters)
Here, "x" is the input layer, "h" is the hidden layer, and "y" is the output layer. A, B, and C are the
network parameters used to improve the output of the model. At any given time t, the hidden state is
computed from the current input x(t) and the previous hidden state h(t-1). The output at each time
step is fed back into the network to improve the next output.
Fig: Fully connected Recurrent Neural Network, unrolled over time steps t-1, t, and t+1
Now that you understand what a recurrent neural network is, let's look at the different types of
recurrent neural networks.
Read More: An Ultimate Tutorial to Neural Networks
Why Recurrent Neural Networks?
RNNs were created to address a few issues with the feed-forward neural network, which:
* Cannot handle sequential data
* Considers only the current input
* Cannot memorize previous inputs
The solution to these issues is the RNN. An RNN can handle sequential data, accepting the current
input data as well as previously received inputs. RNNs can memorize previous inputs thanks to their
internal memory.
How Do Recurrent Neural Networks Work?
In Recurrent Neural networks, the information cycles through a loop to the middle hidden layer.
Fig: Working of Recurrent Neural Network, unrolled over successive time steps
The input layer 'x' takes in the input to the neural network, processes it, and passes it on to the
middle layer.
The middle layer 'h' can consist of multiple hidden layers, each with its own activation functions,
weights, and biases. If you have a neural network where the parameters of the different hidden
layers are not affected by the previous time step, i.e., the network has no memory, then you can use
a recurrent neural network instead.
The recurrent neural network standardizes the activation functions, weights, and biases so that
each hidden layer has the same parameters. Then, instead of creating multiple hidden layers, it
creates one and loops over it as many times as required.
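The weight-sharing loop described above can be sketched in a few lines of NumPy. This is a toy illustration, not the article's implementation; the sizes and weight names (Wxh, Whh, Why) are assumptions made for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 4-dim inputs, 8-dim hidden state, 3-dim outputs
input_size, hidden_size, output_size = 4, 8, 3

# ONE set of parameters, reused at every time step (the "single looped layer")
Wxh = rng.normal(0, 0.1, (hidden_size, input_size))   # input -> hidden
Whh = rng.normal(0, 0.1, (hidden_size, hidden_size))  # hidden -> hidden (the loop)
Why = rng.normal(0, 0.1, (output_size, hidden_size))  # hidden -> output

def rnn_forward(xs):
    """Run the same cell over each time step, carrying the hidden state."""
    h = np.zeros(hidden_size)
    ys = []
    for x in xs:                        # one iteration of the loop per time step
        h = np.tanh(Wxh @ x + Whh @ h)  # h(t) depends on x(t) AND h(t-1): memory
        ys.append(Why @ h)              # output at this step
    return np.array(ys), h

xs = rng.normal(size=(5, input_size))   # a sequence of 5 input vectors
ys, h_final = rnn_forward(xs)
print(ys.shape)  # (5, 3): one output per time step
```

Note that the same three weight matrices are applied at every step; only the hidden state h changes, which is what gives the network its memory of earlier inputs.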
Feed-Forward Neural Networks vs Recurrent Neural Networks
A feed-forward neural network allows information to flow only in the forward direction, from the
input nodes, through the hidden layers, and to the output nodes. There are no cycles or loops in the
network.
Below is a simplified representation of a feed-forward neural network:
Fig: Feed-forward Neural Network
In a feed-forward neural network, decisions are based on the current input only. It doesn't memorize
past data, and there is no notion of future context. Feed-forward neural networks are used in general
regression and classification problems.
Applications of Recurrent Neural Networks
Image Captioning
RNNs are used to caption an image by analyzing the activities present in it.
Fig: Image captioning, "A dog catching a ball in mid-air"
Time Series Prediction
Any time series problem, like predicting the prices of stocks in a particular month, can be solved
using an RNN.
Natural Language Processing
Text mining and Sentiment analysis can be carried out using an RNN for Natural Language
Processing (NLP).
"When it rains, look for rainbows. When it's dark, look for stars."
Machine Translation
Given an input in one language, RNNs can be used to translate the input into different languages as
output.
Fig: Machine Translation, where input given in English is translated into Chinese, Italian, French, German, and Spanish
Types of Recurrent Neural Networks
There are four types of Recurrent Neural Networks:
1. One to One
2. One to Many
3. Many to One
4. Many to Many
One to One RNN
This type of neural network is known as the Vanilla Neural Network. It's used for general machine
learning problems that have a single input and a single output.
Fig: One to One RNN, with a single input and a single output
One to Many RNN
This type of neural network has a single input and multiple outputs. An example of this is image
captioning.
Fig: One to Many RNN, with a single input and multiple outputs
Many to One RNN
This type of neural network takes a sequence of inputs and generates a single output. Sentiment
analysis is a good example, where a given sentence is classified as expressing positive or negative
sentiment.
Fig: Many to One RNN, with multiple inputs and a single output
Many to Many RNN
This RNN takes a sequence of inputs and generates a sequence of outputs. Machine translation is
one example.
Two Issues of Standard RNNs
1. Vanishing Gradient Problem
Recurrent neural networks enable you to model time-dependent and sequential data problems, such
as stock market prediction, machine translation, and text generation. You will find, however, that
RNNs are hard to train because of the gradient problem.
RNNs suffer from the problem of vanishing gradients. The gradients carry information used in the
RNN, and when the gradient becomes too small, the parameter updates become insignificant. This
makes learning long data sequences difficult.
2. Exploding Gradient Problem
While training a neural network, if the slope tends to grow exponentially instead of decaying, this is
called an exploding gradient. This problem arises when large error gradients accumulate, resulting
in very large updates to the neural network model weights during training.
Long training time, poor performance, and bad accuracy are the major issues caused by gradient problems.
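A rough numerical intuition for both problems (a simplified sketch, assuming the per-step derivative stays roughly constant): backpropagating through T time steps multiplies T per-step derivatives together, so the product shrinks toward zero when each factor is below 1 and blows up when it is above 1:

```python
def gradient_through_time(per_step_derivative, steps):
    """Product of identical per-step derivatives over `steps` time steps.
    A stand-in for the chain-rule product in backpropagation through time."""
    return per_step_derivative ** steps

# |derivative| < 1: the gradient vanishes over long sequences
print(gradient_through_time(0.9, 100))  # ~2.7e-05, updates become insignificant

# |derivative| > 1: the gradient explodes
print(gradient_through_time(1.1, 100))  # ~13780.6, weight updates become huge
```

Even a factor only slightly away from 1 produces an extreme result after 100 steps, which is why plain RNNs struggle with long sequences.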
Gradient Problem Solutions
Now, let's discuss the most popular and efficient way to deal with gradient problems: Long
Short-Term Memory networks (LSTMs).
First, let's understand Long-Term Dependencies.
Suppose you want to predict the last word in the text: "The clouds are in the ___."
The most obvious answer is "sky." We do not need any further context to predict the last
word in the above sentence.
Consider this sentence: "I have been staying in Spain for the last 10 years... I can speak fluent ___."
The word you predict will depend on the previous few words in context. Here, you need the context
of Spain to predict the last word in the text, and the most suitable answer is "Spanish." The gap
between the relevant information and the point where it's needed may become very large. LSTMs
help you solve this problem.
Backpropagation Through Time
Backpropagation through time is when we apply the backpropagation algorithm to a recurrent neural
network that has time series data as its input.
In a typical RNN, one input is fed into the network at a time, and a single output is obtained. But in
backpropagation, you use the current as well as the previous inputs as input. This is called a
timestep, and one timestep will consist of many time series data points entering the RNN
simultaneously.
Once the neural network has trained on a timestep and given you an output, that output is used to
calculate and accumulate the errors. After this, the network is rolled back up, and the weights are
recalculated and updated, keeping the errors in mind.
Long Short-Term Memory Networks
LSTMs are a special kind of RNN, capable of learning long-term dependencies; remembering
information for long periods is their default behavior.
All RNNs take the form of a chain of repeating modules of a neural network. In standard RNNs, this
repeating module has a very simple structure, such as a single tanh layer.
Fig: Long Short-Term Memory Networks
LSTMs also have a chain-like structure, but the repeating module is built differently. Instead
of a single neural network layer, there are four interacting layers communicating in a special way.
Workings of LSTMs in RNN
LSTMs work in a 3-step process.
Step 1: Decide How Much Past Data It Should Remember
The first step in the LSTM is to decide which information should be omitted from the cell in that
particular time step. The sigmoid function determines this. It looks at the previous state h(t-1) along
with the current input x(t) and computes the function.
Consider the following two sentences:
Let the output of h(t-1) be “Alice is good in Physics. John, on the other hand, is good at Chemistry.”
Let the current input at x(t) be "John plays football well. He told me yesterday over the phone that he
had served as the captain of his college football team.”
The forget gate realizes there might be a change in context after encountering the first full stop. It
compares with the current input sentence at x(t). The next sentence talks about John, so the
information on Alice is deleted. The position of the subject is vacated and assigned to John.
Step 2: Decide How Much This Unit Adds to the Current State
In the second layer, there are two parts. One is the sigmoid function, and the other is the tanh
function. The sigmoid function decides which values to let through (0 or 1), and the tanh function
gives weight to the values that are passed, deciding their level of importance (-1 to 1).
With the current input at x(t), the input gate analyzes the important information — John plays
football, and the fact that he was the captain of his college team is important.
"He told me yesterday over the phone" is less important; hence it's forgotten. This process of adding
some new information can be done via the input gate.
Step 3: Decide What Part of the Current Cell State Makes It to the Output
The third step is to decide what the output will be. First, we run a sigmoid layer, which decides what
parts of the cell state make it to the output. Then, we put the cell state through tanh to push the
values to be between -1 and 1 and multiply it by the output of the sigmoid gate.
Let's consider this example to predict the next word in the sentence: “John played tremendously well
against the opponent and won for his team. For his contributions, brave _ was awarded player of
the match.”
There could be many choices for the empty space. The current input "brave" is an adjective, and
adjectives describe a noun. So, "John" could be the best output after "brave."
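The three steps above map onto the forget, input, and output gates of an LSTM cell. Here is a minimal single-step sketch in NumPy; the stacked weight layout and all sizes are illustrative assumptions, not any particular library's API:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step. W maps [h_prev; x] to the four gate pre-activations,
    stacked as (forget, input, candidate, output). Layout is an assumption."""
    z = W @ np.concatenate([h_prev, x]) + b
    H = h_prev.size
    f = sigmoid(z[0*H:1*H])   # Step 1: forget gate, how much past state to keep
    i = sigmoid(z[1*H:2*H])   # Step 2: input gate, how much new info to add
    g = np.tanh(z[2*H:3*H])   #         candidate values, scaled to (-1, 1)
    o = sigmoid(z[3*H:4*H])   # Step 3: output gate, what part of the state to emit
    c = f * c_prev + i * g    # new cell state: kept past plus gated new info
    h = o * np.tanh(c)        # new hidden state / output
    return h, c

rng = np.random.default_rng(0)
H, X = 6, 4                   # hypothetical hidden and input sizes
W = rng.normal(0, 0.1, (4 * H, H + X))
b = np.zeros(4 * H)
h, c = lstm_step(rng.normal(size=X), np.zeros(H), np.zeros(H), W, b)
print(h.shape, c.shape)       # (6,) (6,)
```

Because the cell state c is updated additively through the forget gate rather than repeatedly squashed, gradients survive many more time steps than in a plain tanh RNN.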
LSTM Use Case
Now that you understand how LSTMs work, let's do a practical implementation to predict the prices
of stocks using the "Google stock price" data.
Based on the stock price data between 2012 and 2016, we will predict the stock prices of 2017.
1. Import the required libraries
2. Import the training dataset
3. Perform feature scaling to transform the data
4. Create a data structure with 60 timesteps and 1 output
5. Import the Keras library and its packages
6. Initialize the RNN
7. Add the LSTM layers and some dropout regularization.
8. Add the output layer.
9. Compile the RNN
10. Fit the RNN to the training set
11. Load the stock price test data for 2017
12. Get the predicted stock price for 2017
13. Visualize the results of predicted and real stock price
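Steps 3 and 4 above (feature scaling and the 60-timestep data structure) can be sketched without Keras. The price series here is synthetic stand-in data, since the Google stock CSV is not included:

```python
import numpy as np

def minmax_scale(values):
    """Step 3: scale values into the [0, 1] range (min-max feature scaling)."""
    lo, hi = values.min(), values.max()
    return (values - lo) / (hi - lo)

def make_windows(series, timesteps=60):
    """Step 4: for each day, use the previous `timesteps` prices as the input
    sequence and the current day's price as the single output to predict."""
    X, y = [], []
    for t in range(timesteps, len(series)):
        X.append(series[t - timesteps:t])
        y.append(series[t])
    return np.array(X), np.array(y)

prices = minmax_scale(np.linspace(100.0, 150.0, 300))  # synthetic stand-in data
X, y = make_windows(prices, timesteps=60)
print(X.shape, y.shape)  # (240, 60) (240,)
```

Each row of X holds 60 consecutive scaled prices, and the matching entry of y is the next day's price, which is what the LSTM in steps 6-10 would learn to predict.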
Looking forward to a successful career in AI and machine learning? Enroll in our AI and ML Post
Graduate Program in collaboration with Purdue University now.
Next Step to Success
You can also enroll in the Post Graduate Program in AI and Machine Learning with Purdue University,
in collaboration with IBM, and transform yourself into an expert in deep learning techniques
using TensorFlow, the open-source software library designed to conduct machine learning and deep
neural network research. This program covers Python, Machine Learning, Natural Language
Processing, Speech Recognition, Advanced Deep Learning, Computer Vision, and Reinforcement
Learning. It will prepare you for one of the world's most exciting technology frontiers.
Have any questions for us? Leave them in the comments section of this tutorial. Our experts will get
back to you on the same, as soon as possible.
About the Author
Avijeet Biswal
Avijeet is a Senior Research Analyst at Simplilearn. Passionate about Data Analytics, Machine
Learning, and Deep Learning, Avijeet is also interested in politics, cricket, and football.