RECURRENT NEURAL NETWORK
SYED ANNUS ALI SHAH F2016266257
RECURRENT NEURAL NETWORKS
 A Recurrent Neural Network (RNN) is a type of artificial neural network that takes the output of the
previous step and feeds it in as the input of the current step. Unlike in a traditional neural network, each
new output depends on the previously generated outputs. RNNs are widely used for sequence
generation; for example, to predict the next word of a sentence, the previous words are required, so the
network must remember what it has already generated. This need is met by the hidden
state of the RNN. It would not be wrong to say that the hidden state is the most important feature of an
RNN: it memorizes information about the sequence.
IMPORTANCE
 The importance of a Recurrent Neural Network is that you do not have to reason from scratch for every
instance; prior understanding and outputs carry over to the new instance, which a traditional
ANN cannot do.
 With an RNN, a machine can process sequences in a way that is closer to how a human thinks.
APPLICATIONS
Image Captioning
Poetry Generator
Chatbot
WORKING
 The working of an RNN can be understood through the following example:
 Suppose there is a deep network with one input layer, three hidden layers, and one output layer. Like
other neural networks, each hidden layer has its own set of weights and biases: say, (w1, b1) for
hidden layer 1, (w2, b2) for the second hidden layer, and (w3, b3) for the third. This means that the
layers are independent of each other, i.e. they do not memorize previous outputs.
WORKING (CONT.)
Now the RNN does the following:
 The RNN converts the independent activations into dependent activations by giving the same weights and
biases to all the layers. This reduces the complexity of an ever-growing parameter count, and each
previous output is memorized by feeding it as input to the next hidden step.
 Hence these three layers can be joined into a single recurrent layer, such that the weights and biases
of all the hidden layers are the same.
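The parameter saving from weight sharing can be illustrated with a quick count (the hidden width of 8 is purely hypothetical):

```python
# Three independent hidden layers each carry their own (W, b);
# tying them gives one recurrent layer reused at every step.
n = 8  # hypothetical hidden width

# Independent layers: 3 weight matrices + 3 bias vectors
independent_params = 3 * (n * n + n)

# Single recurrent layer: one shared (W, b), however many steps we unroll
shared_params = n * n + n

print(independent_params, shared_params)  # 216 vs 72
```

The saving grows with the number of steps: the recurrent layer's cost stays constant no matter how far the sequence is unrolled.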
FORMULAS
 To calculate the current state:
 h_t = f(h_{t-1}, x_t)
 where:
 h_t -> current state
 h_{t-1} -> previous state
 x_t -> input at the current time step
FORMULAS (CONT.)
 Formula for applying the activation function (tanh):
 h_t = tanh(W_hh * h_{t-1} + W_xh * x_t)
 where:
 W_hh -> weight at the recurrent neuron
 W_xh -> weight at the input neuron
FORMULAS (CONT.)
 Formula for calculating the output:
 y_t = W_hy * h_t
 where:
 y_t -> output
 W_hy -> weight at the output layer
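Putting the formulas together, one forward step can be sketched in NumPy; the dimensions and random weight values here are illustrative, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_h, n_out = 3, 4, 2  # hypothetical sizes

# The three weight matrices named above, randomly initialized
W_xh = rng.normal(size=(n_h, n_in))   # input -> hidden
W_hh = rng.normal(size=(n_h, n_h))    # hidden -> hidden (recurrent)
W_hy = rng.normal(size=(n_out, n_h))  # hidden -> output

h_prev = np.zeros(n_h)                # previous state h_{t-1}
x_t = rng.normal(size=n_in)           # current input

h_t = np.tanh(W_hh @ h_prev + W_xh @ x_t)  # current state
y_t = W_hy @ h_t                           # output
```

Because tanh squashes its argument, every component of h_t stays in (-1, 1) regardless of the input scale.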
TRAINING
 Training through an RNN
 A single time step of the input is provided to the network.
 The current state is calculated from the current input and the previous state.
 The current h_t becomes h_{t-1} for the next time step.
 One can go through as many time steps as the problem requires, joining the information from all the
previous states.
 Once all the time steps are complete, the final state is used to calculate the output.
 The output is compared to the actual (target) output, and the error is generated.
 The error is then back-propagated through the network to update the weights; hence the network (RNN) is
trained.
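The training steps above can be sketched as a minimal NumPy loop. The toy task (predicting the sum of a short sequence), the dimensions, and the learning rate are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_h = 1, 5  # hypothetical sizes

# Illustrative weights; names follow the formula slides
W_xh = rng.normal(scale=0.5, size=(n_h, n_in))
W_hh = rng.normal(scale=0.5, size=(n_h, n_h))
W_hy = rng.normal(scale=0.5, size=(1, n_h))

def forward(xs):
    hs = [np.zeros(n_h)]
    for x in xs:                          # one time step per input
        hs.append(np.tanh(W_hh @ hs[-1] + W_xh @ x))
    return hs, W_hy @ hs[-1]              # final state gives the output

lr, losses = 0.05, []
for epoch in range(500):
    xs = rng.normal(size=(4, 1))          # toy task: predict the sum
    target = xs.sum()
    hs, y = forward(xs)
    err = y - target                      # error vs. the target output
    losses.append(float(err[0]) ** 2)
    # back-propagate the error through time to update the weights
    dW_hy = np.outer(err, hs[-1])
    dh = (W_hy.T @ err).ravel()
    dW_xh, dW_hh = np.zeros_like(W_xh), np.zeros_like(W_hh)
    for t in reversed(range(len(xs))):
        dz = dh * (1 - hs[t + 1] ** 2)    # tanh derivative
        dW_xh += np.outer(dz, xs[t])
        dW_hh += np.outer(dz, hs[t])
        dh = W_hh.T @ dz                  # pass error to the earlier step
    W_hy -= lr * dW_hy
    W_xh -= lr * dW_xh
    W_hh -= lr * dW_hh
```

Note how the backward loop mirrors the forward unrolling: the same shared weights receive gradient contributions from every time step.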
ADVANTAGES AND DISADVANTAGES
 Advantages of Recurrent Neural Networks
 An RNN retains information through time, which makes it useful for time-series prediction because it
remembers previous inputs. The Long Short-Term Memory (LSTM) architecture is a variant of the RNN
designed to retain such information over long spans.
 Recurrent neural networks are even used with convolutional layers to extend the effective pixel
neighborhood.
 Disadvantages of Recurrent Neural Networks
 Vanishing and exploding gradient problems.
 Training an RNN is a difficult task.
 It cannot process very long sequences when tanh or ReLU is used as the activation function.
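The vanishing and exploding gradient problems follow from simple arithmetic: the backward pass multiplies by the recurrent weight (and the activation's derivative, which is at most 1 for tanh) once per time step, so over a long sequence the gradient shrinks or grows geometrically. A sketch with made-up weight magnitudes:

```python
# Repeated multiplication across 50 time steps with an effective
# recurrent weight below or above 1 in magnitude (values illustrative)
shrink, grow = 0.5, 1.5
g_vanish, g_explode = 1.0, 1.0
for _ in range(50):
    g_vanish *= shrink   # gradient fades toward zero
    g_explode *= grow    # gradient blows up

print(g_vanish, g_explode)  # ~8.9e-16 vs ~6.4e+08
```

This is why plain RNNs struggle with very long sequences, and why gated variants such as LSTMs were introduced.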
REFERENCES
 Medium
 https://medium.com/explore-artificial-intelligence/an-introduction-to-recurrent-neural-networks-72c97bf0912
 Wikipedia
 https://en.wikipedia.org/wiki/Recurrent_neural_network
 Geeks for Geeks
 https://www.geeksforgeeks.org/introduction-to-recurrent-neural-network/
Any Questions ???
