What is the main difference between RNN and LSTM | NLP | RNN vs LSTM


The main difference between an RNN and an LSTM is how long each can retain information in memory. Here the LSTM has the advantage: it can hold information in memory for much longer periods than a standard RNN. The question, then, is what is different in an LSTM that makes it capable of maintaining long-term temporal dependencies (remembering information for a long period of time).

In general, a set of gates controls the information held in memory: when it enters the memory, how much of it is kept and for how long, when it begins to contribute to the output, and when it begins to decay or be forgotten.


Recurrent Neural Networks (RNNs)

👉 RNNs have feedback loops in the recurrent layer, which let them maintain information in ‘memory’ over time. However, standard RNNs are difficult to train on problems that require learning long-term temporal dependencies.
👉 This is because the gradient of the loss function decays exponentially with time (the vanishing gradient problem).
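The decay described above can be seen numerically: backpropagation through time repeatedly multiplies the gradient by the recurrent Jacobian, so when the recurrent weights have small spectral radius the gradient norm shrinks exponentially. Here is a minimal NumPy sketch (illustrative only; the matrix sizes and scale are assumptions, not from the article):

```python
import numpy as np

# Illustrative setup: a small recurrent weight matrix whose spectral
# radius is well below 1, so repeated multiplication shrinks vectors.
rng = np.random.default_rng(0)
W_hh = rng.normal(scale=0.1, size=(16, 16))  # recurrent weights

grad = np.ones(16)  # gradient arriving at the final time step
norms = []
for t in range(50):
    # Backprop through one time step multiplies by the recurrent
    # Jacobian; tanh'(x) <= 1, so we bound that factor by 1 here.
    grad = W_hh.T @ grad
    norms.append(np.linalg.norm(grad))

print(norms[0], norms[-1])  # the norm collapses toward zero
```

After 50 steps the gradient norm is vanishingly small, which is why a plain RNN receives almost no learning signal from distant time steps.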

Long Short-Term Memory (LSTM)

👉 LSTM networks are a type of RNN that uses special units in addition to standard units. LSTM units include a ‘memory cell’ that can maintain information for long periods of time, which lets them learn longer-term dependencies.
👉 LSTMs deal with the vanishing and exploding gradient problems by introducing gates, such as the input and forget gates, which allow better control over the gradient flow and enable better preservation of “long-range dependencies”.

๐๐จ๐ญ๐ž: The long range dependency in RNN is resolved by increasing the number of repeating layer in LSTM.

For an in-depth understanding, you can read the paper “Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling”.
