Solving the vanishing gradient problem with LSTMs

I understand that the main drawbacks of RNNs are vanishing and exploding gradients, but can someone please summarize how an LSTM tackles this problem? There is no video in the course giving more info about it.

Hey @preetishvij, your question needs a long explanation; in an LSTM the forget gate plays the major part in this.
For a detailed explanation you need to go into the mathematics. This link is helpful for the purpose, you can go through it: https://mc.ai/how-do-lstm-networks-solve-the-problem-of-vanishing-gradients/

Hope this resolves your doubt.
Please mark the doubt as resolved in the My Doubts section. :grinning:


I hope I've cleared your doubt. Please rate your experience here.
Your feedback is very important; it helps us improve our platform and hence provide you
the learning experience you deserve.

On the off chance you still have some questions or don't find the answers satisfactory, you may reopen
the doubt.
