RNN

Long Short-Term Memory (LSTM)

For a detailed explanation and summary, see the LSTM Summary. Motivation: in a plain RNN, inputs are "committed" into memory and later inputs "erase" earlier ones. The LSTM adds a memory "cell" for long-term storage; the cell is also read and written at the current step, but is less affected than the hidden state H. LSTM operations: forget gate, input gate, candidate content, output gate. Forget: remove information from the cell state C.
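
The gate equations make this excerpt concrete. Here is a minimal numpy sketch of a single LSTM step under the standard formulation; the function name, the stacked-weight layout, and the f/i/o/g gate ordering are illustrative assumptions, not taken from the post:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step. W stacks the four gate weight matrices,
    shape (4H, D + H); b has shape (4H,). The f/i/o/g layout is an
    arbitrary choice for this sketch."""
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0:H])          # forget gate: what to erase from the cell
    i = sigmoid(z[H:2 * H])      # input gate: what to commit into the cell
    o = sigmoid(z[2 * H:3 * H])  # output gate: what the hidden state exposes
    g = np.tanh(z[3 * H:4 * H])  # candidate content
    c = f * c_prev + i * g       # long-term cell, touched only through gates
    h = o * np.tanh(c)           # short-term hidden state, read from the cell
    return h, c
```

Note that c is modified only through elementwise gating, which is why the cell is "less affected" at each step than the hidden state h.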

2020-08-21

Recurrent Neural Networks

For a detailed explanation and summary, see the RNN Summary. Overview: RNNs are specifically designed for long-range dependencies. 💡 Main idea: connect the hidden states together within a layer. Simple RNNs (Elman networks): the output of the hidden layer is used as input for the next time step, via a copy mechanism.
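
As a sketch of that copy mechanism, here is one Elman-style step in numpy; all names are illustrative. The hidden state produced at step t-1 is "copied" back in as an extra input at step t, and is the only connection between time steps:

```python
import numpy as np

def elman_step(x, h_prev, Wxh, Whh, Why, bh, by):
    # The previous hidden state is copied back in as an extra input.
    h = np.tanh(Wxh @ x + Whh @ h_prev + bh)
    y = Why @ h + by
    return y, h

def run_sequence(xs, h0, params):
    # Thread the hidden state through the whole sequence.
    h, ys = h0, []
    for x in xs:
        y, h = elman_step(x, h, *params)
        ys.append(y)
    return ys, h
```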

2020-08-16

Backpropagation Through Time (BPTT)

Recurrent neural networks (RNNs) have attracted great attention for sequential tasks. However, compared to general feedforward neural networks, RNNs are harder to train because they contain feedback loops.
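
To see where the difficulty comes from, here is a minimal sketch of BPTT for a vanilla RNN: unroll the feedback loop over the sequence, cache the hidden states, then push the gradient back through every unrolled copy. The squared-error loss on the final hidden state and all names here are assumptions for illustration:

```python
import numpy as np

def bptt(xs, target, h0, Wxh, Whh, bh):
    # Forward: unroll the loop over the sequence, caching every hidden state.
    hs = {-1: h0}
    for t, x in enumerate(xs):
        hs[t] = np.tanh(Wxh @ x + Whh @ hs[t - 1] + bh)
    T = len(xs) - 1
    loss = 0.5 * np.sum((hs[T] - target) ** 2)  # toy loss on the last step

    # Backward: the same weights appear at every step, so their gradients
    # accumulate across the whole unrolled sequence.
    dWxh, dWhh, dbh = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(bh)
    dh = hs[T] - target                   # dL/dh at the final step
    for t in reversed(range(len(xs))):
        dz = (1 - hs[t] ** 2) * dh        # back through tanh
        dWxh += np.outer(dz, xs[t])
        dWhh += np.outer(dz, hs[t - 1])
        dbh += dz
        dh = Whh.T @ dz                   # gradient flows through the feedback loop
    return loss, dWxh, dWhh, dbh
```

The repeated multiplication by Whh.T in the last line of the loop is exactly what makes RNNs harder to train than feedforward networks.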

2020-08-13

RNN Implementation

View in nbviewer: char-RNN

2020-08-13

Resource

RNN tutorials: Illustrated Guide to Recurrent Neural Networks 🔥👍, Video Tutorial. Implementation: min-char-rnn. Application of RNNs: The Unreasonable Effectiveness of Recurrent Neural Networks. LSTM tutorials: Understanding LSTM Networks 🔥👍, Illustrated Guide to LSTM's and GRU's: A step by step explanation 🔥👍, Video Tutorials: 如何从RNN起步,一步一步通俗理解LSTM (Starting from RNNs: understanding LSTM step by step) and 通俗有趣地解释RNN和LSTM (An accessible, fun explanation of RNNs and LSTMs) 🔥👍. Implementation

2020-08-03

LSTM Summary

Problem of the vanilla RNN: short-term memory. If a sequence is long enough, RNNs have a hard time carrying information from earlier time steps to later ones. So if you are processing a paragraph of text to make predictions, an RNN may leave out important information from the beginning.
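
This short-term memory shows up directly in the gradients: every step back through the unrolled network multiplies the gradient by the recurrent Jacobian, so it can shrink geometrically. A small numeric sketch (the 0.01 init scale follows min-char-rnn; forward and backward are collapsed into one loop purely for brevity):

```python
import numpy as np

rng = np.random.default_rng(0)
H = 50
Whh = rng.normal(size=(H, H)) * 0.01  # small recurrent weights, min-char-rnn style
h = rng.normal(size=H)
grad = np.ones(H)                     # stand-in for dL/dh at the last time step

for t in range(1, 41):
    h = np.tanh(Whh @ h)
    grad = Whh.T @ ((1 - h ** 2) * grad)  # one step back through tanh and Whh
    if t % 10 == 0:
        print(f"{t:2d} steps back: |grad| = {np.linalg.norm(grad):.3e}")
# The norm collapses toward zero, so inputs from the beginning of the
# sequence barely influence the loss.
```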

2020-08-03

RNN Summary

Intuition: humans don't start their thinking from scratch every second. As you read this article, you understand each word based on your understanding of previous words. You don't throw everything away and start thinking from scratch again.

2020-08-03
