Notes from deeplearning.ai's sequence models course
- RNNs have "memory"
- Can read inputs one at a time and remember context through hidden layer activations
- Uni-directional RNN: takes info only from the past to process later inputs
- Bi-directional RNN: takes info from both past and future at each time step
- Basic RNNs suffer from vanishing gradients, so they struggle to carry context across long sequences
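The "memory" idea above can be sketched as a minimal uni-directional RNN forward pass in NumPy. This is an illustrative sketch, not the course's code; the function and parameter names (`rnn_forward`, `W_aa`, `W_ax`, `b_a`) are assumptions, though the weight naming follows the course's convention of `W_aa` (hidden-to-hidden) and `W_ax` (input-to-hidden).

```python
import numpy as np

def rnn_forward(x_seq, W_aa, W_ax, b_a, a0):
    """Illustrative sketch (not course code): run a vanilla RNN over a sequence.

    x_seq: (T, n_x) inputs, read one time step at a time
    a0:    (n_a,)   initial hidden state
    Returns (T, n_a) hidden activations -- the "memory" carried across steps.
    """
    a = a0
    states = []
    for x_t in x_seq:
        # Hidden state mixes previous context (W_aa @ a) with the new input
        a = np.tanh(W_aa @ a + W_ax @ x_t + b_a)
        states.append(a)
    return np.stack(states)

# Tiny usage example with random weights (hypothetical sizes)
rng = np.random.default_rng(0)
n_a, n_x, T = 4, 3, 5
out = rnn_forward(rng.normal(size=(T, n_x)),
                  rng.normal(size=(n_a, n_a)) * 0.1,
                  rng.normal(size=(n_a, n_x)) * 0.1,
                  np.zeros(n_a),
                  np.zeros(n_a))
print(out.shape)  # (5, 4)
```

Because gradients flow backward through that same `tanh(W_aa @ a + ...)` recurrence at every step, repeated multiplication by the hidden-to-hidden Jacobian is what shrinks (or blows up) gradients over long sequences.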