Stanford CS224N (Natural Language Processing)
These assignments involved writing full or partial Python functions and classes with PyTorch. Topics covered included word and character embeddings, GloVe, Word2Vec, machine translation with Long Short-Term Memory (LSTM) networks, and attention mechanisms.
- A1 - Exploring properties of word embeddings (e.g., analogies)
- A2 - Word2Vec implementation for building word embeddings
- A3 - Neural network-based dependency parser
- A4 - LSTM-based machine translation model with attention (English -> Spanish)
- A5 - The A4 model extended with character-level convolutional subword modeling to handle out-of-vocabulary words
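As a flavor of the A1-style embedding exploration, the classic analogy task ("man is to woman as king is to ?") can be solved by vector arithmetic plus cosine similarity. The sketch below uses tiny hand-made toy vectors purely for illustration; the actual assignment uses pretrained GloVe embeddings loaded via gensim.

```python
import numpy as np

# Toy 4-dimensional word vectors, chosen by hand for illustration only;
# the real assignment uses pretrained GloVe embeddings.
embeddings = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.9, 0.1, 0.8, 0.2]),
    "man":   np.array([0.1, 0.9, 0.1, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9, 0.1]),
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def analogy(a, b, c):
    """Return the word d (excluding a, b, c) whose vector is most
    similar to vec(c) - vec(a) + vec(b), i.e. a : b :: c : d."""
    target = embeddings[c] - embeddings[a] + embeddings[b]
    candidates = {w: v for w, v in embeddings.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("man", "woman", "king"))  # → queen
```

With real GloVe vectors the same arithmetic recovers many such analogies, which is what makes the A1 exploration interesting (it also surfaces biases encoded in the embeddings).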