Implementing Recurrent Neural Network from Scratch
Keras implementations of three language models: a character-level RNN, a word-level RNN, and a Sentence VAE (Bowman, Vilnis et al., 2016).
RNN-based language models in PyTorch
s-atmech is an independent, open-source deep learning Python library that implements the attention mechanism as an RNN (Recurrent Neural Network) layer in an encoder-decoder system. (Supports both Luong and Bahdanau attention.)
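For orientation, here is a minimal NumPy sketch (not the s-atmech API; the dimensions, weight matrices, and variable names are made-up assumptions for illustration) contrasting the Luong (multiplicative) and Bahdanau (additive) attention scores the description refers to:

```python
import numpy as np

d = 8                                        # assumed hidden size
decoder_state = np.random.randn(d)           # current decoder state s_t
encoder_states = np.random.randn(5, d)       # encoder states h_1..h_5

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Luong (multiplicative): score(s_t, h_i) = s_t . h_i
luong_weights = softmax(encoder_states @ decoder_state)

# Bahdanau (additive): score(s_t, h_i) = v . tanh(W_s s_t + W_h h_i)
W_s, W_h, v = np.random.randn(d, d), np.random.randn(d, d), np.random.randn(d)
bahdanau_weights = softmax(np.tanh(decoder_state @ W_s.T + encoder_states @ W_h.T) @ v)

# Either weight vector mixes the encoder states into a single context vector
# that the decoder consumes at this time step.
context = luong_weights @ encoder_states
```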
Char RNN language model based on TensorFlow
BlackOut and Adaptive Softmax for language models in Chainer
Implementation of Music Generation in PyTorch
An attempt at implementing "Memory Architectures in Recurrent Neural Network Language Models" as part of the ICLR 2018 Reproducibility Challenge
Telugu OCR using RNN
TV Script Generation with an RNN (Udacity project)
Code and scripts for training, testing and sampling auto-regressive recurrent language models on PyTorch with RNN, GRU and LSTM layers
RNN-LSTM model that classifies movie reviews
Harnessing NLP for Historical Text Analysis: A Case Study on the Indus Valley Civilization
Classifying whether a disaster tweet is real or not using an LSTM and GloVe word embeddings
The main task of a character-level language model is to predict the next character given all previous characters in a sequence, i.e. it generates text character by character.
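As an illustration of that task, here is a minimal PyTorch sketch (toy corpus and hyperparameters are assumptions for illustration, not taken from any repository listed here) that trains a small GRU to predict the next character and then samples text one character at a time:

```python
import torch
import torch.nn as nn

text = "hello world, hello rnn"
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
itos = {i: c for c, i in stoi.items()}

class CharRNN(nn.Module):
    def __init__(self, vocab_size, hidden_size=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.rnn = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, h=None):
        out, h = self.rnn(self.embed(x), h)
        return self.head(out), h          # logits over the next character

model = CharRNN(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

ids = torch.tensor([stoi[c] for c in text]).unsqueeze(0)   # (1, T)
x, y = ids[:, :-1], ids[:, 1:]                             # target is input shifted by one

for step in range(200):
    logits, _ = model(x)
    loss = loss_fn(logits.reshape(-1, len(chars)), y.reshape(-1))
    opt.zero_grad(); loss.backward(); opt.step()

# Sampling: feed each generated character back in, one step at a time.
h, idx, out = None, ids[:, :1], []
for _ in range(40):
    logits, h = model(idx, h)
    idx = torch.multinomial(torch.softmax(logits[:, -1], dim=-1), 1)
    out.append(itos[idx.item()])
print("".join(out))
```

The same training loop carries over to word-level models by swapping the character vocabulary for a word vocabulary.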
A lightweight, deep learning library written in pure Python
Imageboard bot with recurrent neural network (RNN, GRU)