This repository has been archived by the owner on Aug 3, 2022. It is now read-only.


# Chinese Character-Level Language Model

Recurrent neural networks (LSTM, GRU, RWA) for character-level language modeling in TensorFlow. The task is to predict the next character given the preceding characters in a sentence. NCE loss is used to speed up multi-class classification when the vocabulary is large. The dataset was scraped from the Hong Kong Apple Daily website.
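The loss code itself is not shown here, so as a rough illustration, here is a minimal NumPy sketch of what NCE computes for one prediction step (all names such as `nce_loss` and `w_pos` are illustrative, not from the repo): instead of a softmax over the full vocabulary, the true next character is scored against a handful of sampled "noise" characters with a binary logistic loss, which is roughly what `tf.nn.nce_loss` does per example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def nce_loss(hidden, w_pos, b_pos, w_neg, b_neg):
    # hidden: (d,) RNN state at this time step
    # w_pos:  (d,) output embedding of the true next character
    # w_neg:  (k, d) output embeddings of k sampled noise characters
    pos = hidden @ w_pos + b_pos        # scalar logit for the true char
    neg = w_neg @ hidden + b_neg        # (k,) logits for the noise chars
    # binary logistic loss: true char labeled 1, noise chars labeled 0
    return -np.log(sigmoid(pos)) - np.log(sigmoid(-neg)).sum()

rng = np.random.default_rng(0)
d, k = 8, 5                             # hidden size, negatives per step
loss = nce_loss(rng.standard_normal(d),
                rng.standard_normal(d), 0.0,
                rng.standard_normal((k, d)), np.zeros(k))
print(loss)
```

The cost of one step is O(k) in the number of sampled negatives rather than O(vocab), which is the whole point when the character vocabulary is in the thousands.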

## Results

*(results image)*

## Similarity

*(similarity image)*

## Requirements

- tensorflow 1.1.0