Neural Question Generation

This is not an official implementation of the paper Paragraph-level Neural Question Generation with Maxout Pointer and Gated Self-attention Networks. I implemented it in PyTorch to reproduce results similar to those reported in the paper. You can find the checkpoint of the pretrained model here.
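
For orientation, the core encoder component named in the title is gated self-attention over the passage representation. The sketch below shows one way that mechanism can be written in PyTorch; it follows the paper's formulation but is not the module from this repository, and the layer and argument names are illustrative.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedSelfAttention(nn.Module):
    """Gated self-attention over encoder states (a sketch following the paper,
    not the repository's exact module)."""
    def __init__(self, hidden_size):
        super().__init__()
        self.attn = nn.Linear(hidden_size, hidden_size, bias=False)
        self.fuse = nn.Linear(2 * hidden_size, hidden_size, bias=False)
        self.gate = nn.Linear(2 * hidden_size, hidden_size, bias=False)

    def forward(self, u, mask=None):
        # u: [batch, seq_len, hidden]; mask: [batch, seq_len], 1 for real tokens
        scores = torch.bmm(self.attn(u), u.transpose(1, 2))   # [B, T, T] self-matching scores
        if mask is not None:
            scores = scores.masked_fill(mask.unsqueeze(1) == 0, -1e9)
        a = F.softmax(scores, dim=-1)                          # attention weights
        s = torch.bmm(a, u)                                    # self-matched context
        fused = torch.cat([u, s], dim=-1)
        f = torch.tanh(self.fuse(fused))                       # candidate representation
        g = torch.sigmoid(self.gate(fused))                    # update gate
        return g * f + (1 - g) * u                             # gated combination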

Dependencies

This code is written in Python. Dependencies include

Download data and Preprocess

mkdir squad
wget http://nlp.stanford.edu/data/glove.840B.300d.zip -O ./data/glove.840B.300d.zip
unzip ./data/glove.840B.300d.zip -d ./data   # extract the embeddings next to the zip
wget https://rajpurkar.github.io/SQuAD-explorer/dataset/train-v1.1.json -O ./squad/train-v1.1.json
wget https://rajpurkar.github.io/SQuAD-explorer/dataset/dev-v1.1.json -O ./squad/dev-v1.1.json
cd data
python process_data.py
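
data/process_data.py performs the actual preprocessing. As a rough illustration of the raw input it works from, the snippet below extracts (context, question, answer, answer_start) tuples from a downloaded SQuAD v1.1 file; it is a sketch, not the repository's script.

import json

def load_squad_triples(path):
    """Read a SQuAD v1.1 JSON file and return (context, question, answer, answer_start) tuples."""
    with open(path, "r", encoding="utf-8") as f:
        dataset = json.load(f)["data"]
    examples = []
    for article in dataset:
        for paragraph in article["paragraphs"]:
            context = paragraph["context"]
            for qa in paragraph["qas"]:
                answer = qa["answers"][0]  # take the first gold answer (dev questions can have several)
                examples.append((context, qa["question"], answer["text"], answer["answer_start"]))
    return examples

examples = load_squad_triples("./squad/train-v1.1.json")
print(len(examples), examples[0][1])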

Configuration

You may need to adjust the configuration in config.py.
To train, set train = True and set the GPU device in config.py.
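
For example, the relevant entries look roughly like the following. Only the train flag and the idea of a GPU device setting come from this README; the identifier device and its value are assumptions, not necessarily the names used in config.py.

# Illustrative excerpt of config.py settings (a sketch; names other than `train` are assumed)
train = True          # True to train a model, False to run decoding/evaluation
device = "cuda:0"     # which GPU to use; set to "cpu" to run without a GPU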

Evaluation

Evaluation uses the scripts in the qgevalcap directory, which require Python 2:

cd qgevalcap
python2 eval.py --out_file prediction_file --src_file src_file --tgt_file target_file

Results

BLEU_1   BLEU_2   BLEU_3   BLEU_4
45.22    29.94    22.01    16.76
