This is the GitHub repo for our paper "P2: A Plan-and-Pretrain Approach for Knowledge Graph-to-Text Generation" by Qipeng Guo, Zhijing Jin, Ning Dai, Xipeng Qiu, Xiangyang Xue, David Wipf, and Zheng Zhang.
Our model ranked #1 in the English track of the WebNLG 2020 Challenge at the INLG 2020 Workshop.
Our P2 model consists of two steps:
- Planner based on relational graph convolutional networks (Zhao et al., 2020)
- Pretrained Seq2Seq model: T5 (Raffel et al., 2020); a minimal generation sketch follows this list
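To illustrate the second step, here is a minimal sketch of plan-conditioned generation with T5 via the Hugging Face `transformers` library. The linearized plan format and the `<H>`/`<R>`/`<T>` delimiter tokens below are hypothetical illustrations, not the exact format used by `run.sh`:

```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-large")
model = T5ForConditionalGeneration.from_pretrained("t5-large")

# A plan is an ordered sequence of triples produced by the planner,
# linearized into a single input string (delimiters are hypothetical).
plan = (
    "<H> Alan_Bean <R> occupation <T> Test_pilot "
    "<H> Alan_Bean <R> birthPlace <T> Wheeler,_Texas"
)

# Encode the linearized plan and decode the generated description.
inputs = tokenizer(plan, return_tensors="pt")
outputs = model.generate(**inputs, max_length=128, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```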
Run `run.sh` for training. `fix_nonenglish.py` is a post-processing script that maps characters back to their original non-English forms.
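The idea behind such post-processing can be sketched as follows, assuming non-English characters were replaced with placeholder tokens before training. The placeholder names and mapping below are hypothetical; see `fix_nonenglish.py` for the actual mapping:

```python
# Hypothetical placeholder-to-character mapping (illustrative only).
CHAR_MAP = {"<e-acute>": "é", "<n-tilde>": "ñ"}

def restore_non_english(text: str) -> str:
    """Map placeholder tokens back to the original non-English characters."""
    for placeholder, char in CHAR_MAP.items():
        text = text.replace(placeholder, char)
    return text
```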
Our model's output on the WebNLG 2020 test set is available in `output.txt`.
If you have any questions, please feel free to email the first author, Qipeng Guo, at qpguo16@fudan.edu.cn.
If you find our work useful, please cite:

@article{guo2020p2,
  title={P2: A Plan-and-Pretrain Approach for Knowledge Graph-to-Text Generation},
  author={Guo, Qipeng and Jin, Zhijing and Dai, Ning and Qiu, Xipeng and Xue, Xiangyang and Wipf, David and Zhang, Zheng},
  year={2020}
}