
P2_WebNLG2020

This is the GitHub repo for our paper "P2: A Plan-and-Pretrain Approach for Knowledge Graph-to-Text Generation" by Qipeng Guo, Zhijing Jin, Ning Dai, Xipeng Qiu, Xiangyang Xue, David Wipf, and Zheng Zhang.

Our model achieves the #1 performance on the English track of the WebNLG 2020 Challenge at the INLG 2020 Workshop.

Model Introduction

Our P2 model consists of two steps:

1. Plan: a planner orders the input knowledge-graph triples into a linear sequence.
2. Pretrain: a pretrained sequence-to-sequence language model is fine-tuned to generate the description text from the planned triple sequence.
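For intuition only, below is a minimal sketch of the plan-then-generate idea, not the code in this repository; the `plan_triples`/`linearize` helpers, the `<S>/<P>/<O>` markers, and the choice of a T5 checkpoint are illustrative assumptions.

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

def plan_triples(triples):
    # Step 1 (Plan): decide an order for the input triples.
    # Placeholder: keeps the input order; the real planner learns this ordering.
    return list(triples)

def linearize(triples):
    # Flatten ordered (subject, predicate, object) triples into one input string.
    # The <S>/<P>/<O> markers are illustrative, not the repository's format.
    return " ".join(f"<S> {s} <P> {p} <O> {o}" for s, p, o in triples)

def generate(triples, model_name="t5-large"):
    # Step 2 (Pretrain): a pretrained seq2seq model verbalizes the planned triples.
    # An off-the-shelf checkpoint is shown here; the model used for the challenge
    # is fine-tuned on WebNLG data.
    tokenizer = T5Tokenizer.from_pretrained(model_name)
    model = T5ForConditionalGeneration.from_pretrained(model_name)
    inputs = tokenizer(linearize(plan_triples(triples)), return_tensors="pt")
    output_ids = model.generate(**inputs, max_length=128, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

print(generate([("Alan_Bean", "birthPlace", "Wheeler,_Texas")]))
```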

Codes

Run run.sh for training. The script fix_nonenglish.py is a post-processing step that maps characters in the model output back to their original non-English forms.
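As an illustration of this kind of character restoration (not the actual fix_nonenglish.py), the sketch below assumes a hypothetical placeholder scheme in which non-English characters were replaced by `__uXXXX__` codes during preprocessing.

```python
import re
import sys

# Hypothetical placeholder format: "__uXXXX__" encodes a Unicode code point in hex.
PLACEHOLDER = re.compile(r"__u([0-9a-fA-F]{4})__")

def restore_nonenglish(line: str) -> str:
    # Map each placeholder back to the original Unicode character.
    return PLACEHOLDER.sub(lambda m: chr(int(m.group(1), 16)), line)

if __name__ == "__main__":
    for line in sys.stdin:
        sys.stdout.write(restore_nonenglish(line))
```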

Our model output on the WebNLG 2020 test set is available at output.txt.

If you have any questions, please feel free to email the first author, Qipeng Guo, at [email protected].

Citation

@article{guo2020p2,
  title={P2: A Plan-and-Pretrain Approach for Knowledge Graph-to-Text Generation},
  author={Guo, Qipeng and Jin, Zhijing and Dai, Ning and Qiu, Xipeng and Xue, Xiangyang and Wipf, David and Zhang, Zheng},
  year={2020}
}
