Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/74379
Title: Joint copying and restricted generation for paraphrase
Authors: Cao, Z 
Luo, C
Li, W 
Li, S
Issue Date: 2017
Publisher: AAAI Press
Source: 31st AAAI Conference on Artificial Intelligence, AAAI 2017, 2017, p. 3152-3158
Abstract: Many natural language generation tasks, such as abstractive summarization and text simplification, are paraphrase-oriented. In these tasks, copying and rewriting are the two main writing modes. Most previous sequence-to-sequence (Seq2Seq) models use a single decoder and neglect this fact. In this paper, we develop a novel Seq2Seq model that fuses a copying decoder and a restricted generative decoder. The copying decoder finds the position to be copied based on a typical attention model. The generative decoder produces words limited to a source-specific vocabulary. To combine the two decoders and determine the final output, we develop a predictor that predicts the mode of copying or rewriting. This predictor can be guided by the actual writing mode in the training data. We conduct extensive experiments on two different paraphrase datasets. The results show that our model outperforms state-of-the-art approaches in terms of both informativeness and language quality.
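
The decoding step described in the abstract (an attention-based copy distribution over source positions, a generative distribution restricted to a source-specific vocabulary, and a learned predictor that mixes the two modes) can be sketched as follows. This is a minimal PyTorch-style illustration, not the authors' implementation; the class name, the dot-product attention, and the convention that the restricted vocabulary occupies the first ids of the full vocabulary are all assumptions made for the sketch.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CopyRewriteStep(nn.Module):
        # One decoding step fusing a copying decoder with a restricted
        # generative decoder, per the abstract. Illustrative sketch only.
        def __init__(self, hidden_size, restricted_vocab_size):
            super().__init__()
            # Restricted generator: scores only the source-specific vocabulary.
            self.generate = nn.Linear(hidden_size, restricted_vocab_size)
            # Mode predictor: probability of rewriting vs. copying; during
            # training it can be supervised with the actual writing mode.
            self.mode = nn.Linear(hidden_size, 1)

        def forward(self, state, enc_states, src_ids, full_vocab_size):
            # state: (batch, hidden); enc_states: (batch, src_len, hidden)
            # src_ids: (batch, src_len) full-vocabulary ids of source tokens.

            # Copying decoder: attention weights over source positions
            # serve directly as the copy distribution.
            scores = torch.bmm(enc_states, state.unsqueeze(2)).squeeze(2)
            copy_dist = F.softmax(scores, dim=1)          # (batch, src_len)

            # Restricted generative decoder over the limited vocabulary.
            gen_dist = F.softmax(self.generate(state), dim=1)

            # Mode predictor: p_rewrite in (0, 1).
            p_rewrite = torch.sigmoid(self.mode(state))   # (batch, 1)

            # Mix both modes in the full-vocabulary space. Assumes the
            # restricted vocabulary maps to ids 0..restricted_vocab_size-1.
            out = state.new_zeros(state.size(0), full_vocab_size)
            out[:, :gen_dist.size(1)] = p_rewrite * gen_dist
            out.scatter_add_(1, src_ids, (1.0 - p_rewrite) * copy_dist)
            return out                                    # (batch, full_vocab)

At each step the highest-probability token under the mixed distribution is emitted; as the abstract notes, the mode predictor can additionally be guided by the observed copy/rewrite labels in the training data.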
URI: http://hdl.handle.net/10397/74379
Appears in Collections: Conference Paper
