TY - GEN
T1 - Graph-based Neural Multi-Document Summarization
AU - Yasunaga, Michihiro
AU - Zhang, Rui
AU - Meelu, Kshitijh
AU - Pareek, Ayush
AU - Srinivasan, Krishnan
AU - Radev, Dragomir
N1 - Funding Information:
We would like to thank the members of the Sapphire Project (University of Michigan and IBM), as well as all the anonymous reviewers for their helpful suggestions on this work. This material is based in part upon work supported by IBM under contract 4915012629. Any opinions, findings, conclusions, or recommendations expressed herein are those of the authors and do not necessarily reflect the views of IBM.
PY - 2017
Y1 - 2017
AB - We propose a neural multi-document summarization (MDS) system that incorporates sentence relation graphs. We employ a Graph Convolutional Network (GCN) on the relation graphs, with sentence embeddings obtained from Recurrent Neural Networks as input node features. Through multiple layer-wise propagation, the GCN generates high-level hidden sentence features for salience estimation. We then use a greedy heuristic to extract salient sentences while avoiding redundancy. In our experiments on DUC 2004, we consider three types of sentence relation graphs and demonstrate the advantage of combining sentence relations in graphs with the representation power of deep neural networks. Our model improves upon traditional graph-based extractive approaches and the vanilla GRU sequence model with no graph, and it achieves competitive results against other state-of-the-art multi-document summarization systems.
UR - http://www.scopus.com/inward/record.url?scp=85055606117&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85055606117&partnerID=8YFLogxK
U2 - 10.18653/v1/K17-1045
DO - 10.18653/v1/K17-1045
M3 - Conference contribution
AN - SCOPUS:85055606117
T3 - CoNLL 2017 - 21st Conference on Computational Natural Language Learning, Proceedings
SP - 452
EP - 462
BT - CoNLL 2017 - 21st Conference on Computational Natural Language Learning, Proceedings
PB - Association for Computational Linguistics (ACL)
T2 - 21st Conference on Computational Natural Language Learning, CoNLL 2017
Y2 - 3 August 2017 through 4 August 2017
ER -