TY - JOUR
AU - Beck, Daniel
AU - Haffari, Gholamreza
AU - Cohn, Trevor
AB - …can be framed as the process of transducing a graph structure into a sequence. For instance, language generation may involve realising a semantic graph into a surface form, and syntactic machine translation involves transforming a tree-annotated source sentence to its translation. Previous work in this setting relies on grammar-based approaches such as tree transducers (Flanigan et al., 2016) and hyperedge replacement grammars … Abstract Meaning Representations (AMRs) and Neural Machine Translation (NMT) with source dependency information. Our approach outperforms strong s2s baselines in both tasks without relying on standard RNN encoders, in contrast with previous work. In particular, for NMT we show that we avoid the need for RNNs by adding sequential edges between contiguous words in the dependency tree. This illustrates the generality of our …
TI - Graph-to-Sequence Learning using Gated Graph Neural Networks
JF - Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
SP - 273
EP - 283
CY - Melbourne, Australia
PB - Association for Computational Linguistics
DO - 10.18653/v1/p18-1026
DA - 2018-01-01
UR - https://www.deepdyve.com/lp/unpaywall/graph-to-sequence-learning-using-gated-graph-neural-networks-AeGmoWq5E8
DP - DeepDyve
N1 - Figure 1 (caption recovered from extracted text): Left: the AMR graph representing the sentence "The boy wants the girl to believe him." Right: the proposed architecture using the same AMR graph as input and the surface form as output.
ER -