TY  - JOUR
AU  - Liu, Pengfei
AU  - Qiu, Xipeng
AU  - Chen, Jifan
AU  - Huang, Xuanjing
AD  - Shanghai Key Laboratory of Intelligent Information Processing, School of Computer Science, Fudan University, 825 Zhangheng Road, Shanghai, China
TI  - Deep Fusion LSTMs for Text Semantic Matching
AB  - Recently, there is rising interest in modelling the interactions of text pair with deep neural networks. In this paper, we propose a model of deep fusion LSTMs (DF-LSTMs) to model the strong interaction of text pair in a recursive matching way. Specifically, DF-LSTMs consist of two interdependent LSTMs, each of which models a sequence under the influence of another. We also use external memory to increase the capacity of LSTMs, thereby possibly capturing [...]
JF  - Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
DO  - 10.18653/v1/p16-1098
DA  - 2016-01-01
UR  - https://www.deepdyve.com/lp/unpaywall/deep-fusion-lstms-for-text-semantic-matching-Fla7ZGZaDI
DP  - DeepDyve
ER  -