TY - JOUR
AU - Tutek, Martin
AU - Snajder, Jan
TI - Iterative Recursive Attention Model for Interpretable Sequence Classification
AB - Natural language processing has greatly benefited from the introduction of the attention mechanism. However, standard attention models are of limited interpretability for tasks that involve a series of inference steps. We describe an iterative recursive attention model, which constructs incremental representations of input data through reusing results of previously computed queries. We train our model on sentiment classification datasets and demonstrate its capacity to identify and com-
N1 - Affiliation: Text Analysis and Knowledge Engineering Lab, Faculty of Electrical Engineering and Computing, University of Zagreb, Unska 3, 10000 Zagreb, Croatia ({martin.tutek,jan.snajder}@fer.hr). Introduction excerpt: "... input sentence whose polarity matches that of the final decision. However, unfolding the inference process of a model into a series of interpretable steps would make the model more interpretable and allow one to identify its shortcomings. As a step toward that goal, we propose an extension of the iterative attention mechanism (Sordoni et al., 2016), which we call the iterative recursive attention model (IRAM), where the result of an attentive query is nonlinearly transformed and then added to the set of vector representations of the input sequence. The nonlinear transformation, ..."
JF - Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP
DO - 10.18653/v1/w18-5427
DA - 2018-01-01
UR - https://www.deepdyve.com/lp/unpaywall/iterative-recursive-attention-model-for-interpretable-sequence-DpDMNMOnxp
DP - DeepDyve
ER -