TY - CONF
AU - Melamud, Oren
AU - Goldberger, Jacob
AU - Dagan, Ido
TI - context2vec: Learning Generic Context Embedding with Bidirectional LSTM
AB - Context representations are central to various NLP tasks, such as word sense disambiguation, named entity recognition, co-reference resolution, and many more. In this work we present a neural model for efficiently learning a generic context embedding function from large corpora, using bidirectional LSTM. With a very simple application of our context representations, we manage to surpass or nearly reach state-of-the-art results on sentence
JF - Proceedings of The 20th SIGNLL Conference on Computational Natural Language Learning
DO - 10.18653/v1/k16-1006
DA - 2016-01-01
UR - https://www.deepdyve.com/lp/unpaywall/context2vec-learning-generic-context-embedding-with-bidirectional-lstm-8Mu1ow02Na
DP - DeepDyve
ER -