TY - JOUR
AU - Xiao, Yijun
AU - Wang, William Yang
AD - University of California, Santa Barbara ({yijunxiao,william}@cs.ucsb.edu)
TI - On Hallucination and Predictive Uncertainty in Conditional Language Generation
AB - Despite improvements in performances on different natural language generation tasks, deep neural models are prone to hallucinating facts that are incorrect or nonexistent. Different hypotheses are proposed and examined separately for different tasks, but no systematic explanations are available across these tasks. In this study, we draw connections between hallucinations and predictive uncertainty in conditional language generation. We investigate their relationship in both image captioning and data-to-text generation and propose a simple extension to beam search to reduce hallucination.
N1 - First-page excerpt: ... translation (NMT) (Müller et al., 2019). These studies tackle hallucinations within a specific task and give possible explanations of why hallucinations occur. For example, Rohrbach et al. (2018) attributes object hallucination in image captioning to visual misclassification and over-reliance on language priors; Nie et al. (2019) believes hallucination in neural surface realization comes from the misalignment between meaning representations and their corresponding references in the dataset; Müller et al. (2019) claims that hallucinations in NMT are mainly due to domain shift. We believe that there is a common theme ...
JF - Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
DO - 10.18653/v1/2021.eacl-main.236
DA - 2021-01-01
UR - https://www.deepdyve.com/lp/unpaywall/on-hallucination-and-predictive-uncertainty-in-conditional-language-876fKl1yAO
DP - DeepDyve
ER -