Enhancing multiple-choice question answering with causal knowledge
Dalal, Dhairya ; Arcan, Mihael ; Buitelaar, Paul
Publication Date
2021-06-10
Type
Workshop paper
Citation
Dalal, Dhairya, Arcan, Mihael, & Buitelaar, Paul. (2021). Enhancing multiple-choice question answering with causal knowledge. In Proceedings of Deep Learning Inside Out (DeeLIO): The 2nd Workshop on Knowledge Extraction and Integration for Deep Learning Architectures, Online, 10 June 2021. doi:10.18653/v1/2021.deelio-1.8
Abstract
The task of causal question answering aims to reason about causes and effects given a real or hypothetical premise. Recent approaches have converged on using transformer-based language models to solve question answering tasks. However, pretrained language models often struggle when the required knowledge is not present in the premise or when additional context is needed to answer the question. To the best of our knowledge, no prior work has explored the efficacy of augmenting pretrained language models with external causal knowledge for multiple-choice causal question answering. In this paper, we present novel strategies for the representation of causal knowledge. Our empirical results demonstrate the efficacy of augmenting pretrained models with external causal knowledge. We show improved performance on the COPA (Choice of Plausible Alternatives) and WIQA (What If Reasoning Over Procedural Text) benchmark tasks. On the WIQA benchmark, our approach is competitive with the state-of-the-art and exceeds it within the evaluation subcategories of In-Paragraph and Out-of-Paragraph perturbations.
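The abstract describes augmenting a pretrained language model with external causal knowledge for multiple-choice causal question answering. The sketch below is a minimal illustration of what such augmentation could look like with the Hugging Face transformers multiple-choice API; the model name, the COPA-style example, and the retrieved causal statement are assumptions made for illustration and do not reproduce the paper's actual pipeline.

```python
# Illustrative sketch (assumptions, not the paper's method): score multiple-choice
# answers with a pretrained transformer after prepending a retrieved causal fact
# to the premise. A real system would fine-tune the multiple-choice head first.
import torch
from transformers import AutoTokenizer, AutoModelForMultipleChoice

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForMultipleChoice.from_pretrained("roberta-base")

premise = "The man fell unconscious."
choices = [
    "The assailant struck the man on the head.",
    "The assailant took the man's wallet.",
]
# Hypothetical causal knowledge retrieved from an external resource.
causal_knowledge = "A blow to the head can cause loss of consciousness."

# Knowledge augmentation: prepend the retrieved statement to the premise.
augmented_premise = f"{causal_knowledge} {premise}"

# Pair the augmented premise with each candidate answer.
encoding = tokenizer(
    [augmented_premise] * len(choices), choices,
    return_tensors="pt", padding=True,
)
# The multiple-choice model expects shape (batch, num_choices, seq_len).
inputs = {k: v.unsqueeze(0) for k, v in encoding.items()}

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_choices)

print("Predicted answer:", choices[logits.argmax(dim=-1).item()])
```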
Publisher
Association for Computational Linguistics
Publisher DOI
10.18653/v1/2021.deelio-1.8
Rights
Attribution 4.0 International (CC BY 4.0)