In this paper, we focus on multiple-choice reading comprehension, which aims to answer a question given a passage and multiple candidate options. We present a hierarchical attention flow that adequately leverages candidate options to model the interactions among passages, questions, and candidate options. We observe that leveraging candidate options to boost evidence gathering from the passages plays a vital role in this task, which has been overlooked in previous work. In addition, we explicitly model the option correlations with an attention mechanism to obtain better option representations, which are further fed into a bilinear layer to obtain the ranking score for each option. On a large-scale multiple-choice reading comprehension dataset (i.e. the RA...
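The final scoring step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each option has already been encoded into a fixed-size vector, uses a toy question-aware passage vector, and all names, dimensions, and the NumPy formulation are hypothetical. The bilinear form s_i = o_i^T W q produces one raw score per option, which a softmax turns into a distribution over the candidates.

```python
import numpy as np

def bilinear_option_scores(options: np.ndarray, query: np.ndarray,
                           W: np.ndarray) -> np.ndarray:
    """Rank candidate options with a bilinear layer.

    options: (num_options, d) matrix of option representations
    query:   (d,) question-aware passage representation
    W:       (d, d) learned bilinear weight matrix (random here)
    Returns a softmax distribution over the options.
    """
    raw = options @ W @ query          # one raw score per option
    exp = np.exp(raw - raw.max())      # numerically stable softmax
    return exp / exp.sum()

# Toy example: 4 candidate options with 3-dimensional representations.
rng = np.random.default_rng(0)
options = rng.standard_normal((4, 3))
query = rng.standard_normal(3)
W = rng.standard_normal((3, 3))
probs = bilinear_option_scores(options, query, W)
```

In training, `W` would be learned jointly with the encoders; the predicted answer is simply the argmax of `probs`.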
In the natural language processing research field, many efforts have been devoted into reading compr...
We study the role of attention and working memory in choices where options are presented sequentiall...
Several prominent models of reading posit that attention is distributed to support the parallel lexi...
Multiple-choice machine reading comprehension is an important and challenging task where the machine...
Machine Reading Comprehension (MRC) for question answering (QA), which aims to answer a question giv...
Multi-choice reading comprehension is a challenging task to select an answer from a set of candidate...
Machine Reading Comprehension (MRC) with multiple-choice questions requires the machine to read given...
Comprehending unstructured text is a challenging task for machines because it involves understanding...
Machine Reading Comprehension (MRC) refers to the task that aims to read the context through the mac...
Many NLP tasks can be regarded as a selection problem from a set of options, such as classification ...
Interpretable multi-hop reading comprehension (RC) over multiple documents is a challenging problem ...
In reading comprehension, generating sentence-level distractors is a significant task, which require...
Multi-hop machine reading comprehension is a challenging task in natural language processing, which ...
We investigate the task of distractor generation for multiple choice reading comprehension questions...
We propose a machine reading comprehension model based on the compare-aggregate framework with two-s...