The opaqueness of multi-hop fact verification models imposes an imperative requirement for explainability. One feasible way is to extract rationales, a subset of inputs, whose removal causes prediction performance to drop dramatically. Though explainable, most rationale extraction methods for multi-hop fact verification explore the semantic information within each piece of evidence individually, while ignoring the topological information interaction among different pieces of evidence. Intuitively, a faithful rationale bears complementary information that enables the extraction of other rationales through the multi-hop reasoning process. To tackle these disadvantages, we cast explainable multi-hop fact verification as subgraph extracti...
Artificial intelligence can be more powerful than human intelligence. Many problems are perhaps cha...
We study the challenge of learning causal reasoning over procedural text to answer "What if..." ques...
Building compositional explanations requires models to combine two or more facts that, together, des...
Multi-hop knowledge graph (KG) reasoning has been widely studied in recent years to provide interpre...
Knowledge graph (KG) fact prediction aims to complete a KG by determining the truthfulness ...
Commonsense question answering aims to answer questions which require background knowledge that is n...
This paper studies the bias problem of multi-hop ques...
We propose the end-to-end multimodal fact-checking and explanation generation, where the input is a ...
Structured data is ubiquitous in Web applications, such as social networks in social media, citation n...
Given a knowledge graph and a fact (a triple statement), fact checking is to decide whether...
Knowledge Graph Embedding algorithms learn low-dimensional vector representa- tions for facts in a K...
Relational Graph Convolutional Networks (RGCNs) are commonly applied to Knowle...
Explainable question answering for complex questions often requires combining large numbers of facts...
Multi-hop machine reading comprehension is a challenging task in natural language processing, which ...
We tackle fact-checking using Knowledge Graphs (KGs) as a source of background knowledge. Our approa...