Generative commonsense reasoning, which aims to empower machines to generate sentences by reasoning over a set of concepts, is a critical bottleneck for text generation. Even state-of-the-art pre-trained language generation models struggle at this task and often produce implausible and anomalous sentences. One reason is that they rarely incorporate knowledge graphs, which can provide rich relational information among commonsense concepts. To promote the ability of commonsense reasoning for text generation, we propose KG-BART, a novel knowledge-graph-augmented pre-trained language generation model that encompasses the complex relations of concepts through the knowledge graph and produces more logical and...
Commonsense question answering aims to answer questions which require background knowledge that is n...
The common practice for training commonsense models has gone from human, to corpus, to machine: humans...
Conceptualization, or viewing entities and situations as instances of abstract concepts in mind and ...
Modern language models are strong at generating grammatically correct, natural language. However, ...
We present Knowledge Enhanced Multimodal BART (KM-BART), which is a Transformer-based sequence-to-se...
Commonsense reasoning is an important aspect of building robust AI systems and is receiving signific...
Intelligent systems are expected to make smart human-like decisions based on accumulated commonsense...
Generative commonsense reasoning requires machines to generate sentences describing an everyday scen...
Recent years have brought about a renewed interest in commonsense representation and reasoning in th...
Machine learning has a wide variety of applications in the field of natural language processing (NLP...
Automatic KB completion for commonsense knowledge graphs (e.g., ATOMIC and ConceptNet) poses unique ...
While commonsense knowledge acquisition and reasoning have traditionally been a core research topic i...
Knowledge graphs (KGs) have been widely considered in natural language generation (NLG) tasks. A KG can ...
Augmenting pre-trained language models with knowledge graphs (KGs) has achieved success on various c...
It remains an open question whether incorporating external knowledge benefits commonsense reasoning ...