Knowledge graphs (KGs) have been widely explored in natural language generation (NLG) tasks. A KG can help models generate controllable text and achieve better performance. However, most existing approaches still lack explainability and scalability in large-scale knowledge reasoning. In this work, we propose CogNLG, a novel framework for KG-to-text generation. CogNLG is built on the dual-process theory from cognitive science and consists of two systems: one acts as the analytic system for knowledge extraction, while the other acts as the perceptual system that generates text from the extracted knowledge. During text generation, CogNLG provides a visible and explainable reasoning path. Our framework shows excellent performance...
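As a rough illustration of the dual-process idea described above (a minimal sketch only; the function names, the 1-hop triple filter, and the template realizer are assumptions for illustration, not CogNLG's actual components), the analytic and perceptual systems can be separated like this:

```python
# Illustrative dual-process KG-to-text sketch (hypothetical, not the paper's method).
from typing import List, Tuple

Triple = Tuple[str, str, str]  # (head, relation, tail)


def analytic_system(kg: List[Triple], topic: str) -> List[Triple]:
    """'Analytic' step: select the triples relevant to a topic entity (here: a simple 1-hop filter)."""
    return [t for t in kg if topic in (t[0], t[2])]


def perceptual_system(triples: List[Triple]) -> str:
    """'Perceptual' step: realize the selected triples as text (here: naive template realization)."""
    clauses = [f"{h} {r.replace('_', ' ')} {t}" for h, r, t in triples]
    return ". ".join(clauses) + "." if clauses else ""


kg = [("Alan_Turing", "born_in", "London"),
      ("Alan_Turing", "field", "computer science"),
      ("London", "capital_of", "England")]

reasoning_path = analytic_system(kg, "Alan_Turing")  # inspectable intermediate step
print(reasoning_path)
print(perceptual_system(reasoning_path))
```

The point of the split is that the intermediate triple list doubles as the visible, explainable reasoning path the abstract refers to; any real system would replace both toy steps with learned components.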
Deep Learning advances have enabled more fluent and flexible text generation. However, while these n...
Incorporating factual knowledge into pre-trained language models (PLMs) such as BERT is an emerging t...
Knowledge graph completion aims to address the problem of extending a KG with missing triples. In th...
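To make "extending a KG with missing triples" concrete, here is a hedged toy sketch of one common baseline, a TransE-style translational score over embeddings (the 2-d vectors are hand-picked rather than learned, and this is not the cited work's actual method):

```python
# Toy TransE-style scoring for knowledge graph completion (illustrative only).
import numpy as np

emb = {  # hand-picked 2-d embeddings for entities and a relation
    "Paris": np.array([1.0, 0.0]),
    "France": np.array([1.0, 1.0]),
    "Berlin": np.array([0.0, 0.0]),
    "capital_of": np.array([0.0, 1.0]),
}


def score(head: str, relation: str, tail: str) -> float:
    """TransE intuition: head + relation should land near tail, so a smaller distance is better."""
    return float(np.linalg.norm(emb[head] + emb[relation] - emb[tail]))


# Rank candidate tails for the incomplete triple (Paris, capital_of, ?).
candidates = ["France", "Berlin"]
print(sorted(candidates, key=lambda t: score("Paris", "capital_of", t)))  # ['France', 'Berlin']
```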
The use of knowledge graphs (KGs) enhances the accuracy and comprehensiveness of the responses provi...
Constructing knowledge graphs (KGs) is essential for various natural language understanding tasks, s...
Modern language models are strong at generating grammatically correct, natural language. However, ...
In any system that uses structured knowledge graph (KG) data as its underlying knowledge representati...
Generative commonsense reasoning which aims to empower machines to generate sentences with the capac...
Knowledge Graphs (KGs) often have two characteristics: heterogeneous graph structure and text-rich e...
This work is funded by the Engineering and Physical Sciences Research Council (EPSRC), under a Natio...
Recently, neural language representation models pre-trained on large corpora can capture rich co-occu...