Knowledge-grounded dialogue systems are challenging to build due to the lack of training data and the heterogeneity of knowledge sources. Existing systems perform poorly on unseen topics because the training data covers only a limited set of topics. In addition, heterogeneous knowledge sources make it difficult for systems to generalize to other tasks, because knowledge in different representations requires different knowledge encoders. To address these challenges, we present PLUG, a language model that homogenizes different knowledge sources into a unified knowledge representation for knowledge-grounded dialogue generation tasks. PLUG is pre-trained on a dialogue generation task conditioned on a unified essential knowledge representation. It...
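The homogenization idea described above can be sketched as follows: heterogeneous knowledge (structured KB triples and unstructured text) is linearized into a single unified text string, which conditions the generation model alongside the dialogue context. This is a minimal illustrative sketch, not the PLUG implementation; the function names, separators, and input prefixes (`knowledge:`, `context:`) are assumptions.

```python
# Hypothetical sketch of knowledge homogenization for grounded dialogue:
# mixed knowledge items are flattened into one unified text representation
# that is prepended to the dialogue context before generation.
# All names and formats here are illustrative assumptions.

def linearize_triple(triple):
    """Flatten a (subject, relation, object) KB triple into plain text."""
    subj, rel, obj = triple
    return f"{subj} {rel} {obj}"

def linearize_knowledge(knowledge):
    """Homogenize mixed knowledge items into a single text string."""
    pieces = []
    for item in knowledge:
        if isinstance(item, tuple):   # structured knowledge: KB triple
            pieces.append(linearize_triple(item))
        else:                         # unstructured knowledge: text snippet
            pieces.append(item)
    return " | ".join(pieces)

def build_model_input(dialogue_history, knowledge):
    """Condition generation on the unified knowledge representation."""
    knowledge_text = linearize_knowledge(knowledge)
    context = " ".join(dialogue_history)
    return f"knowledge: {knowledge_text} context: {context}"

example = build_model_input(
    ["Who directed Inception?"],
    [("Inception", "directed by", "Christopher Nolan"),
     "Inception is a 2010 science-fiction film."],
)
```

Because every knowledge source is reduced to the same textual form, a single encoder suffices, which is what allows generalization across tasks with different underlying knowledge representations.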
Neural network models usually suffer from the challenge of incorporating commonsense knowledge into ...
The goal of building dialogue agents that can converse with humans naturally has been a long-standin...
Pre-trained language models (PrLMs) have achieved great success on a wide range of natural language ...
To alleviate the problem of structured databases' limited coverage, recent task-oriented dialogue sy...
Incorporating external knowledge into the response generation process is essential to building more ...
Knowledge-grounded dialogue systems based on pretrained language models (PLMs) are prone to generate re...
With the development of pre-trained language models, remarkable success has been witnessed in dialog...
Incorporating external knowledge sources effectively in conversations is a longstanding problem in o...
As the primary means of human communication, natural language bears the functionality to bridge the ...
Humans usually have conversations by making use of prior knowledge about a topic and background info...
Recent advances in large-scale pre-training provide large models with the potential to learn knowled...
The development of trustworthy conversational information-seeking systems relies on dialogue models ...
The pre-trained conversational models still fail to capture the implicit commonsense (CS) knowledge ...
Building a universal conversational agent has been a long-standing goal of the dialogue research com...
Grounding dialogue on external knowledge and interpreting linguistic patterns in dialogue history co...