Zero-shot slot filling has received considerable attention as a way to cope with the limited data available for a target domain. One of the important factors in zero-shot learning is to make the model learn generalized and reliable representations. To this end, we present mcBERT, which stands for momentum contrastive learning with BERT, to develop a robust zero-shot slot filling model. mcBERT uses BERT to initialize two encoders, a query encoder and a key encoder, and is trained by applying momentum contrastive learning. Our experimental results on the SNIPS benchmark show that mcBERT substantially outperforms previous models, setting a new state of the art. We also show that each component of mcBERT contributes to the performance improvement.
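To make the two-encoder setup concrete, below is a minimal MoCo-style sketch of momentum contrastive learning with BERT-initialized query and key encoders. It assumes the HuggingFace transformers library; the class name, the use of the [CLS] embedding, and hyperparameters such as the momentum coefficient, queue size, and temperature are illustrative assumptions, not values taken from the mcBERT paper.

# Minimal sketch of momentum contrastive learning with two BERT encoders.
# Names and hyperparameters are illustrative assumptions, not the paper's.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel

class MomentumContrastiveBERT(nn.Module):
    def __init__(self, model_name="bert-base-uncased", dim=768,
                 queue_size=4096, momentum=0.999, temperature=0.07):
        super().__init__()
        self.m = momentum
        self.t = temperature
        # Both encoders start from the same pre-trained BERT weights.
        self.query_encoder = BertModel.from_pretrained(model_name)
        self.key_encoder = BertModel.from_pretrained(model_name)
        # The key encoder is never updated by gradients.
        for p in self.key_encoder.parameters():
            p.requires_grad = False
        # Queue of normalized negative keys, plus a write pointer.
        self.register_buffer(
            "queue", F.normalize(torch.randn(dim, queue_size), dim=0))
        self.register_buffer("ptr", torch.zeros(1, dtype=torch.long))

    @torch.no_grad()
    def _momentum_update(self):
        # key_params <- m * key_params + (1 - m) * query_params
        for pq, pk in zip(self.query_encoder.parameters(),
                          self.key_encoder.parameters()):
            pk.data.mul_(self.m).add_(pq.data, alpha=1.0 - self.m)

    @torch.no_grad()
    def _enqueue(self, keys):
        bsz = keys.shape[0]
        ptr = int(self.ptr)
        # Assumes queue_size is a multiple of the batch size.
        self.queue[:, ptr:ptr + bsz] = keys.T
        self.ptr[0] = (ptr + bsz) % self.queue.shape[1]

    def forward(self, query_inputs, key_inputs):
        # [CLS] representations as sentence embeddings (an assumption here;
        # token-level slot representations would work analogously).
        q = self.query_encoder(**query_inputs).last_hidden_state[:, 0]
        q = F.normalize(q, dim=1)
        with torch.no_grad():
            self._momentum_update()
            k = self.key_encoder(**key_inputs).last_hidden_state[:, 0]
            k = F.normalize(k, dim=1)
        # InfoNCE: one positive key per query, queue entries as negatives.
        l_pos = torch.einsum("nc,nc->n", q, k).unsqueeze(-1)
        l_neg = torch.einsum("nc,ck->nk", q, self.queue.clone().detach())
        logits = torch.cat([l_pos, l_neg], dim=1) / self.t
        labels = torch.zeros(logits.shape[0], dtype=torch.long,
                             device=logits.device)
        loss = F.cross_entropy(logits, labels)
        self._enqueue(k)
        return loss

Here, query_inputs and key_inputs would be two tokenized views of the same utterance (for example, the original and an augmented or template-paired version). Updating the key encoder as an exponential moving average of the query encoder keeps the queued keys consistent across iterations, which is the core idea of the momentum contrast scheme.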
Recent advances in End-to-End (E2E) Spoken Language Understanding (SLU) have been primarily due to e...
Nowadays, owing to the superior capacity of the large pre-trained language models (PLM), the PLM-bas...
CLIP yielded impressive results on zero-shot transfer learning tasks and is considered a foundati...
We propose PromptBERT, a novel contrastive learning method for learning better sentence representati...
Few-shot learning (FSL) aims to recognize target classes by adapting the prior knowledge learned fro...
Zero-shot cross-domain slot filling aims to transfer knowledge from the labeled source domain to the...
Zero-shot learning (ZSL) aims to predict unseen classes whose samples have never appeared during tra...
Speech is the surface form of a finite set of phonetic units, which can be represented by discrete c...
We present federated momentum contrastive clustering (FedMCC), a learning framework that can not onl...
A two-stage training paradigm consisting of sequential pre-training and meta-training stages has bee...
Zero-shot learning has received increasing interest as a means to alleviate the often prohibitive e...
Pretrained language models (PLMs) have demonstrated remarkable performance in various natural langua...
We propose a multitask pretraining approach ZeroPrompt for zero-shot generalization, focusing on tas...
Though achieving impressive results on many NLP tasks, the BERT-like masked language models (MLM) en...
Zero-Shot Learning (ZSL) is related to training machine learning models capable of classifying or pr...