Natural language systems trained on labeled data from one domain do not perform well on other domains. Most adaptation algorithms proposed in the literature train a new model for the new domain using unlabeled data. However, it is time-consuming to retrain big models or pipeline systems. Moreover, the domain of a new target sentence may not be known, and one may not have a significant amount of unlabeled data for every new domain. To pursue the goal of an Open Domain NLP (train once, test anywhere), we propose ADUT (ADaptation Using label-preserving Transformation), an approach that avoids the need for retraining and does not require knowledge of the new domain, or any data from it. Our approach applies simple label-preserving transforma...
Recent advances in NLP are brought by a range of large-scale pretrained language models (PLMs). Thes...
Domain adaptation for machine translation (MT) can be achieved by selecting training instances close...
In this thesis we investigate methods for deploying machine translation (MT) in real-world applicati...
The performance of a machine learning model trained on labeled data of a (source) domain degrades se...
With the fast growth of the amount of digitalized texts in recent years, text information management...
In Machine Learning, a good model is one that generalizes from training data and makes accurate pred...
The identification and classification of some circumstance semantic roles like Location, Time, Manner an...
Large-scale annotated corpora are a prerequisite to developing high-performance NLP systems. Such co...
© 2014 IEEE. We propose a method for adapting Semantic Role Labeling (SRL) systems from a source dom...
Natural language processing (NLP) algorithms are rapidly improving but often struggle when applied t...
Adaptation of models to a new domain is important for many natural language tasks. Because without a...
This paper introduces a selection-based LM using topic modeling for the purpose of domain adaptation...
This paper studies the use of language models as a source of synthetic unlabeled text for NLP. We fo...
Prompt-based learning, with its capability to tackle zero-shot and few-shot NLP tasks, has gained mu...