Semantic similarity detection is a fundamental task in natural language understanding. Adding topic information has been useful for previous feature-engineered semantic similarity models as well as for neural models on other tasks. However, there is currently no standard way of combining topics with pretrained contextual representations such as BERT. We propose a novel topic-informed BERT-based architecture for pairwise semantic similarity detection and show that our model improves performance over strong neural baselines across a variety of English-language datasets. We find that the addition of topics to BERT helps particularly with resolving domain-specific cases.
Semantic similarity is an essential component of many Natural Language Processing applications. Howe...
Large pre-trained language models such as BERT have been the driving force behind recent improvement...
Recently, discrete latent variable models have received a surge of interest in both Natural Language...
Semantic Similarity Detection refers to a collection of binary text pair classification tasks which ...
The goal of topic detection or topic modelling is to uncover the hidden topics in a large corpus. It...
Unsupervised pretraining models have been shown to facilitate a wide range of downstream NLP applica...
Word relatedness computation is an important supporting technology for many tasks in natural languag...
Relation detection is a critical step of knowledge base question answering, which directly affects t...
Topic models, such as late...
Recently, discrete latent variable models have received a surge of interest in both Natural Language...
We present the MULTISEM systems submitted to SemEval 2020 Task 3: Graded Word Similarity in Context ...
Abstract. Semantic relatedness refers to the degree to which two concepts or words are related. Huma...
Semantic textual similarity (STS) measures how semantically similar two sentences are. In the contex...
This thesis focuses on finding an end-to-end unsupervised solution to solve a two-step problem of ex...