10 pages; EMNLP 2007 Conference (Prague); international audience. Most current word prediction systems make use of n-gram language models (LMs) to estimate the probability of the following word in a phrase. In recent years there have been many attempts to enrich such language models with further syntactic or semantic information. We explore the predictive power of Latent Semantic Analysis (LSA), a method that has been shown to provide reliable information on long-distance semantic dependencies between words in a context. We present and evaluate several methods that integrate LSA-based information with a standard language model: a semantic cache, partial reranking, and different forms of interpolation. We found that all methods s...
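The simplest of the integration schemes mentioned above is interpolation. As an illustration only, here is a minimal sketch of linearly interpolating an n-gram distribution with one derived from LSA cosine similarities; the normalization of similarities into probabilities and the weight `lam` are assumptions for this sketch, not the paper's exact formulation.

```python
import numpy as np

def lsa_probs(context_vec, word_vecs):
    """Turn cosine similarities between an LSA context vector and each
    candidate word vector into a probability distribution (hypothetical
    normalization; negative similarities are clipped to zero)."""
    sims = word_vecs @ context_vec / (
        np.linalg.norm(word_vecs, axis=1) * np.linalg.norm(context_vec))
    sims = np.clip(sims, 0.0, None)
    total = sims.sum()
    if total == 0.0:
        return np.full(len(sims), 1.0 / len(sims))  # fall back to uniform
    return sims / total

def interpolate(p_ngram, p_lsa, lam=0.7):
    """Linear interpolation of the n-gram and LSA word distributions."""
    return lam * p_ngram + (1.0 - lam) * p_lsa
```

Because both inputs are probability distributions and the weights sum to one, the interpolated result is itself a valid distribution over the vocabulary.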
We propose the first probabilistic approach to modeling cross-lingual semantic similarity (CLSS) in ...
We investigate the use of topic models, such as probabilistic latent semantic analysis (PLSA) and la...
Recent works on word representations mostly rely on predictive models. Distributed word representati...
We describe an extension to the use of Latent Semantic Analysis (LSA) for language modeling. This te...
Statistical language models used in large-vocabulary speech recognition must properly encapsulate th...
Natural Language Processing (NLP) is a sub-field of Artificial Intelligence (AI) that allows machine...
Natural-language based knowledge representations borrow their expressiveness from the semantics of l...
This paper introduces a collection of freely available Latent Semantic Analysis models built on the ...
Recent developments in distributional semantics (Mikolov, Chen, Corrado, & Dean, 2013; Mikolov, Suts...
To capture local and global constraints in a language, statistical n-grams are used in combination ...
Over the past two decades, researchers have made great advances in the area of computational methods...
Current word completion tools rely mostly on statistical or syntactic knowledge. Can using semantic ...
In this paper, we compare a well-known semantic space model, Latent Semantic Analysis (LSA) with anot...