Expressive richness in natural languages presents a significant challenge for statistical language models (LMs). Because multiple word sequences can represent the same underlying meaning, modelling only the observed surface word sequence can lead to poor context coverage. To address this issue, paraphrastic LMs were previously proposed to improve the generalization of back-off n-gram LMs. This paper investigates paraphrastic neural network LMs (NNLMs). Using a paraphrastic multi-level feedforward NNLM that models both word and phrase sequences, significant error rate reductions of 1.3% absolute (8% relative) and 0.9% absolute (5.5% relative) were obtained over the baseline n-gram and NNLM systems respectively on a state-of-the-art conversation...
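The core intuition behind paraphrastic LMs, as described above, is that counting only the surface word sequence misses contexts that appear in equivalent phrasings. A minimal toy sketch (not the paper's actual method; the sentences and the paraphrase below are hypothetical) shows how augmenting bigram counts with a paraphrased variant covers an otherwise unseen bigram:

```python
from collections import defaultdict

def bigram_counts(sentences):
    """Collect bigram and unigram counts over tokenized sentences."""
    bigrams = defaultdict(int)
    unigrams = defaultdict(int)
    for s in sentences:
        tokens = ["<s>"] + s.split()
        for a, b in zip(tokens, tokens[1:]):
            bigrams[(a, b)] += 1
            unigrams[a] += 1
    return bigrams, unigrams

def prob(bigrams, unigrams, a, b):
    """Unsmoothed maximum-likelihood bigram probability P(b | a)."""
    return bigrams[(a, b)] / unigrams[a] if unigrams[a] else 0.0

train = ["thanks a lot for coming"]               # observed surface form
paraphrases = ["thank you very much for coming"]  # hypothetical paraphrase

# Baseline LM sees only the surface sequence ...
c_base, u_base = bigram_counts(train)
# ... while the paraphrastic variant also counts the paraphrased data.
c_para, u_para = bigram_counts(train + paraphrases)

# "thank you" never occurs in the surface data, but is covered
# once the paraphrase contributes its counts.
print(prob(c_base, u_base, "thank", "you"))  # 0.0
print(prob(c_para, u_para, "thank", "you"))  # 1.0
```

In practice, paraphrastic LMs generate such variants automatically and interpolate or train on them rather than hand-listing paraphrases; this sketch only illustrates the coverage gain.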
Learning to paraphrase supports both writing ability and reading comprehension, particularly for les...
Currently, N-gram models are the most common and widely used models for statistical language modelin...
In natural languages multiple word sequences can represent the same underlying meaning. Only modelli...
Natural languages are known for their expressive richness. Many sentences can be used to represent t...
Recurrent neural network language models (RNNLM) have become an increasingly popular choice for stat...
Recently, there has been growing interest in using neural networks for language modeling. In contrast to th...
Virtually any modern speech recognition system relies on count-based language models. In this thesis...
In natural languages the variability in the underlying linguistic generation rules significantly alt...
For resource rich languages, recent works have shown Neural Network based Language Models (NNLMs) to...
A Language Model (LM) is a helpful component of a variety of Natural Language Processing (NLP) syste...
© 2016 IEEE. In recent years, research on language modeling for speech recognition has increasingly ...
Recurrent neural network language models (RNNLMs) have become an increasingly popular choice for spe...