Verwimp L., Pelemans J., Van hamme H., Wambacq P., ''Augmenting recurrent neural network language models with subword information'', Book of Abstracts of the 26th Meeting of Computational Linguistics in The Netherlands (CLIN26), p. 12, December 18, 2015, Amsterdam, The Netherlands.
Verwimp L., Pelemans J., Wambacq P., ''Project STON: speeding up Dutch subtitling with speech and la...
In recent years, neural networks have shown substantial improvements in capturing the semantics of words or sen...
The task of part-of-speech (POS) language modeling typically includes a very small vocabulary, which...
Verwimp L., Pelemans J., Van hamme H., Wambacq P., ''Augmenting recurrent neural network language mo...
Shi Y., Larson M., Pelemans J., Jonker C.M., Wambacq P., Wiggers P., Demuynck K., ''Integrating meta...
de Raedt L, Hammer B, Hitzler P, Maass W, eds. Recurrent Neural Networks - Models, Capacities, and A...
In this paper we present a survey on the application of recurrent neural networks to the task of sta...
The very promising results reported for neural network grammar modelling have motivated a lot of rese...
Language modeling is a crucial component in a wide range of applications including speech recognitio...
This repository contains the raw results (by-word information-theoretic measures for the experimenta...
Paper presented at the 2016 Conference of the North American Chapter of the Association for Com...
The recurrent neural network language model (RNNLM) has been demonstrated to consistently reduce per...
Recently, there has been a lot of interest in neural network-based language models. These models typi...
Information Retrieval (IR) classically relies on several processes to improve performa...
Predicting the next word given the history of preceding words may be challenging without ...
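Several of the abstracts above rest on the same two ideas: a recurrent network that predicts the next word from the preceding word history, and perplexity as the measure by which RNNLMs are shown to improve on n-gram baselines. The sketch below is a minimal, self-contained illustration of both, assuming a toy PyTorch model, corpus, and hyper-parameters; none of the names or data come from the cited works.

```python
# Minimal RNN language model sketch (illustrative assumptions throughout:
# the ToyRNNLM class, the toy corpus, and all hyper-parameters are invented
# for this example and are not taken from any of the cited papers).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyRNNLM(nn.Module):
    def __init__(self, vocab_size, emb_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer word ids
        hidden, _ = self.rnn(self.embed(tokens))
        return self.out(hidden)  # (batch, seq_len, vocab_size) logits

# Toy corpus of word ids; id 0 stands in for a sentence-start symbol.
corpus = torch.tensor([[0, 5, 3, 7, 2],
                       [0, 5, 4, 7, 2]])
inputs, targets = corpus[:, :-1], corpus[:, 1:]  # predict the next word

model = ToyRNNLM(vocab_size=10)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for _ in range(50):  # a few training steps on the toy data
    logits = model(inputs)
    loss = F.cross_entropy(logits.reshape(-1, 10), targets.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

# Perplexity = exp(mean negative log-likelihood of the next word),
# the quantity the RNNLM papers report reducing against n-gram baselines.
with torch.no_grad():
    nll = F.cross_entropy(model(inputs).reshape(-1, 10), targets.reshape(-1))
    print(f"perplexity: {torch.exp(nll).item():.2f}")
```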