We show that under certain conditions, a language model can be trained on the basis of a second language model. The main instance of the technique trains a finite automaton on the basis of a probabilistic context-free grammar, such that the Kullback-Leibler distance between grammar and trained automaton is provably minimal. This is a substantial generalization of an existing algorithm to train an n-gram model on the basis of a probabilistic context-free grammar.
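To make the n-gram instance of the technique concrete: among all n-gram models, the one whose conditional probabilities are derived from the grammar's expected n-gram counts minimizes the Kullback-Leibler distance from grammar to model. The Python sketch below illustrates this for bigrams; the toy grammar, its probabilities, and all names are illustrative assumptions, and Monte Carlo sampling stands in for the exact expected-count computation described in the paper.

import random
from collections import defaultdict

# Toy PCFG: each nonterminal maps to (right-hand side, probability) pairs.
# Lowercase symbols are terminals. Grammar and probabilities are assumed
# for illustration only.
PCFG = {
    "S":  [(("NP", "VP"), 1.0)],
    "NP": [(("det", "n"), 0.7), (("det", "adj", "n"), 0.3)],
    "VP": [(("v",), 0.4), (("v", "NP"), 0.6)],
}

def sample(symbol):
    """Sample a terminal string by stochastically expanding `symbol`."""
    if symbol not in PCFG:                      # terminal symbol
        return [symbol]
    r, acc = random.random(), 0.0
    for rhs, p in PCFG[symbol]:
        acc += p
        if r <= acc:
            return [w for s in rhs for w in sample(s)]
    return [w for s in PCFG[symbol][-1][0] for w in sample(s)]

def bigram_model(n_samples=100_000):
    """Bigram model from (approximate) expected counts under the grammar.

    Normalizing the grammar's expected bigram counts yields the bigram
    model that minimizes KL(grammar || model); here the expectations are
    approximated by sampling rather than computed exactly."""
    counts = defaultdict(lambda: defaultdict(int))
    for _ in range(n_samples):
        words = ["<s>"] + sample("S") + ["</s>"]
        for a, b in zip(words, words[1:]):
            counts[a][b] += 1
    model = {}
    for a, nxt in counts.items():
        total = sum(nxt.values())
        model[a] = {b: c / total for b, c in nxt.items()}
    return model

if __name__ == "__main__":
    # Under the toy grammar, a verb is followed by the end of the sentence
    # with probability 0.4 and by a determiner with probability 0.6.
    for b, p in sorted(bigram_model()["v"].items()):
        print(f"P({b} | v) = {p:.3f}")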
The use of language is one of the defining features of human cognition. Focusing here on two key fea...
Recent computational research on natural language corpora has revealed that relatively simple statis...
Grammatical inference is a branch of computational learning theory that attacks the problem of learn...
Grammar-based natural language processing has reached a level where it can 'understand' language to ...
This paper describes an evolutionary approach to the problem of inferring stochastic context-free gr...
This paper focuses on a subfield of machine learning, the so-called grammatica...
Building probabilistic models of language is a central task in natural language and speech processin...
We describe a corpus-based induction algorithm for probabilistic context-free grammars. The algorith...
This paper reports progress in developing a computer model of language acquisition in the form of (1...
Currently, N-gram models are the most common and widely used models for statistical language modelin...