We examine the utility of a curriculum (a means of presenting training samples in a meaningful order) in the unsupervised learning of probabilistic grammars. We introduce the incremental construction hypothesis, which explains the benefits of a curriculum in learning grammars and offers useful insights into the design of curricula as well as learning algorithms. We present results of experiments with (a) carefully crafted synthetic data that provide support for our hypothesis and (b) a natural language corpus that demonstrates the utility of curricula in unsupervised learning of probabilistic grammars.
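The curriculum idea above — presenting training samples in a meaningful order, typically simple before complex — can be sketched in a few lines. This is a minimal illustration only, with a hypothetical length-based ordering and stage schedule; the abstract does not specify the actual ordering criterion or learning algorithm, and a real system would re-estimate grammar parameters (e.g. via EM over a PCFG) at each stage.

```python
def curriculum_batches(corpus, stages=3):
    """Order sentences from short (simple) to long (complex) and
    release them to the learner in cumulative stages.

    This is an assumed, illustrative curriculum: sentence length is a
    common but not the only possible measure of sample complexity."""
    ordered = sorted(corpus, key=len)
    n = len(ordered)
    for s in range(1, stages + 1):
        # Each stage sees all data up to a growing cutoff, so earlier
        # (simpler) samples remain available as harder ones are added.
        yield ordered[: max(1, n * s // stages)]

corpus = [
    "the dog barks".split(),
    "a cat sleeps".split(),
    "the small dog chases the cat".split(),
    "a very old man walks the small dog slowly".split(),
]

for stage, batch in enumerate(curriculum_batches(corpus), 1):
    # A real learner would update its grammar estimate on `batch` here.
    print(stage, [len(sentence) for sentence in batch])
```

The cumulative schedule (each stage includes all earlier data) matches the intuition of incremental construction: structures learned on simple samples are retained while more complex evidence is folded in.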
Abstract Unsupervised learning algorithms have been derived for several statistical models of Englis...
Stochastic categorial grammars (SCGs) are introduced as a more appropriate formalism for statistical...
The problem of identifying a probabilistic context free grammar has two aspects: the first is determi...
[18]. Section 1 provides the proofs of the theorems in Section 3 of the paper. Section 2 gives more ...
Probabilistic grammars define a set of well-formed or grammatical linguistic structures, just as all...
Recent computational research on natural language corpora has revealed that relatively simple statis...
The purpose of this paper is to define the framework within which empirical investigations of probab...
There is much debate over the degree to which language learning is governed by innate language-speci...
Abstract. This paper presents PCFG-BCL, an unsupervised algorithm that learns a probabilistic contex...
With the rising amount of available multilingual text data, computational linguistics faces an oppor...
Probabilistic grammars offer great flexibility in modeling discrete sequential data like natural lan...
This article provides a critical assessment of the Gradual Learning Algorithm (GLA) for probabilisti...
This paper shows how to define probability distributions over linguistically realistic syntactic str...
This chapter provides a range of conceptual and technical insights into how this project can be atte...