The notion of latent-variable probabilistic context-free derivation of syntactic structures is enhanced to allow heads and unrestricted discontinuities. The chosen formalization covers both constituent parsing and dependency parsing. The derivational model is accompanied by an equivalent probabilistic automaton model. The new framework yields a probability distribution over the space of all discontinuous parses, which lends itself to intrinsic evaluation in terms of perplexity, as shown in experiments.
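For reference, and as a standard definition rather than a detail taken from the paper itself, intrinsic evaluation by perplexity measures how well the induced distribution predicts held-out text: if the model assigns each held-out sentence $s$ a probability $p(s)$ (here plausibly obtained by marginalizing over its discontinuous parses), the perplexity over a held-out corpus containing $N$ words is

\[
\mathrm{PP} \;=\; \exp\!\Bigl(-\tfrac{1}{N}\sum_{s} \log p(s)\Bigr),
\]

where lower perplexity indicates that the model fits the held-out data better.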