We give a new direct construction of shift-reduce ELR(1) parsers for recursive Transition Networks (TNs), which is suitable for languages specified by Extended BNF (EBNF) grammars. Such parsers are characterized by the absence of conflicts, not just the classical shift-reduce and reduce-reduce types, but also a new type named the convergence conflict. This condition is proved correct and is more general than previously proposed conditions for the shift-reduce parsing of EBNF grammars or TNs. The corresponding parser is smaller than a classical one, without any extra bookkeeping. A constraint on TNs is also mentioned, which enables top-down deterministic ELL(1) analysis.
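For readers unfamiliar with the shift-reduce discipline the abstract refers to, the following is a minimal, self-contained sketch of a shift-reduce loop on a toy grammar. Everything in it (the grammar E -> E '+' T | T, T -> 'int', the token names, and the greedy "reduce when the stack top matches a right-hand side, longest first" policy) is an illustrative assumption; it is not the ELR(1) construction described above, which instead drives shift and reduce decisions from an automaton and one token of lookahead.

```python
# Toy grammar: E -> E '+' T | T ;  T -> 'int'   (assumed for illustration)
GRAMMAR = [
    ("E", ("E", "+", "T")),
    ("E", ("T",)),
    ("T", ("int",)),
]

def parse(tokens, start="E"):
    stack, rest = [], list(tokens)
    while True:
        # Reduce greedily, trying longer right-hand sides first.
        for lhs, rhs in sorted(GRAMMAR, key=lambda p: -len(p[1])):
            if tuple(stack[-len(rhs):]) == rhs:
                stack[-len(rhs):] = [lhs]   # replace matched handle by lhs
                break
        else:
            if rest:                        # nothing to reduce: shift a token
                stack.append(rest.pop(0))
            else:                           # input consumed: accept or reject
                return stack == [start]

print(parse(["int", "+", "int"]))           # True
print(parse(["int", "+"]))                  # False
```

The greedy policy happens to suffice for this toy grammar; a deterministic LR- or ELR-style parser replaces it with table-driven decisions so that the choice between shifting and reducing is always unambiguous.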
The aim of this thesis is to improve Natural Language Dependency Parsing. We employ a linear Shift R...
In this paper we present an incremental LR parsing technique. The technique is applicable to all gramma...
Tomita's Generalized LR(1) parser (GLR) algorithm for CF grammars runs in linear time on LR(1) gram...
Extended BNF grammars (EBNF) allow regular expressions in the right parts of their rules. They are w...
Tomita's Generalized LR(1) parsing algorithm (GLR), later improved in many ways, runs in a linea...
Incremental parsers have potential advantages for applications like language modeling for machine tr...
For the deterministic context-free languages, we compare the space and time complexity of their LR(1...
A parsing method called buffered shift-reduce parsing is presented, which adds an intermedi...
This paper compares Marcus' parser, PARSIFAL, with Woods' Augmented Transition Network (ATN) parser...
We present expected F-measure training for shift-reduce parsing with RNNs, which enables the learnin...
In this paper we introduce a general framework for transition-based parsing algorithms. Among the al...
The paper presents an efficient parallel parsing algorithm for arbitrary context-free grammars. Thi...