In this thesis, we investigate the use of latent variables to model complex dependencies in natural languages. Traditional models, which have a fixed parameterization, often make strong independence assumptions that lead to poor performance. This problem is often addressed by incorporating additional dependencies into the model (e.g., using higher order N-grams for language modeling). These added dependencies can increase data sparsity and/or require expert knowledge, together with trial and error, in order to identify and incorporate the most important dependencies (as in lexicalized parsing models). Traditional models, when developed for a particular genre, domain, or language, are also often difficult to adapt to another. In contrast...
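The data-sparsity point above — that adding dependencies such as higher-order N-grams fragments the training data — can be illustrated with a toy sketch. The corpus and counts here are invented for illustration only and are not drawn from any of the theses listed:

```python
# Toy illustration of N-gram data sparsity: as the N-gram order grows,
# the number of distinct N-grams approaches the total number of N-gram
# tokens, so each context is observed only a handful of times.
corpus = ("the cat sat on the mat and the dog sat on the rug "
          "the cat and the dog sat").split()

def ngrams(tokens, n):
    """Return the list of all contiguous n-grams in `tokens`."""
    return list(zip(*(tokens[i:] for i in range(n))))

for n in (1, 2, 3, 4):
    grams = ngrams(corpus, n)
    # Fraction of n-grams that are unique rises quickly with n.
    print(f"order {n}: {len(set(grams))} distinct of {len(grams)} total")
```

Even on this tiny corpus, the fraction of N-grams seen only once grows rapidly with the order N, which is exactly why higher-order models demand either much more data or the kind of latent-variable smoothing the abstract motivates.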
The use of parameters in the description of natural language syntax has to balance between the need ...
The corpus for training a parser consists of sentences of heterogeneous grammar usages. Previous par...
This thesis focuses on robust analysis of natural language semantics. A primary bottleneck for seman...
Thesis (Ph.D.)--University of Washington, 2021. Multilingual modeling comes up in natural language pro...
Current investigations in data-driven models of parsing have shifted from purely syntactic ...
Real-world problems may contain latent dependencies (i.e., hidden sub-structures) that are difficult to...
Motivated by the large number of languages (seven) and the short development time (two months) of th...
This thesis presents several studies in neural dependency parsing for typologically diverse language...
We propose a generative dependency parsing model which uses binary latent variables to induce condit...
This thesis focuses on the development of effective and efficient language models (LMs) for speech r...
Powerful generative models, particularly in natural language modelling, are commonly trained by maxi...
Scaling existing applications and solutions to multiple human languages has traditionally proven to ...
Thesis (Master's)--University of Washington, 2014. Dependency parsing is an important natural language...
Recurrent neural networks (RNNs) are exceptionally good models of distributions over natural languag...