We present a novel semi-supervised approach for sequence transduction and apply it to semantic parsing. The unsupervised component is based on a generative model in which latent sentences generate the unpaired logical forms. We apply this method to a number of semantic parsing tasks, focusing on domains with limited access to labelled training data, and extend those datasets with synthetically generated logical forms.
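The objective described above combines a supervised term over paired data with an unsupervised generative term over unpaired logical forms. A minimal sketch of that combination, with toy placeholder losses (the function names, the weighting parameter `alpha`, and the loss definitions are all assumptions for illustration, not the paper's actual model):

```python
# Sketch of a semi-supervised objective: supervised loss on
# (sentence, logical form) pairs plus a weighted generative
# reconstruction loss on unpaired logical forms. The losses here
# are count-based toys standing in for real model likelihoods.

def supervised_loss(pairs):
    """Placeholder loss on (sentence, logical form) pairs."""
    return float(len(pairs))

def reconstruction_loss(unpaired_lfs):
    """Placeholder loss for reconstructing unpaired logical forms
    via sampled latent sentences."""
    return 0.5 * len(unpaired_lfs)

def combined_objective(pairs, unpaired_lfs, alpha=1.0):
    # Total = supervised term + alpha * unsupervised (generative) term.
    return supervised_loss(pairs) + alpha * reconstruction_loss(unpaired_lfs)

pairs = [("list flights to boston", "lambda x. flight(x) & to(x, boston)")]
unpaired = ["lambda x. flight(x) & from(x, denver)",
            "lambda x. fare(x) < 300"]
total = combined_objective(pairs, unpaired, alpha=0.5)
print(total)  # 1.0 + 0.5 * 1.0 = 1.5
```

In practice both terms would be negative log-likelihoods from the parser and the generative model, optimized jointly.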
Computational systems that learn to transform natural-language sentences into semantic representatio...
How do we build a semantic parser in a new domain starting with zero training examples? We introduc...
In this work, we investigate the problems of semantic parsing in a few-shot learning setting. In thi...
Semantic parsing is the task of mapping a natural language sentence into a complete, formal meaning ...
We present the first unsupervised approach to the problem of learning a semantic parser, using Marko...
We present a probabilistic generative model for learning semantic parsers from ambiguous supervision...
Recently, sequence-to-sequence models have achieved impressive performance on a number of semantic p...
Target meaning representations for semantic parsing tasks are often based on programming or query la...
The task of mapping natural language expressions to logical forms is referred to as semantic parsing...
We propose a novel model for parsing natural language sentences into their formal semantic represen...
The research presents novel datasets and tasks that can be used to train semantic encoder methods, such as the Rev...
This paper presents a novel method of semantic parsing that maps a natural language (NL) sentence to...
Neural semantic parsers usually generate meaning representation tokens from natural language tokens ...
We present a method for training a semantic parser using only a knowledge base and an unlabeled text...
This thesis focuses on robust analysis of natural language semantics. A primary bottleneck for seman...