This paper focuses on a system, Wolfie (WOrd Learning From Interpreted Examples), that acquires a semantic lexicon from a corpus of sentences paired with semantic representations. The learned lexicon consists of phrases paired with meaning representations. Wolfie is part of an integrated system that learns to parse representations such as logical database queries. Experimental results are presented demonstrating Wolfie’s ability to learn useful lexicons for a database interface in four different natural languages. The usefulness of the lexicons learned by Wolfie is compared to that of lexicons acquired by a similar system developed by Siskind (1996), with results favorable to Wolfie. A second set of experiments demonstrates Wolfie’s ability to scale ...
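To make the input/output of this setting concrete, here is a minimal sketch in Python. The training corpus pairs each sentence with tokens from its semantic representation, and the output is a lexicon mapping words to meaning fragments. This is not Wolfie's actual algorithm (which uses a greedy heuristic search over candidate phrase–meaning pairs); it is only a crude co-occurrence baseline, and the Geoquery-style example data is hypothetical.

```python
from collections import Counter

def learn_lexicon(corpus):
    """corpus: list of (sentence, meaning_tokens) pairs.
    Returns a dict mapping each word to the meaning token it
    co-occurs with most often -- a toy stand-in for a learned
    phrase-to-meaning lexicon."""
    counts = {}
    for sentence, meaning in corpus:
        for word in sentence.lower().split():
            # Tally how often each meaning token appears alongside this word.
            counts.setdefault(word, Counter()).update(meaning)
    # Pick the most frequent co-occurring meaning token per word.
    return {w: c.most_common(1)[0][0] for w, c in counts.items()}

# Hypothetical sentences paired with tokens of logical database queries.
corpus = [
    ("what is the capital of texas", ["capital", "texas"]),
    ("what is the capital of ohio", ["capital", "ohio"]),
    ("what rivers run through texas", ["river", "texas"]),
]
lexicon = learn_lexicon(corpus)
# e.g. lexicon["capital"] == "capital", lexicon["texas"] == "texas"
```

Even this simple counting scheme recovers the content-word mappings; the harder problem addressed by systems like Wolfie is resolving ambiguity when many phrases and meaning fragments co-occur across examples.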
In natural language acquisition, it is difficult to gather the annotated data needed for supervised ...
Most natural language processing tasks require lexical semantic information. Automated acquisition o...
Self-supervised pre-training techniques, albeit relying on large amounts of text, have enabled rapid...
This paper describes a system, Wolfie (WOrd Learning From Interpreted Examples), that acquires a sem...
Building accurate and efficient natural language processing (NLP) systems is an important and diffic...
Semantic parsing is the task of mapping a natural language sentence into a complete, formal meaning ...
Natural language processing will not be able to compete with traditional information retrieval unles...
This paper describes an approach to alleviating the well-known problem of the knowledge acquisition ...
Computational systems that learn to transform natural-language sentences into semantic representatio...
The representation of written language semantics is a central problem of language technology and a c...
Semantic knowledge can be a great asset to natural language processing systems, but it is usually ha...
We introduce a learning semantic parser, SCISSOR, that maps natural-language sentences to a detailed...
We present MaJo, a toolkit for supervised Word Sense Disambiguation (WSD), with an interface for Act...