We propose a system that bridges the gap between the two major approaches to natural language processing: robust shallow text processing and domain-specific (often linguistically based) deep understanding. We propose to use an existing linguistically motivated deep understanding system as the core and to leverage statistical techniques and external resources, such as world knowledge, to broaden coverage and increase robustness. We will also develop a semantic representation framework that supports underspecification, granularity, and incrementality, which are critical factors for robustness in representing natural language semantics.
This paper analyzes benefits and challenges (together with possible solutions) of using natural lang...
The monumental achievements of deep learning (DL) systems seem to guarantee the absolute superiority...
Natural Language Processing (NLP) is a branch of artificial intelligence that involves the design an...
Robustness is a key issue for natural language processing in general and parsing in particular, and...
Recent natural language processing (NLP) techniques have accomplished high performance on benchmark ...
Natural Language Processing (NLP) sets a relation between human and computer where the elements of h...
Knowledge resources, e.g. knowledge graphs, which formally represent essential semantics and informa...
A number of issues arise when trying to scale up natural language understanding (NLU) tools designed...
Current semantic parsers either compute shallow representations over a wide range of input, or deepe...
There is growing evidence that the classical notion of adversarial robustness originally introduced ...
Simulating human language understanding on the computer is a great challenge. A way to approach it i...
Today, language understanding systems do many useful things in processing natural language, ...
With the fast-growing popularity of current large pre-trained language models (LLMs), it is necessar...
Thesis (Ph.D.)--University of Washington, 2020. For machines to understand language, they must intuiti...