We examine the extent to which, in principle, linguistic graph representations can complement and improve neural language modeling. With an ensemble setup consisting of a pretrained Transformer and ground-truth graphs from one of 7 different formalisms, we find that, overall, semantic constituency structures are most useful to language modeling performance -- outpacing syntactic constituency structures as well as syntactic and semantic dependency structures. Further, effects vary greatly depending on part-of-speech class. In sum, our findings point to promising tendencies in neuro-symbolic language modeling and invite future research quantifying the design choices made by different formalisms.
Comment: Accepted to NAACL 2022 (slight typesett...
Recurrent neural networks (RNNs) are exceptionally good models of distributions over natural languag...
This thesis broadens the space of rich yet practical models for structured prediction. We introduce ...
This thesis puts forward the view that a purely signal-based approach to natural language processing...
Natural Language Processing (NLP) has become one of the leading application areas in the current Art...
While symbolic and statistical approaches to natural language processing have become undeniably impr...
With the advent of deep learning, research in many areas of machine learning is converging towards t...
Ling and Marinov (L & M) have constructed an interesting symbolic alternative to current connect...
Thesis (Ph.D.)--University of Washington, 2022. Natural language processing (NLP) is having a paradigm...
We investigate the extent to which modern, neural language models are susceptible to structural prim...
Acceptability judgments are no longer acceptable as the holy grail for testing the nature of linguis...
Neural networks drive the success of natural language processing. A fundamental property of language...
This thesis investigates the role of linguistically-motivated generative models of syntax and semant...
This dissertation research developed the GOLD model (Graph Of Language Distribution), a graph-struct...
The study develops a neurocomputational architecture for grammatical processing in language producti...
Historically, models of human language have assumed that sentences have a symbolic structure and that this...