We investigate how neural networks can learn and process languages with hierarchical, compositional semantics. To this end, we define the artificial task of processing nested arithmetic expressions, and study whether different types of neural networks can learn to compute their meaning. We find that recursive neural networks can find a generalising solution to this problem, and we visualise this solution by breaking it up in three steps: project, sum and squash. As a next step, we investigate recurrent neural networks, and show that a gated recurrent unit, that processes its input incrementally, also performs very well on this task. To develop an understanding of what the recurrent network encodes, visualisation techniques alone do not suff...
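The project, sum and squash steps mentioned above can be sketched for a tree-structured (recursive) network over nested arithmetic. All names, dimensions, and the random initialisation below are illustrative assumptions, not the trained model from the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 2  # hidden size; a trained model would learn these parameters

# Hypothetical parameters: one projection matrix per child slot, per operator.
W = {op: (rng.standard_normal((D, D)), rng.standard_normal((D, D)))
     for op in ("+", "-")}
b = np.zeros(D)

def embed(n):
    """Toy leaf embedding for a number."""
    return np.array([float(n), 1.0])

def encode(expr):
    """Recursively compute a representation of a nested expression.

    expr is either a number (leaf) or a tuple (op, left, right).
    Each composition step: PROJECT the children, SUM the projections,
    SQUASH with tanh.
    """
    if not isinstance(expr, tuple):
        return embed(expr)
    op, left, right = expr
    Wl, Wr = W[op]
    projected_sum = Wl @ encode(left) + Wr @ encode(right) + b  # project + sum
    return np.tanh(projected_sum)                               # squash

# e.g. the nested expression ((5 - 2) + 3)
h = encode(("+", ("-", 5, 2), 3))
```

Because the squash is tanh, every composed representation stays inside (-1, 1) per dimension, regardless of nesting depth.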
We present a biologically inspired computational framework for language processing and grammar acqui...
We examine two connectionist networks—a fractal learning neural network (FLNN) and a Simple Recurre...
Time series often have a temporal hierarchy, with information that is spread out over multiple time ...
We investigate how neural networks can be used for hierarchical, compositional semantics. To this en...
Artificial neural networks have become remarkably successful on many natural language processing tas...
Recent work by Siegelmann has shown that the computational power of recurrent neural networks matche...
We present novel methods for analyzing the activation patterns of recurrent neural networks from a l...
Recurrent Neural Networks are an effective and prevalent tool used to model sequential data such as ...
Recently researchers have derived formal complexity analysis of analog computation in the setting of...
Recurrent neural networks (RNNs) have achieved impressive results in a variety of linguistic process...
"Artificial neural networks" provide an appealing model of computation. Such networks consist of an ...
As potential candidates for explaining human cognition, connectionist models of sentence processing ...