Neural networks must capture mathematical relationships in order to learn many tasks. Because they approximate these relationships only implicitly, they often generalize poorly. The recently proposed Neural Arithmetic Logic Unit (NALU) is a novel neural architecture whose units explicitly represent mathematical relationships, enabling it to learn operations such as addition, subtraction, or multiplication. Although NALUs have been shown to perform well on various downstream tasks, an in-depth analysis reveals practical shortcomings inherent to the design, such as the inability to multiply or divide negative input values, and training-stability issues in deeper networks. We address these issues and propose an improved model archi...
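For context, the core NALU mechanism (Trask et al., 2018) can be sketched in a few lines. This is a minimal NumPy sketch, not the paper's implementation; in practice `W_hat`, `M_hat`, and `G` are learned by gradient descent.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nalu(x, W_hat, M_hat, G, eps=1e-7):
    """One NALU layer: a gated mix of an additive and a
    multiplicative (log-space) path over shared weights."""
    W = np.tanh(W_hat) * sigmoid(M_hat)      # weights biased toward {-1, 0, 1}
    a = x @ W                                # additive path: sums / differences
    m = np.exp(np.log(np.abs(x) + eps) @ W)  # multiplicative path (drops input signs)
    g = sigmoid(x @ G)                       # learned gate between the two paths
    return g * a + (1.0 - g) * m
```

The log-space multiplicative path is exactly where the shortcoming mentioned above originates: `np.abs(x)` discards the sign of negative inputs, so the unit cannot represent multiplication or division of negative values.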
Abstract—Interval arithmetic has become a popular tool for general optimization problems such as rob...
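As a quick illustration of the interval arithmetic this abstract refers to, here is a minimal sketch over closed real intervals; the function names are illustrative, not taken from the paper.

```python
from itertools import product

def iadd(a, b):
    # Sum of two intervals: endpoints add componentwise.
    return (a[0] + b[0], a[1] + b[1])

def imul(a, b):
    # Product of two intervals: take the min/max over all endpoint
    # products, since sign flips change which combination is extremal.
    prods = [x * y for x, y in product(a, b)]
    return (min(prods), max(prods))
```

For example, `imul((-1, 2), (3, 4))` returns `(-4, 8)`: the candidate products are -3, -4, 6, and 8.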
Reconfigurable architectures targeting neural networks are an attractive option. They allow multiple...
This fMRI study aimed at unraveling the neural basis of learning alphabet arithmetic facts, as a pro...
Neural Arithmetic Logic Modules have become a growing area of interest, though they remain a niche field....
Neural networks can learn to represent and manipulate numerical information, but they seldom general...
Neural networks are poor generalizers outside their training range, i.e., they are good at captur...
Neural networks can learn complex functions, but they often have trouble extrapolating even si...
Deep neural networks are difficult to train when applied to tasks that can be ...
Of the four fundamental arithmetic operations (+, -, $\times$, $\div$), division is considered the m...
A simple Neural Network model is presented for end-to-end visual learning of arithmetic operations f...
Answering complex questions that require multi-step multi-type reasoning over raw text is challengin...
A neuron is modeled as a linear threshold gate, and the network architecture considered is the layer...
What does it mean for a neural network to become a “cardinal principal knower”? We trained a multila...