The purpose of this short note is to clarify a common misconception about convolutional neural networks (CNNs). CNNs are composed of convolutional layers, which are shift equivariant due to weight sharing. However, convolutional layers are not translation equivariant, even when boundary effects are ignored and pooling and subsampling are absent. This is because shift equivariance is a discrete symmetry, while translation equivariance is a continuous symmetry. This fact is well known among researchers in equivariant machine learning but is often overlooked by non-experts. To minimize confusion, we suggest using the term `shift equivariance' to refer to discrete shifts in pixels and `translation equivariance' to refer to continuous translations.
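The distinction can be made concrete with a small numerical sketch (illustrative only; the helper names `conv_relu` and `frac_shift` are not from the note). A 1-D circular convolution followed by a pointwise ReLU commutes exactly with integer shifts, but not with a half-pixel translation implemented by band-limited (sinc) interpolation. Note that the failure in this particular demo enters through the pointwise nonlinearity, which commutes with integer shifts but not with sub-pixel interpolation; it is one concrete way the discrete-versus-continuous distinction shows up.

```python
import numpy as np

def conv_relu(x, k):
    # One "conv layer": circular 1-D convolution followed by ReLU
    # (no pooling or subsampling, no boundary effects).
    y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(k, len(x))))
    return np.maximum(y, 0.0)

def frac_shift(x, t):
    # Translate by t samples (possibly fractional) via an FFT phase ramp,
    # i.e. band-limited (sinc) interpolation on the circle.
    freqs = np.fft.fftfreq(len(x))
    return np.real(np.fft.ifft(np.fft.fft(x) * np.exp(-2j * np.pi * freqs * t)))

rng = np.random.default_rng(0)
x = rng.standard_normal(64)   # input signal
k = rng.standard_normal(5)    # convolution kernel

# Integer shift: equivariance holds up to floating-point round-off.
err_int = np.max(np.abs(conv_relu(np.roll(x, 3), k) - np.roll(conv_relu(x, k), 3)))

# Half-pixel translation: equivariance fails by a clearly nonzero margin.
err_half = np.max(np.abs(conv_relu(frac_shift(x, 0.5), k) - frac_shift(conv_relu(x, k), 0.5)))

print(err_int, err_half)
```

Running this, `err_int` sits at machine precision while `err_half` does not, matching the claim that weight sharing buys exact equivariance only for the discrete shift group.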