Designing learning systems which are invariant to certain data transformations is critical in machine learning. Practitioners can typically enforce a desired invariance on the trained model through the choice of a network architecture, e.g. using convolutions for translations, or using data augmentation. Yet, enforcing true invariance in the network can be difficult, and data invariances are not always known a priori. State-of-the-art methods for learning data augmentation policies require held-out data and are based on bilevel optimization problems, which are complex to solve and often computationally demanding. In this work we investigate new ways of learning invariances only from the training data. Using learnable augmentation layers built...
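To make the idea of a learnable augmentation layer concrete, here is a minimal sketch of the general technique rather than the exact method referred to above. It assumes PyTorch, and the names (LearnableRotationAug, AugmentedModel, backbone, n_samples) are hypothetical: a layer samples rotations from a uniform distribution whose width is a trainable parameter, and the model averages predictions over several sampled augmentations, so the degree of invariance is learned from the training data alone.

```python
# Sketch of a learnable augmentation layer (illustrative, not the paper's exact method).
import torch
import torch.nn as nn
import torch.nn.functional as F


class LearnableRotationAug(nn.Module):
    """Rotates inputs by an angle drawn uniformly from [-width, width]; width is learnable."""

    def __init__(self, init_width: float = 0.1):
        super().__init__()
        # Unconstrained parameter; softplus keeps the effective width positive.
        self.raw_width = nn.Parameter(torch.tensor(float(init_width)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n = x.size(0)
        width = F.softplus(self.raw_width)
        # Reparameterized sampling: angle = width * u, so gradients reach `width`.
        u = torch.rand(n, device=x.device) * 2.0 - 1.0
        angles = width * u
        cos, sin = torch.cos(angles), torch.sin(angles)
        zeros = torch.zeros_like(cos)
        # Batch of 2x3 affine matrices encoding the sampled rotations.
        theta = torch.stack(
            [torch.stack([cos, -sin, zeros], dim=1),
             torch.stack([sin, cos, zeros], dim=1)],
            dim=1,
        )
        grid = F.affine_grid(theta, x.shape, align_corners=False)
        return F.grid_sample(x, grid, align_corners=False)


class AugmentedModel(nn.Module):
    """Wraps a backbone and averages its predictions over sampled augmentations."""

    def __init__(self, backbone: nn.Module, n_samples: int = 4):
        super().__init__()
        self.aug = LearnableRotationAug()
        self.backbone = backbone
        self.n_samples = n_samples

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        logits = [self.backbone(self.aug(x)) for _ in range(self.n_samples)]
        return torch.stack(logits, dim=0).mean(dim=0)
```

Trained end-to-end with only the task loss, the learned width can collapse toward zero; Augerino-style approaches therefore add a small regularizer that favours broader augmentation distributions.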
It is often said that a deep learning model is "invariant" to some specific type of transformation. ...
Underlying data structures, such as symmetries or invariances to transformations, are often exploite...
Classic algorithms and machine learning systems like neural networks are both abundant in everyday l...
For many pattern recognition tasks, the ideal input feature would be invariant to multiple confoundi...
ConvNets, through their architecture, only enforce invariance to translation. In this paper, we intr...
Inspired by two basic mechanisms in animal visual systems, we introduce a feature transform techniqu...
In this thesis, Invariance in Deep Representations, we propose novel solutions to the problem of lea...
Machine learning is concerned with computer systems that learn from data instead of being explicitly...
Data symmetries have been used to successfully learn robust and optimal representation either via au...
Aside from developing methods to embed the equivariant priors into the architectures, one can also s...
Autonomous learning is demonstrated by living beings that learn visual invariances during their visu...
In this paper, we investigate the principle that good explanations are hard to vary in the context o...
Assumptions about invariances or symmetries in data can significantly increase the predictive power ...
In many machine learning applications, one has access, not only to training data, but also to some h...