We propose self-adaptive training -- a unified training algorithm that dynamically calibrates and enhances the training process using model predictions, without incurring extra computational cost -- to advance both supervised and self-supervised learning of deep neural networks. We analyze the training dynamics of deep networks on training data corrupted by, e.g., random noise and adversarial examples. Our analysis shows that model predictions can magnify useful underlying information in data, and that this phenomenon occurs broadly even in the absence of any label information, highlighting that model predictions could substantially benefit the training process: self-adaptive training improves the generalization of deep networks u...
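The core mechanism sketched in this abstract -- calibrating training with the model's own predictions -- can be illustrated with a short example: per-example soft targets start from the given (possibly noisy) labels and are blended with the model's predictions via an exponential moving average, so the calibrated targets rather than the raw labels drive the loss. This is a minimal sketch under stated assumptions, not the paper's exact implementation; the function name, the buffer layout, and the momentum hyperparameter `alpha` are illustrative.

```python
# Minimal sketch of prediction-calibrated training (illustrative, not verbatim
# from the paper). Assumes a classification model, an optimizer, and a buffer
# `targets_ema` of shape (N, C) initialized from the one-hot training labels.
import torch
import torch.nn.functional as F

def self_adaptive_step(model, optimizer, x, targets_ema, idx, alpha=0.9):
    """One training step that refines soft targets with model predictions.

    targets_ema: (N, C) per-example soft targets, initialized from the
                 (possibly noisy) one-hot labels.
    idx:         indices of the current mini-batch into that buffer.
    alpha:       momentum of the moving average (assumed hyperparameter).
    """
    logits = model(x)
    # Detach so the stored targets stay outside the gradient graph.
    probs = F.softmax(logits.detach(), dim=1)

    # Exponential moving average: confident, consistent model predictions
    # gradually override noisy labels in the target buffer.
    targets_ema[idx] = alpha * targets_ema[idx] + (1 - alpha) * probs

    # Cross-entropy against the calibrated soft targets.
    loss = -(targets_ema[idx] * F.log_softmax(logits, dim=1)).sum(dim=1).mean()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```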
Computational models of learning typically train on labeled input patterns (supervised learning), un...
Prior works on self-supervised pre-training focus on the joint training scenario, where massive unla...
Whilst computer vision models built using self-supervised approaches are now commonplace, some impo...
Deep neural networks achieve remarkable performances on a wide range of tasks with the aid of large-...
Recently, the surprising discovery of the Bootstrap Your Own Latent (BYOL) method by Grill et al. sho...
The recent rise in machine learning has been largely made possible by novel algorithms, such as con...
In the last decade, motivated by the success of Deep Learning, the scientific community proposed sev...
Self-supervised representation learning methods aim to provide powerful deep feature learning withou...
Training deep neural networks (DNNs) with limited supervision has been a popular research topic as i...
Due to numerous breakthroughs in real-world applications brought by machine intelligence, deep neura...
Incremental learning requires a learning model to learn new tasks without forgetting the learned tas...
The human brain has billions of neurons. However, we perform tasks using only a few concurrently act...