Which generative model is the most suitable for Continual Learning? This paper aims at evaluating and comparing generative models on disjoint sequential image generation tasks. We investigate how several models learn and forget, considering various strategies: rehearsal, regularization, generative replay and fine-tuning. We use two quantitative metrics to estimate generation quality and memory ability. We experiment with sequential tasks on three commonly used benchmarks for Continual Learning (MNIST, Fashion MNIST and CIFAR10). We find that among all models the original GAN performs best, and among Continual Learning strategies, generative replay outperforms all other methods.
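For illustration, the generative replay strategy highlighted above can be sketched as follows: after each task, a frozen copy of the generator produces samples standing in for past tasks, and these are mixed with the new task's real data before training continues. The Python/PyTorch sketch below is a minimal, hedged illustration; the network sizes, optimizers, step counts and random stand-in data are assumptions for readability, not the paper's experimental setup.

```python
# Minimal sketch of generative replay for sequential image-generation tasks.
# All shapes and hyperparameters are illustrative assumptions.
import copy
import torch
import torch.nn as nn

Z, IMG = 64, 28 * 28  # latent size and flattened 28x28 image size (assumed)

def make_generator():
    return nn.Sequential(nn.Linear(Z, 256), nn.ReLU(), nn.Linear(256, IMG), nn.Tanh())

def make_discriminator():
    return nn.Sequential(nn.Linear(IMG, 256), nn.LeakyReLU(0.2), nn.Linear(256, 1))

def train_gan(G, D, data, steps=200, batch=64):
    bce = nn.BCEWithLogitsLoss()
    g_opt = torch.optim.Adam(G.parameters(), lr=2e-4)
    d_opt = torch.optim.Adam(D.parameters(), lr=2e-4)
    for _ in range(steps):
        real = data[torch.randint(len(data), (batch,))]
        fake = G(torch.randn(batch, Z))
        # Discriminator step: separate real from generated samples.
        d_loss = (bce(D(real), torch.ones(batch, 1))
                  + bce(D(fake.detach()), torch.zeros(batch, 1)))
        d_opt.zero_grad(); d_loss.backward(); d_opt.step()
        # Generator step: fool the discriminator.
        g_loss = bce(D(fake), torch.ones(batch, 1))
        g_opt.zero_grad(); g_loss.backward(); g_opt.step()

G, D = make_generator(), make_discriminator()
for task_id in range(3):  # e.g. disjoint class groups, as in the MNIST benchmarks
    # Stand-in for the current task's real images, scaled to [-1, 1].
    new_data = torch.rand(1000, IMG) * 2 - 1
    if task_id > 0:
        # Generative replay: a frozen copy of the previous generator
        # rehearses earlier tasks alongside the new data.
        old_G = copy.deepcopy(G).eval()
        with torch.no_grad():
            replay = old_G(torch.randn(1000, Z))
        new_data = torch.cat([new_data, replay])
    train_gan(G, D, new_data)
```

The key design point this sketch captures is that no past data is stored: the previous generator itself serves as the memory, which is what distinguishes generative replay from rehearsal with a stored sample buffer.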