The performance of supervised learners depends on the availability of a relatively large labeled sample. This paper proposes an automatic, ongoing learning system that incorporates new knowledge from the experience gained when classifying new objects and, accordingly, improves the system's efficiency. We employ a stochastic rule for classification and editing, along with a condensing algorithm based on local density to forget superfluous data (and control the sample size). The effectiveness of the algorithm is evaluated experimentally on a number of data sets taken from the UCI Machine Learning Repository.
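The density-based forgetting idea in the abstract can be sketched as follows. This is a minimal illustration only, assuming a 1-NN reference set, a local-density estimate derived from the k nearest neighbours, and an illustrative threshold; the function names, the density estimate, and the threshold are assumptions for the sketch, not the paper's exact procedure.

```python
# Hypothetical sketch of density-based condensing for a 1-NN reference set.
# An instance is treated as superfluous (and forgotten) when it sits in a
# dense region whose k nearest neighbours all share its label, so removing
# it is unlikely to change nearby 1-NN decisions. Border and sparse-region
# instances are retained.
import math


def knn(points, i, k):
    """Indices of the k nearest neighbours of points[i] (excluding itself)."""
    dists = sorted(
        (math.dist(points[i][0], points[j][0]), j)
        for j in range(len(points)) if j != i
    )
    return [j for _, j in dists[:k]]


def condense(points, k=3, density_threshold=1.0):
    """points: list of (feature_tuple, label). Returns the condensed list.

    Instances are removed sequentially, so density is always re-estimated
    against the currently retained set rather than the original one.
    """
    retained = list(points)
    i = 0
    while i < len(retained) and len(retained) > k + 1:
        x, y = retained[i]
        nbrs = knn(retained, i, k)
        # local density ~ inverse of the mean distance to the k neighbours
        mean_d = sum(math.dist(x, retained[j][0]) for j in nbrs) / k
        density = 1.0 / mean_d if mean_d > 0 else float("inf")
        homogeneous = all(retained[j][1] == y for j in nbrs)
        if homogeneous and density > density_threshold:
            retained.pop(i)  # superfluous: dense, label-homogeneous region
        else:
            i += 1  # keep border / sparse-region instance
    return retained
```

On a toy set with two tight same-class clusters plus a few boundary points, this drops interior cluster instances while retaining the boundary ones, which is the intended effect of forgetting superfluous data while controlling the sample size.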
Storing and using specific instances improves the performance of several supervised learning algorit...
One of the most pervasive challenges in adopting machine or deep learning is the scarcity of trainin...
The amount of electronic information as well as the size and dimensionality of data sets have increa...
Memory-based learning algorithms lack a mechanism for tracking time-varying associative mappings. To...
The ability of artificial agents to increment their capabilities when confronted with new data is an...
Today’s systems produce a rapidly exploding amount of data, and the data further derives mo...
Catastrophic forgetting, the phenomenon of forgetting previously learned tasks when learning a new o...
This paper presents a novel machine learning algorithm with an improved accuracy and a faster learni...
We describe a density-adaptive reinforcement learning and a density-adaptive forgetting algorithm. ...
We introduce a framework for learning and recognition of concepts when the number of concepts is la...
This paper makes a contribution to the problem of incremental class learning, ...
Continual learning is a Machine Learning paradigm that studies the problem of learning from a potent...
Object detection concerns the classification and localization of objects in an image. To cope with c...