In non-batch learning systems, an index called plasticity is needed to indicate how easily an instance can be assimilated. Plasticity should capture three essential elements: what degree of modification is allowed in a learning system, how closely the learning system actually adapts, and how much learning effort is taken in response to an incoming instance. Taking these three notions into consideration, we propose a new formula to evaluate the plasticity of feedforward neural networks trained with non-batch learning. The formula is investigated against on-line backpropagation [1], adaptive learning [2,3] and Incremental Feedforward Networks (IFFN) [3] in handling both consistent and inconsistent instance...
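The abstract above does not reproduce the proposed formula itself. As a purely illustrative sketch (the function name, arguments, and the product-over-effort combination below are assumptions for illustration, not the paper's definition), a plasticity score built from the three stated elements might look like this in Python:

    def plasticity_score(allowed_modification, adaptation_closeness, learning_effort, eps=1e-12):
        # Illustrative assumption only: plasticity grows with the degree of
        # modification the system allows and with how closely it adapts to the
        # incoming instance, and shrinks with the learning effort spent on it.
        return (allowed_modification * adaptation_closeness) / (learning_effort + eps)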
In this paper, we applied the concepts of minimizing weight sensitivity cost and trainin...
Abstract — Numerical condition affects the learning speed and accuracy of most artificial neural net...
We show how a feed-forward neural network can be successfully trained by using a simulated annealing ...
Plasticity is the ability to adapt, the ability to shape a model. In order to better understand how ...
One aim shared by multiple settings, such as continual learning or transfer learning, is to leverage...
Plastic neural networks have the ability to adapt to new tasks. However, in a continual learning set...
It has often been noted that the learning problem in feed-forward neural networks is very badly cond...
The goal of data mining is to solve various problems dealing with knowledge extraction from huge amo...
Abstract — We have recently proposed a novel neural network structure called an “Affordable Neural N...
Constructive learning algorithms are an efficient way to train feedforward neural networks. Some of ...
This paper investigates the functional invariance of neural network learning methods incorpo...
Learning the gradient of a neuron's activity function, like the weights of links, causes a new specificat...
Plasticity, the ability of a neural network to quickly change its predictions in response to new inf...
The search for biologically faithful synaptic plasticity rules has resulted in a large body of model...
A major goal of bio-inspired artificial intelligence is to design artificial neural networks with ab...