We investigate the use of information from all second order derivatives of the error function to perform network pruning (i.e., removing unimportant weights from a trained network) in order to improve generalization, simplify networks, reduce hardware or storage requirements, increase the speed of further training, and in some cases enable rule extraction. Our method, Optimal Brain Surgeon (OBS), is significantly better than magnitude-based methods and Optimal Brain Damage [Le Cun, Denker and Solla, 1990], which often remove the wrong weights. OBS permits the pruning of more weights than other methods (for the same error on the training set), and thus yields better generalization on test data. Crucial to OBS is a recursion relation for ...
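For reference, the two quantities at the heart of OBS are the saliency of each weight (the predicted increase in error if that weight alone is deleted) and the compensating update applied to all remaining weights after a deletion. The following minimal sketch, in Python with NumPy, assumes the inverse Hessian has already been computed (e.g., via the recursion the abstract alludes to) and that the network sits at a local minimum where the Hessian is positive definite; the function name and interface are illustrative, not from the paper.

```python
import numpy as np

def obs_prune_step(w, H_inv):
    """One Optimal Brain Surgeon step: pick the weight whose removal
    increases the locally quadratic error estimate least, delete it,
    and adjust the remaining weights to compensate.

    w     : (n,) current weight vector
    H_inv : (n, n) inverse Hessian of the error w.r.t. the weights
    Returns the pruned index and the updated weight vector.
    """
    # Saliency of weight q: L_q = w_q^2 / (2 [H^{-1}]_qq)
    diag = np.diag(H_inv)
    saliency = w ** 2 / (2.0 * diag)
    q = int(np.argmin(saliency))

    # Optimal compensation: dw = -(w_q / [H^{-1}]_qq) * H^{-1} e_q,
    # which simultaneously zeroes w_q and re-adjusts every other weight.
    w_new = w - (w[q] / H_inv[q, q]) * H_inv[:, q]
    w_new[q] = 0.0  # enforce exact removal despite floating-point rounding
    return q, w_new
```

Each call removes a single weight; in the full procedure the inverse Hessian is then updated and pruning repeats until the smallest saliency, i.e., the predicted error increase, exceeds an acceptable tolerance.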
Artificial neural networks (ANN) are well known for their good classification abilities. Recent adva...
The optimal brain surgeon (OBS) pruning procedure for automatic selection of the optimal neural netw...
A critical question in neural network research today concerns how many hidden neurons to use. Th...
We investigate the use of information from all second order derivatives of the error function to per...
The use of information from all second-order derivatives of the error function to perform network pr...
We extend Optimal Brain Surgeon (OBS) - a second-order method for pruning networks - to allow for ge...
Neural networks tend to achieve better accuracy with training if they are larger -- even if the resu...
How to develop slim and accurate deep neural networks has become crucial for real-world application...
This paper presents t...
Choosing a proper neural network architecture is a problem of great practical importance. Smaller mo...
Backpropagation (BP) Neural Network (NN) error functions enable the mapping of data vectors to user-...
Choosing a suitable topology for a neural network, given an application, is a difficult problem. Usu...
Reducing a neural network's complexity improves the ability of the network to be applied to futur...
Training algorithms for Multilayer Perceptrons optimize the set of W weights and biase...
The determination of ...