Most existing pruning works are resource-intensive, requiring retraining or fine-tuning of the pruned models to recover accuracy. We propose a retraining-free pruning method based on hyperspherical learning and loss penalty terms. The proposed loss penalty term pushes some of the model weights far from zero, while the remaining weight values are pushed near zero and can be safely pruned with no need for retraining and a negligible accuracy drop. In addition, our proposed method can instantly recover the accuracy of a pruned model by replacing the pruned values with their mean value. Our method achieves state-of-the-art results in retraining-free pruning and is evaluated on ResNet-18/50 and MobileNetV2 on the ImageNet dataset. One can easily get a 50% pru...
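To make the abstract's mechanism concrete, below is a minimal PyTorch-style sketch (our own illustration under stated assumptions, not the authors' released code). It pairs a hypothetical bimodal penalty, which pulls each weight toward either zero or a large target magnitude, with a magnitude-pruning routine that substitutes the mean of the pruned values, as the abstract describes. The function names, the target value, and the exact penalty form are all assumptions.

    import torch

    def bimodal_penalty(weight: torch.Tensor, target: float = 1.0) -> torch.Tensor:
        # Assumed penalty form: pull each weight toward whichever pole
        # (0 or `target`) is closer, encouraging a prunable near-zero
        # group and a far-from-zero group.
        w = weight.abs().flatten()
        return torch.minimum(w ** 2, (w - target) ** 2).mean()

    def prune_with_mean_recovery(weight: torch.Tensor, sparsity: float = 0.5) -> torch.Tensor:
        # Remove the smallest-magnitude fraction of weights, then replace
        # the removed entries with their mean value (the "instant
        # recovery" idea from the abstract, sketched).
        k = max(1, int(weight.numel() * sparsity))
        threshold = weight.abs().flatten().kthvalue(k).values
        keep = weight.abs() > threshold
        return torch.where(keep, weight, weight[~keep].mean())

During training, the penalty would simply be added to the task loss, e.g. loss = task_loss + lam * sum(bimodal_penalty(p) for p in model.parameters()), so that after convergence the near-zero group can be pruned without retraining.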
The training process of a neural network is the most time-consuming procedure ...
Modern deep neural networks require a significant amount of computing time and power to train and de...
Image restoration tasks have witnessed great performance improvement in recent years by developing l...
Neural network performance has been significantly improved in the last few years, at the cost of an...
When deploying pre-trained neural network models in real-world applications, model consumers often e...
Network pruning is an important research field aiming at reducing computational costs of neural netw...
Large-scale pre-trained models have been remarkably successful in resolving downstream tasks. Noneth...
Convolutional Neural Networks (CNNs) have a large number of parameters and take significantly large ...
Introducing sparsity in a neural network has been an efficient way to reduce its complexity while ke...
Channel pruning is effective in compressing pretrained CNNs for deployment on low-end edge...
Works on the lottery ticket hypothesis (LTH) and single-shot network pruning (SNIP) have raised a lot of...
In recent years, Artificial Neural Networks (ANNs) pruning has become the focal point of many res...
In recent years, deep neural networks have achieved remarkable results in various artificial intelli...
DNNs are highly memory- and compute-intensive, which makes them infeasible to depl...
In recent years, neural networks have regained popularity in a variety of fields such as image recog...