We study the impact of different pruning techniques on the representations learned by deep neural networks trained with contrastive loss functions. We find that at high sparsity levels, contrastive learning yields a higher number of misclassified examples than models trained with traditional cross-entropy loss. To understand this pronounced difference, we use metrics such as the number of PIEs (Hooker et al., 2019), Q-Score (Kalibhat et al., 2022), and PD-Score (Baldock et al., 2021) to measure the impact of pruning on the quality of the learned representation. Our analysis suggests that the schedule of the pruning method matters. We find that the negative impact of sparsity on the quality of the learned representation i...
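The pruning techniques compared in work like the above typically include one-shot magnitude pruning as a baseline. As a point of reference only, a minimal sketch of magnitude pruning on a single weight matrix might look like the following (the function name and setup are hypothetical, not the implementation from any cited paper):

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude entries of a weight matrix.

    `sparsity` is the fraction of weights to remove, e.g. 0.9 keeps
    only the largest 10% of entries by absolute value.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))
pruned = magnitude_prune(w, 0.9)
print(np.mean(pruned == 0))  # roughly 0.9 of the entries are now zero
```

Iterative pruning schedules apply a step like this repeatedly during training, gradually raising `sparsity`, which is one of the scheduling choices the abstract above suggests matters for representation quality.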
Self-supervised Contrastive Learning (CL) has been recently shown to be very effective in preventing...
Deep Convolutional Neural Networks (CNNs) have been widely used in image recognition, while models of ...
Contrastive learning aims to extract distinctive features from data by finding an embedding represen...
The growing energy and performance costs of deep learning have driven the community to reduce the si...
Deep networks are typically trained with many more parameters than the size of the training dataset....
Pruning is an efficient method for deep neural network model compression and acceleration. However, ...
We investigate filter level sparsity that emerges in convolutional neural networks (CNNs) which empl...
Cross entropy loss has served as the main objective function for classification-based tasks. Widely ...
Works on lottery ticket hypothesis (LTH) and single-shot network pruning (SNIP) have raised a lot of...
It is widely believed that the success of deep networks lies in their ability to learn a meaningful ...
Introduced in the late 1980s for generalization purposes, pruning has now beco...
Structural neural network pruning aims to remove the redundant channels in the deep convolutional ne...
Deep learning is finding its way into the embedded world with applications such as autonomous drivin...
Pruning is a method of compressing the size of a neural network model, which affects the accuracy an...