Sparsity is commonly produced by model compression (i.e., pruning), which eliminates unnecessary parameters. Beyond improved resource efficiency, sparsity also serves as an important tool for modeling the underlying low dimensionality of neural networks and for understanding their generalization, optimization dynamics, implicit regularization, expressivity, and robustness. Meanwhile, appropriate sparsity-aware priors help deep neural networks achieve significantly enhanced performance in both algorithms and systems. This dissertation studies sparsity from two intertwined perspectives: (i) efficient and reliable sparsity and (ii) sparsity for science. In the first part of this thesis (chapters 2 and 3), a few efforts devoted to improv...
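As a minimal sketch of how pruning produces the kind of sparsity described above (not the dissertation's specific method), the example below applies global magnitude pruning to a toy PyTorch MLP; the layer sizes and the 80% sparsity target are illustrative assumptions.

```python
# Minimal sketch: global magnitude pruning on a toy MLP.
# Layer sizes and the 80% sparsity target are illustrative assumptions.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
sparsity = 0.80  # assumed target: zero out the 80% smallest-magnitude weights

# Collect all weight magnitudes and compute a single global threshold.
all_weights = torch.cat([p.detach().abs().flatten()
                         for n, p in model.named_parameters() if n.endswith("weight")])
threshold = torch.quantile(all_weights, sparsity)

# Zero out weights below the threshold (unstructured sparsity).
with torch.no_grad():
    for name, p in model.named_parameters():
        if name.endswith("weight"):
            p.mul_((p.abs() > threshold).float())

# Report the achieved weight sparsity.
zeros = sum((p == 0).sum().item()
            for n, p in model.named_parameters() if n.endswith("weight"))
total = sum(p.numel()
            for n, p in model.named_parameters() if n.endswith("weight"))
print(f"weight sparsity: {zeros / total:.2%}")
```

In practice the pruned weights are usually held at zero with a mask and the remaining weights are fine-tuned; the sketch only shows the thresholding step that creates the sparse structure.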
"Sparse" neural networks, in which relatively few neurons or connections are active, are common in b...
In trained deep neural networks, unstructured pruning can reduce redundant weights to lower storage ...
The rapid development of modern information technology has significantly facilitated the generation,...
The growing energy and performance costs of deep learning have driven the community to reduce the si...
Deep learning has been empirically successful in recent years thanks to the extremely over-parameter...
Sparsity plays a key role in machine learning for several reasons including interpretability. Interp...
Hardware accelerators for neural network inference can exploit common data properties for performanc...
Arguably one of the most notable forms of the principle of parsimony was formulated by the philosoph...
Deep learning is finding its way into the embedded world with applications such as autonomous drivin...
Deep neural nets (DNNs) compression is crucial for adaptation to mobile devices. Though many success...
Sparse representation plays a critical role in vision problems, including generation and understandi...
The articles in this special section focus on learning adaptive models. Over the past few years, spa...
The ever-increasing number of parameters in deep neural networks poses challenges for memory-limited...
Efficient machine learning implementations optimized for inference in hardware have wide-ranging ben...
Modern machine learning techniques take advantage of the exponentially rising computational power in n...
"Sparse" neural networks, in which relatively few neurons or connections are active, are common in b...
In trained deep neural networks, unstructured pruning can reduce redundant weights to lower storage ...
The rapid development of modern information technology has significantly facilitated the generation,...
The growing energy and performance costs of deep learning have driven the community to reduce the si...
Deep learning has been empirically successful in recent years thanks to the extremely over-parameter...
Sparsity plays a key role in machine learning for several reasons including interpretability. Interp...
Hardware accelerators for neural network inference can exploit common data properties for performanc...
Arguably one of the most notable forms of the principle of parsimony was formulated by the philosoph...
Deep learning is finding its way into the embedded world with applications such as autonomous drivin...
Deep neural nets (DNNs) compression is crucial for adaptation to mobile devices. Though many success...
Sparse representation plays a critical role in vision problems, including generation and understandi...
The articles in this special section focus on learning adaptive models. Over the past few years, spa...
The ever-increasing number of parameters in deep neural networks poses challenges for memory-limited...
Efficient machine learning implementations optimized for inference in hardware have wide-ranging ben...
Modern Machine learning techniques take advantage of the exponentially rising calculation power in n...
"Sparse" neural networks, in which relatively few neurons or connections are active, are common in b...
In trained deep neural networks, unstructured pruning can reduce redundant weights to lower storage ...
The rapid development of modern information technology has significantly facilitated the generation,...