Artificial neural networks (ANNs) have become a central topic in the research community. Despite their success, modern ANNs are challenging to train and deploy on commodity hardware due to ever-increasing model sizes and the unprecedented growth in data volumes. Microarray data in particular, with its very high dimensionality and small number of samples, is difficult for machine learning techniques to handle. Furthermore, specialized hardware such as graphics processing units (GPUs) is expensive. Sparse neural networks are a leading approach to these challenges. However, off-the-shelf sparsity-inducing techniques either start from a pretrained model or enforce the sparse structure via binary masks. The training...
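To make "enforce the sparse structure via binary masks" concrete, here is a minimal NumPy sketch (an illustration, not code from any of the works summarized here): a fixed 0/1 mask decides which connections exist, and the same mask is applied to the weights in the forward pass and to the gradients during training, so pruned connections never receive updates.

    import numpy as np

    rng = np.random.default_rng(0)

    # Dense weights of one fully connected layer (sizes are arbitrary here).
    W = rng.normal(size=(256, 100))

    # A fixed binary mask keeps ~10% of the connections; the mask, not the
    # weight values, defines which connections exist.
    mask = (rng.random(W.shape) < 0.1).astype(W.dtype)

    def forward(x, W, mask):
        # Only the unmasked weights contribute to the layer's output.
        return x @ (W * mask).T

    x = rng.normal(size=(32, 100))   # a batch of 32 inputs
    y = forward(x, W, mask)          # shape (32, 256)

    # During training the same mask is applied to the gradient, so weights
    # that were pruned away can never "grow back".
    grad_W = rng.normal(size=W.shape)   # stand-in for a real gradient
    grad_W *= mask

Note that the full dense weight matrix and gradient are still stored and multiplied here, which is one reason mask-based approaches do not by themselves reduce memory or compute.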
Over the past few years, deep neural networks have been at the center of attention in machine learn...
A fundamental task for artificial intelligence is learning. Deep Neural Networks have proven to cope...
Overparameterized neural networks generalize well but are expensive to train. Ideally, one would lik...
Recently, sparse training methods have started to be established as a de facto approach for training...
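As an illustration of what such sparse training methods typically do, below is a minimal NumPy sketch of one prune-and-regrow connectivity update in the style of Sparse Evolutionary Training (SET); the layer sizes, pruning fraction, and regrowth rule are assumptions chosen for the example, not taken from any particular paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def prune_and_regrow(W, mask, frac=0.2):
        # One SET-style update: drop the weakest active connections and
        # regrow the same number at random previously-empty positions.
        active = np.flatnonzero(mask != 0)       # existing connections
        inactive = np.flatnonzero(mask == 0)     # candidates for new connections
        k = int(frac * active.size)
        if k == 0:
            return W, mask

        # Prune: remove the k active connections with the smallest magnitude.
        weakest = active[np.argsort(np.abs(W.ravel()[active]))[:k]]
        mask.ravel()[weakest] = 0
        W.ravel()[weakest] = 0.0

        # Regrow: add k new connections with small fresh initial values.
        reborn = rng.choice(inactive, size=k, replace=False)
        mask.ravel()[reborn] = 1
        W.ravel()[reborn] = rng.normal(scale=0.01, size=k)
        return W, mask

    # Start from a randomly sparsified layer (~90% of weights removed)
    # and apply one connectivity update between training epochs.
    W = rng.normal(size=(256, 100))
    mask = (rng.random(W.shape) < 0.1).astype(W.dtype)
    W *= mask
    W, mask = prune_and_regrow(W, mask)

The overall number of connections stays constant across updates, which is what lets such methods train a sparse network from scratch rather than pruning a dense one.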
Through the success of deep learning in various domains, artificial neural networks are cur...
Large neural networks are very successful in various tasks. However, with limited data, the generali...
Sparse neural networks have been widely applied to reduce the necessary resource requirements to tra...
Graph neural networks have become increasingly popular in recent years due to their ability to natur...
Sparse neural networks attract increasing interest as they exhibit comparable performance to their d...
This thesis presents a few methods to accelerate the inference of Deep Neural Networks that are lar...
The growing energy and performance costs of deep learning have driven the community to reduce the si...