Recent years have seen a paradigm shift towards multi-task learning. This calls for memory- and energy-efficient solutions for inference in a multi-task scenario. We propose an algorithm-hardware co-design approach called MIME. MIME reuses the weight parameters of a trained parent task and learns task-specific threshold parameters for inference on multiple child tasks. We find that MIME results in highly memory-efficient DRAM storage of neural-network parameters for multiple tasks compared to conventional multi-task inference. In addition, MIME results in input-dependent dynamic neuronal pruning, thereby enabling energy-efficient inference with higher throughput on systolic-array hardware. Our experiments with benchmark datasets (child tas...
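The core mechanism described above — one set of shared parent-task weights reused across child tasks, with only a per-task threshold vector learned per task — can be illustrated with a minimal sketch. This is not the authors' implementation; all names, shapes, and threshold values below are illustrative assumptions.

```python
import numpy as np

# Sketch of MIME-style inference: a single shared (parent-task) weight
# matrix plus small per-child-task threshold vectors. Only the thresholds
# differ between tasks, so per-task storage is tiny compared to storing
# full weight sets for every task.
rng = np.random.default_rng(0)

W = rng.standard_normal((4, 8))          # shared parent-task weights (4 in, 8 out)
task_thresholds = {
    "task_a": np.full(8, 0.0),           # per-task thresholds (fixed here for
    "task_b": np.full(8, 0.5),           # illustration; learned in the paper)
}

def mime_layer(x, task):
    """Shared-weight layer with a task-specific thresholded activation."""
    z = x @ W
    t = task_thresholds[task]
    # A neuron fires only if its pre-activation exceeds the task threshold;
    # otherwise it is pruned (outputs 0) for this input. This is the
    # input-dependent dynamic neuronal pruning the abstract refers to.
    return np.where(z > t, z, 0.0)

x = rng.standard_normal(4)
out_a = mime_layer(x, "task_a")
out_b = mime_layer(x, "task_b")

# A higher threshold can only prune more neurons, never fewer.
print(int((out_a != 0).sum()), int((out_b != 0).sum()))
```

On hardware, the zeroed neurons translate into skipped multiply-accumulate work, which is the source of the energy and throughput gains claimed for the systolic array.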
Cognitive tasks are essential for the modern applications of electronics, and...
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy cons...
The increasing scale of neural networks and their growing application space have produced demand for...
Deep Neural Networks (DNNs) have achieved great success in a massive number of artificial intelligen...
In recent years, deep neural networks (DNNs) have revolutionized the field of machine learning. DNNs...
Convolutional neural networks (CNNs) have become ubiquitous algorithms with growing applications in ...
Spiking Neural Networks (SNNs) are bio-plausible models that hold great potential for realizing ener...
Artificial Neural Networks (ANNs) are prevalent machine learning models that are applied across vari...
In recent years, artificial intelligence has reached significant milestones wi...
Our work seeks to improve and adapt computing systems and machine learning (ML) algorithms to match ...
In recent years, machine learning has been a prominent talking point, and is considered by...
Biological agents do not have infinite resources to learn new things. For this reason, a central asp...
The energy efficiency of neuromorphic hardware is greatly affected by the energy of storing, accessi...
On metrics of density and power efficiency, neuromorphic technologies have the potential to surpass ...
Deep neural networks (DNNs) have successfully been applied in many fields in the past decades. Howev...