As a third-generation neural network, the Spiking Neural Network (SNN) offers low power consumption and high energy efficiency, making it well suited for deployment on edge devices. Recently, the state-of-the-art SNN, Spikformer, combines the self-attention module from the Transformer with an SNN to achieve remarkable performance. However, it adopts larger channel dimensions in its MLP layers, leading to an increased number of redundant model parameters. To effectively reduce the model's computational complexity and weight parameters, we explore the Lottery Ticket Hypothesis (LTH) and discover a very sparse ($\ge$90%) subnetwork that achieves performance comparable to the original network. Furthermore, we also design a light...
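The core of a lottery-ticket search is magnitude pruning followed by weight rewinding: keep only the largest-magnitude weights of a trained network, then reset the surviving weights to their initial values. A minimal one-shot sketch of this idea (the paper's actual procedure is not specified here and LTH is usually applied iteratively; all function names below are illustrative):

```python
import numpy as np

def magnitude_mask(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Binary mask keeping only the largest-magnitude weights.

    A fraction `sparsity` of the entries (the smallest in absolute
    value) is zeroed out, as in lottery-ticket-style pruning.
    """
    k = int(round(sparsity * weights.size))
    if k == 0:
        return np.ones_like(weights, dtype=bool)
    # Threshold at the k-th smallest absolute value.
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    return np.abs(weights) > threshold

def lottery_ticket(init_weights, trained_weights, sparsity):
    """Rewind: apply the mask found on the *trained* weights
    back to the weights at initialization."""
    mask = magnitude_mask(trained_weights, sparsity)
    return init_weights * mask, mask

# Example: prune 90% of a random weight matrix.
rng = np.random.default_rng(0)
w0 = rng.normal(size=(64, 64))             # weights at initialization
wt = w0 + 0.1 * rng.normal(size=w0.shape)  # stand-in for trained weights
ticket, mask = lottery_ticket(w0, wt, 0.9)
print(f"kept {mask.mean():.1%} of weights")
```

In the full LTH procedure, the resulting sparse "ticket" is retrained from the rewound initialization, and the prune-rewind-retrain cycle is repeated until the target sparsity (here, $\ge$90%) is reached.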
Spiking Neural Networks (SNNs) provide an energy-efficient deep learning option due to their unique ...
The growing energy and performance costs of deep learning have driven the community to reduce the si...
"Sparse" neural networks, in which relatively few neurons or connections are active, are common in b...
We consider two biologically plausible structures, the Spiking Neural Network (SNN) and the self-att...
Over the past few years, Spiking Neural Networks (SNNs) have become popular as a possible pathway to...
Spiking neural networks (SNNs) are promising alternatives to artificial neural networks (ANNs) since...
Spiking neural networks (SNNs) have received substantial attention in recent years due to their spar...
Advancements in computer vision research have put transformer architecture as the state of the art i...
Spiking neural networks (SNNs) have manifested remarkable advantages in power consumption and event-...
Recently, Vision Transformer (ViT) has continuously established new milestones in the computer visio...
Hardware implementation of deep neural networks is gaining significant importance nowadays...
In the last few years, spiking neural networks (SNNs) have been demonstrated to perform on par with ...
The training of sparse neural networks is becoming an increasingly important tool for reducing the ...
Sparsifying the Transformer has garnered considerable interest, as training the Transformer is very ...
Spiking neural networks (SNNs) have made great progress on both performance and efficiency over the ...