Abstract
Spiking neural networks (SNNs) have attracted worldwide attention owing to their compelling advantages in low power consumption, high biological plausibility, and strong robustness. However, the intrinsic latency of SNN inference poses a significant challenge that impedes their further development and application. This latency arises because spiking neurons must accumulate electrical stimuli and generate spikes only when their membrane potential exceeds a firing threshold. Given that the firing threshold plays a crucial role in SNN performance, this article proposes a self-driven adaptive threshold plasticity (SATP) mechanism, in which neurons autonomously adjust their firing thresholds based on their individual state information using unsupervised learning rules, with each adjustment triggered by the neuron's own firing events. SATP is based on the principle of maximizing the information contained in the output spike rate distribution of each neuron. This article derives the mathematical expression of SATP and provides extensive experimental results demonstrating that SATP effectively reduces SNN inference latency and computation density while improving accuracy, yielding SNN models with low latency, sparse computation, and high accuracy.
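To make the mechanism concrete, the following is a minimal sketch of a leaky integrate-and-fire neuron whose threshold adjusts itself on each firing event. This is an illustration of the general idea only, not the SATP update rule derived in the article (which maximizes the information in the output spike rate distribution); the increment `alpha` and decay factor `beta` are hypothetical parameters chosen for demonstration.

```python
def simulate_adaptive_lif(inputs, tau_m=10.0, v_th0=1.0, alpha=0.05, beta=0.99):
    """Leaky integrate-and-fire neuron with a self-driven adaptive threshold.

    Illustrative only -- NOT the SATP rule from the article. On each firing
    event the threshold rises by `alpha` (self-driven adjustment); between
    spikes it decays geometrically (factor `beta`) back toward baseline v_th0.
    Returns the binary spike train and the final threshold value.
    """
    v, v_th = 0.0, v_th0
    spikes = []
    for x in inputs:
        v = v * (1.0 - 1.0 / tau_m) + x        # leaky membrane integration
        if v >= v_th:                           # firing event
            spikes.append(1)
            v = 0.0                             # hard reset of membrane potential
            v_th += alpha                       # threshold adjustment triggered by the spike
        else:
            spikes.append(0)
            v_th = v_th0 + beta * (v_th - v_th0)  # relax toward baseline
    return spikes, v_th
```

Under a constant drive, the threshold climbs as the neuron fires and relaxes when it is silent, so the firing rate self-regulates; the article's derivation replaces this heuristic update with one obtained from an information-maximization objective.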
| Original language | English |
| --- | --- |
| Pages (from-to) | 1-12 |
| Number of pages | 12 |
| Journal | IEEE Transactions on Neural Networks and Learning Systems |
| DOIs | |
| Publication status | Published - Aug 2023 |
Keywords
- Adaptive systems
- Biological neural networks
- Firing
- Low latency communication
- Low latency inference
- Membrane potentials
- neuronal firing threshold
- Neurons
- self-driven adaptive threshold plasticity (SATP)
- sparse computing
- spiking neural network (SNN)
- Training
ASJC Scopus subject areas
- Software
- Computer Science Applications
- Computer Networks and Communications
- Artificial Intelligence