LC-TTFS: Towards Lossless Network Conversion for Spiking Neural Networks with TTFS Coding

Jibin Wu, Kay Chen Tan, Haizhou Li, Qu Yang, Malu Zhang

Research output: Journal article (peer-reviewed, academic research)

Abstract

Biological neurons use precise spike times, in addition to spike firing rates, to communicate with each other. Time-to-first-spike (TTFS) coding is inspired by this biological observation. However, effective solutions for training TTFS-based spiking neural networks (SNNs) are still lacking. In this paper, we put forward a simple yet effective network conversion algorithm, referred to as LC-TTFS, which addresses two main problems that hinder an effective conversion from a high-performance artificial neural network (ANN) to a TTFS-based SNN. We show that our algorithm achieves a near-perfect mapping between the activation values of an ANN and the spike times of the converted SNN on a number of challenging AI tasks, including image classification, image reconstruction, and speech enhancement. With TTFS coding, the converted SNNs achieve up to orders-of-magnitude savings in computation over ANNs and rate-based SNNs. The study therefore paves the way for deploying ultra-low-power TTFS-based SNNs on power-constrained edge computing platforms.
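To make the coding scheme concrete, below is a minimal, illustrative Python sketch of a generic TTFS encoding in which larger normalized activations fire earlier. The linear mapping, the `t_max` parameter, and the function names `ttfs_encode`/`ttfs_decode` are assumptions chosen for illustration; this is not the paper's LC-TTFS conversion algorithm.

```python
import numpy as np

# Illustrative sketch only: a common TTFS convention maps larger (normalized)
# activation values to earlier first-spike times. The linear mapping and
# t_max are assumptions, not the LC-TTFS algorithm from the paper.

def ttfs_encode(activations: np.ndarray, t_max: float = 1.0) -> np.ndarray:
    """Map ANN activations in [0, 1] to first-spike times in [0, t_max].

    Larger activations fire earlier; zero activations never fire (np.inf).
    """
    a = np.clip(activations, 0.0, 1.0)
    return np.where(a > 0.0, t_max * (1.0 - a), np.inf)

def ttfs_decode(spike_times: np.ndarray, t_max: float = 1.0) -> np.ndarray:
    """Invert the encoding: recover approximate activations from spike times."""
    return np.where(np.isfinite(spike_times), 1.0 - spike_times / t_max, 0.0)

if __name__ == "__main__":
    acts = np.array([0.9, 0.5, 0.0, 0.25])
    times = ttfs_encode(acts)
    print("spike times:", times)            # earlier spikes for larger activations
    print("recovered :", ttfs_decode(times))
```

Under this kind of mapping, each neuron emits at most one spike per inference, which is the source of the computational savings the abstract refers to relative to rate-based SNNs.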

Original language: English
Pages (from-to): 1-14
Number of pages: 14
Journal: IEEE Transactions on Cognitive and Developmental Systems
DOIs
Publication status: Accepted/In press - 2023

Keywords

  • ANN-to-SNN Conversion
  • Artificial neural networks
  • Biological neural networks
  • Computational modeling
  • Deep Spiking Neural Network
  • Encoding
  • Firing
  • Image Classification
  • Image Reconstruction
  • Neurons
  • Speech Enhancement
  • Task analysis
  • Time-to-first-spike Coding

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
