A Hybrid Neural Coding Approach for Pattern Recognition With Spiking Neural Networks

Research output: Journal article, peer-reviewed

Abstract

Recently, brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks. However, these SNNs are built on homogeneous neurons that use a uniform neural coding scheme for information representation. Given that each neural coding scheme has its own merits and drawbacks, such SNNs struggle to achieve optimal performance across accuracy, response time, efficiency, and robustness, all of which are crucial for practical applications. In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes. As an initial exploration in this direction, we propose a hybrid neural coding and learning framework, which encompasses a neural coding zoo with diverse neural coding schemes discovered in neuroscience. Additionally, it incorporates a flexible neural coding assignment strategy to accommodate task-specific requirements, along with novel layer-wise learning methods to effectively implement hybrid-coding SNNs. We demonstrate the superiority of the proposed framework on image classification and sound localization tasks. Specifically, the proposed hybrid-coding SNNs achieve accuracy comparable to state-of-the-art SNNs, while exhibiting significantly reduced inference latency and energy consumption, as well as high noise robustness. This study yields valuable insights into hybrid neural coding designs, paving the way for developing high-performance neuromorphic systems.
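For context, the sketch below illustrates two neural coding schemes commonly used in SNNs, rate coding and latency (time-to-first-spike) coding, whose complementary trade-offs (accuracy vs. latency and spike count) motivate a hybrid design. This is a minimal illustrative example, not the paper's implementation; the function names and parameters are hypothetical.

import numpy as np

def rate_encode(x, n_steps=20, rng=None):
    """Rate coding: input intensity x in [0, 1] sets the per-step spike
    probability, so stronger inputs emit more spikes over the window."""
    rng = np.random.default_rng() if rng is None else rng
    return (rng.random((n_steps,) + x.shape) < x).astype(np.uint8)

def latency_encode(x, n_steps=20):
    """Latency coding: stronger inputs fire earlier; each input emits at
    most one spike in the window, which keeps spike counts (and energy) low."""
    t_fire = np.round((1.0 - np.clip(x, 0.0, 1.0)) * (n_steps - 1)).astype(int)
    spikes = np.zeros((n_steps,) + x.shape, dtype=np.uint8)
    np.put_along_axis(spikes, t_fire[None, ...], 1, axis=0)
    return spikes

# Encode the same toy input with the two schemes and compare their behavior.
x = np.array([0.1, 0.5, 0.9])
rate_spikes = rate_encode(x)        # shape (20, 3): ~2, ~10, ~18 spikes
latency_spikes = latency_encode(x)  # shape (20, 3): one spike each, 0.9 fires first
print("rate spike counts:", rate_spikes.sum(axis=0))
print("first-spike times:", latency_spikes.argmax(axis=0))

In a hybrid-coding SNN of the kind the abstract describes, different layers (or input/output stages) could be assigned different schemes of this sort according to task requirements, trading spike count, latency, and robustness against one another.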
Original language: English
Pages (from-to): 3064-3078
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Publication status: Published - 6 Dec 2023
