Computational Efficiency of a Modular Reservoir Network for Image Recognition

Research output: Contribution to journal › Article › peer-review

Abstract

The liquid state machine (LSM) is a type of recurrent spiking network with strong ties to neurophysiology, and it has achieved great success in time-series processing. However, the computational cost of simulation and the complex, time-dependent dynamics limit the size and functionality of LSMs. This paper presents a large-scale bio-inspired LSM with modular topology. We integrate findings on the visual cortex showing that specifically designed input synapses can reproduce the activation of the real cortex and perform the Hough transform, a feature-extraction algorithm used in digital image processing, at no additional cost. We experimentally verify that this combination significantly improves network functionality. Network performance is evaluated on the MNIST dataset, where the image data are encoded into spike trains by Poisson coding. We show that the proposed structure not only significantly reduces computational complexity but also achieves higher performance than previously reported networks of a similar size. We also show that the proposed structure is more robust against system damage than small-world and random structures. We believe that the proposed computationally efficient method can greatly contribute to future applications of reservoir computing.
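The abstract mentions that image data are encoded into spike trains by Poisson coding before being fed to the reservoir. The paper itself does not give implementation details here, but a minimal rate-based Poisson encoder for MNIST-like images can be sketched as follows; the function name, the number of time steps, and the maximum firing probability are illustrative assumptions, not the authors' parameters.

```python
import numpy as np

def poisson_encode(image, duration=100, max_rate=0.5, seed=None):
    """Encode pixel intensities in [0, 1] as Poisson spike trains.

    At each time step, every pixel fires independently with probability
    proportional to its intensity (a discrete-time approximation of a
    Poisson process). Returns a boolean array of shape
    (duration, *image.shape), where True marks a spike.
    """
    rng = np.random.default_rng(seed)
    rates = np.clip(image, 0.0, 1.0) * max_rate  # per-step spike probability
    return rng.random((duration, *image.shape)) < rates

# Example with a random 28x28 "image" standing in for an MNIST digit.
img = np.random.default_rng(0).random((28, 28))
spikes = poisson_encode(img, duration=100, seed=1)
print(spikes.shape)  # (100, 28, 28): one binary frame per time step
```

Brighter pixels fire more often, so the downstream reservoir receives an input whose time-averaged firing rate mirrors the image intensity pattern.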

Original language: English
Article number: 594337
Journal: Frontiers in Computational Neuroscience
Volume: 15
DOIs
Publication status: Published - 2021 Feb 5

Keywords

  • Hough transform
  • liquid state machine
  • pattern recognition
  • reservoir computing
  • robustness

ASJC Scopus subject areas

  • Neuroscience (miscellaneous)
  • Cellular and Molecular Neuroscience
