FPGA implementation of binarized perceptron learning hardware using CMOS invertible logic

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper introduces an FPGA implementation of learning hardware for a neural network. The proposed learning hardware is designed using CMOS invertible logic, which realizes probabilistic bidirectional (forward and backward) operations with basic CMOS logic gates. The backward operation based on CMOS invertible logic makes hardware-based learning possible because no loss function is required. As a simple case study, the proposed learning hardware trains a 25-input binarized perceptron on a simplified MNIST data set. Our FPGA implementation on the Digilent Genesys 2 achieves around 100× faster operating speed than a traditional software-based learning algorithm while maintaining the same recognition accuracy of 99%.
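To illustrate the model being trained, the following is a minimal software sketch of a 25-input binarized perceptron (weights and inputs in {-1, +1}) fitted with the classic perceptron rule on real-valued "shadow" weights. This is an assumption-laden stand-in, not the paper's method: the paper trains in hardware via the backward operation of CMOS invertible logic, with no loss function, and uses simplified MNIST rather than the toy majority-vote task below.

```python
import numpy as np

# Hedged sketch, not the paper's circuit: a 25-input binarized perceptron
# with weights and inputs constrained to {-1, +1}. Training here uses the
# conventional perceptron rule on real-valued shadow weights purely for
# illustration; the paper instead learns via invertible-logic backward
# operation in hardware.

rng = np.random.default_rng(0)

def binarize(w):
    # Project real-valued shadow weights onto {-1, +1}.
    return np.where(w >= 0, 1, -1)

def forward(x, w_bin):
    # Forward pass: +/-1 dot product thresholded to a +/-1 output.
    return 1 if x @ w_bin >= 0 else -1

# Toy stand-in for "simplified MNIST": 5x5 binary patterns labelled by
# the majority sign of their 25 pixels (odd length, so never a tie).
X = rng.choice([-1, 1], size=(200, 25))
y = np.where(X.sum(axis=1) >= 0, 1, -1)

w = rng.normal(size=25)          # real-valued shadow weights
for _ in range(30):              # update only on misclassified samples
    for xi, yi in zip(X, y):
        if forward(xi, binarize(w)) != yi:
            w += yi * xi

acc = np.mean([forward(xi, binarize(w)) == yi for xi, yi in zip(X, y)])
```

Since the majority task is linearly separable by the all-ones weight vector, the binarized weights typically converge to classify the training patterns with high accuracy.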

Original language: English
Title of host publication: 2019 26th IEEE International Conference on Electronics, Circuits and Systems, ICECS 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 115-116
Number of pages: 2
ISBN (Electronic): 9781728109961
DOIs
Publication status: Published - 2019 Nov
Event: 26th IEEE International Conference on Electronics, Circuits and Systems, ICECS 2019 - Genoa, Italy
Duration: 2019 Nov 27 - 2019 Nov 29

Publication series

Name: 2019 26th IEEE International Conference on Electronics, Circuits and Systems, ICECS 2019

Conference

Conference: 26th IEEE International Conference on Electronics, Circuits and Systems, ICECS 2019
Country: Italy
City: Genoa
Period: 19/11/27 - 19/11/29

Keywords

  • Hamiltonian
  • Spin gate
  • Stochastic computing

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Control and Optimization
  • Computer Networks and Communications
  • Hardware and Architecture

