An Approach to Stable Gradient-Descent Adaptation of Higher Order Neural Units

Ivo Bukovsky, Noriyasu Homma

Research output: Contribution to journal › Article

6 Citations (Scopus)

Abstract

Stability evaluation of the weight-update system of higher order neural units (HONUs) with polynomial aggregation of neural inputs (also known as classes of polynomial neural networks) is introduced for gradient-descent adaptation of both feedforward and recurrent HONUs. The core of the approach is the spectral radius of the weight-update system, which allows stability to be monitored and maintained at every individual adaptation step. Assuring the stability of the weight-update system at every single adaptation step naturally yields adaptation stability of the whole neural architecture as it adapts to the target data. As a by-product, the approach highlights that weight optimization of a HONU is a linear problem, so the proposed approach can be extended to any neural architecture that is linear in its adaptable parameters.
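The idea can be illustrated for any model that is linear in its weights: one gradient-descent step on the squared error can be written as w_next = (I − μ·x·xᵀ)·w + μ·d·x, so the spectral radius of the update matrix M = I − μ·x·xᵀ determines whether the step is stable. The following is a minimal, hypothetical sketch of that monitoring loop for a quadratic HONU, not the authors' exact algorithm; the function `colx`, the learning rate `mu`, and the toy target are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def colx(x):
    """Augmented input vector of a quadratic HONU (order 2):
    a constant 1, the raw inputs, and all quadratic input products."""
    terms = [1.0] + list(x)
    n = len(x)
    for i in range(n):
        for j in range(i, n):
            terms.append(x[i] * x[j])
    return np.array(terms)

mu = 0.1                                   # learning rate (assumed value)
w = np.zeros(colx(np.zeros(2)).shape[0])   # HONU weight vector
for step in range(200):
    x = rng.normal(size=2)
    d = 1.0 + 2.0 * x[0] * x[1]            # toy target representable by the HONU
    cx = colx(x)
    # One GD step reads w_next = (I - mu*cx*cx^T) w + mu*d*cx,
    # so the matrix M = I - mu*cx*cx^T governs stability of the update.
    M = np.eye(cx.size) - mu * np.outer(cx, cx)
    rho = np.max(np.abs(np.linalg.eigvals(M)))
    while rho > 1.0 + 1e-12:               # shrink mu until this step is stable
        mu *= 0.5
        M = np.eye(cx.size) - mu * np.outer(cx, cx)
        rho = np.max(np.abs(np.linalg.eigvals(M)))
    w = w + mu * (d - w @ cx) * cx         # gradient-descent weight update
```

Because the update matrix here is a rank-one perturbation of the identity, its eigenvalues are 1 and 1 − μ‖cx‖², so the stability check reduces to μ ≤ 2/‖cx‖²; monitoring the spectral radius generalizes this check to recurrent HONUs, where the update matrix is no longer rank-one.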

Original language: English
Article number: 7487017
Pages (from-to): 2022-2034
Number of pages: 13
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 28
Issue number: 9
DOIs
Publication status: Published - 2017 Sep

Keywords

  • Gradient descent (GD)
  • higher order neural unit (HONU)
  • polynomial neural network
  • spectral radius
  • stability

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence

