Abstract
This letter presents and proves the concept of bounded-input bounded-state (BIBS) stability for weight convergence of a broad family of in-parameter-linear nonlinear neural architectures (IPLNAs), as it applies generally to incremental gradient learning algorithms. The derived proofs yield a practical BIBS convergence condition that can be checked at every individual learning point or batch, making it suitable for real-time applications.
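The letter's derived BIBS condition is not reproduced on this page. As a rough illustration of the setting only, the sketch below implements incremental (per-sample) gradient learning for a model that is linear in its parameters, y = wᵀφ(x), with a hypothetical per-sample step-size check in the classical LMS-style stability range 0 < μ‖φ(x)‖² < 2. The feature map `phi`, the rate `mu`, and the check are illustrative assumptions, not the paper's condition.

```python
import numpy as np

def phi(x):
    """Example IPLNA basis: polynomial features [1, x, x^2].
    The model w @ phi(x) is nonlinear in x but linear in w."""
    return np.array([1.0, x, x**2])

def incremental_gd(xs, ys, mu=0.2, n_epochs=5):
    """Incremental (per-sample) gradient descent on squared error.

    Because the model is in-parameter-linear, each update is simply
    w <- w + mu * e * phi(x), where e is the prediction error.
    """
    w = np.zeros(3)
    for _ in range(n_epochs):
        for x, y in zip(xs, ys):
            f = phi(x)
            e = y - w @ f  # prediction error for this sample
            # Hypothetical boundedness check (stand-in for the paper's
            # per-sample BIBS condition): keep the effective step inside
            # the classical range 0 < mu * ||phi(x)||^2 < 2.
            assert mu * (f @ f) < 2.0, "step size too large for this sample"
            w = w + mu * e * f  # incremental gradient step
    return w

# Bounded inputs in [-1, 1] keep ||phi(x)||^2 <= 3, so mu = 0.2 passes
# the check and the weights remain bounded while fitting a quadratic.
rng = np.random.default_rng(0)
xs = rng.uniform(-1.0, 1.0, size=200)
ys = 1.0 + 2.0 * xs - 0.5 * xs**2
print(incremental_gd(xs, ys))
```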
| Original language | English |
| --- | --- |
| Journal | IEEE Transactions on Neural Networks and Learning Systems |
| DOIs | |
| Publication status | Accepted/In press - 2021 |
Keywords
- Bounded-input bounded-state stability (BIBS)
- Convergence
- Kernel
- Learning systems
- Loss measurement
- Mechanical engineering
- Neural networks
- Stability criteria
- Extreme learning machines
- In-parameter-linear nonlinear neural architectures (IPLNAs)
- Incremental gradient learning
- Input-to-state stability (ISS)
- Polynomial neural networks
- Random vector functional link networks
- Weight convergence
ASJC Scopus subject areas
- Software
- Computer Science Applications
- Computer Networks and Communications
- Artificial Intelligence