Abstract
If the conventional quadratic error measure is used, the learning process of the error back-propagation algorithm often becomes trapped in metastable states and suffers serious inefficiency. If the Kullback-Leibler divergence is used instead, it can be shown numerically that the most typical metastable states are removed and that the learning efficiency improves significantly as the number of hidden neurons is increased. This means that the Kullback-Leibler divergence is a superior error measure for the error back-propagation algorithm, one that scales with the redundancy of hidden neurons. This scalability is a great advantage in applications, since we can simply provide a network of sufficiently large size for a given problem without needing to know the optimal network size in advance.
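For reference, the two error measures compared above can be written in their standard forms for a network with sigmoid outputs $y_k$ and targets $t_k$; this is a hedged sketch of the conventional definitions, and the exact notation and conventions used in the paper may differ:

```latex
% Standard forms of the two error measures (assumed conventions:
% sigmoid outputs y_k in (0,1), targets t_k in [0,1]; the paper's
% exact notation may differ).
E_{\mathrm{quad}} = \frac{1}{2}\sum_{k}\bigl(y_k - t_k\bigr)^{2},
\qquad
E_{\mathrm{KL}} = \sum_{k}\left[\,t_k \ln\frac{t_k}{y_k}
  + (1 - t_k)\ln\frac{1 - t_k}{1 - y_k}\right].
```

A commonly cited property of the divergence-based measure is that, for sigmoid output units, its output-layer gradient reduces to $y_k - t_k$ and so lacks the $y_k(1-y_k)$ factor that flattens the gradient of the quadratic measure; whether this is the mechanism analyzed in the paper is not stated in the abstract.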
Original language | English
---|---
Pages (from-to) | 1091-1095
Number of pages | 5
Journal | Journal of the Korean Physical Society
Volume | 40
Issue number | 6
Publication status | Published - June 2002
ASJC Scopus subject areas
- Physics and Astronomy (all)