Learning entropy as a learning-based information concept

Ivo Bukovsky, Witold Kinsner, Noriyasu Homma

Research output: Article, peer-reviewed

7 citations (Scopus)

Abstract

Recently, a novel concept of a non-probabilistic novelty detection measure, based on a multi-scale quantification of unusually large learning efforts of machine learning systems, was introduced as learning entropy (LE). The key finding with LE is that the learning effort of a learning system is quantifiable as a novelty measure for each individually observed data point of otherwise complex dynamic systems, while model accuracy is not a necessary requirement for novelty detection. This brief paper extends the explanation of LE from an informatics viewpoint toward a cognitive (learning-based) information measure, emphasizing its distinction from Shannon's probabilistic concept of information. Fundamental derivations of learning entropy and of its practical estimations are recalled and further extended. The potentials, limitations, and thus the current challenges of LE are discussed.
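To illustrate the core idea described in the abstract, here is a minimal sketch of how per-sample learning entropy might be estimated from the weight-adaptation increments of an incrementally trained model. This is an illustrative approximation only, not the authors' exact algorithm: the function name, the sliding-window baseline, and the particular sensitivity thresholds `alphas` are all assumptions made for this example.

```python
import numpy as np

def learning_entropy(dW, alphas, window=100):
    """Sketch of a per-sample learning-entropy estimate.

    dW     : (n_samples, n_weights) array of weight-adaptation increments
    alphas : detection sensitivities, as multiples of the recent mean
             increment magnitude (multi-scale evaluation)
    Returns an (n_samples,) array in [0, 1]: the fraction of
    (weight, sensitivity) pairs flagged as unusually large learning effort.
    """
    dW = np.abs(dW)
    n_samples, n_weights = dW.shape
    le = np.zeros(n_samples)
    for k in range(n_samples):
        lo = max(0, k - window)
        # Baseline learning effort over the recent window; no baseline
        # exists for the very first sample, so nothing is flagged there.
        baseline = dW[lo:k].mean(axis=0) if k > lo else np.full(n_weights, np.inf)
        hits = sum((dW[k] > a * baseline).sum() for a in alphas)
        le[k] = hits / (len(alphas) * n_weights)
    return le
```

A data point whose arrival forces an unusually large adaptation effort (relative to the recent baseline, at several sensitivities) receives a high LE value, regardless of how accurate the underlying model is — which is the non-probabilistic, learning-based character of the measure that the paper emphasizes.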

Original language: English
Article number: 166
Journal: Entropy
Volume: 21
Issue: 2
DOI
Publication status: Published - 1 Feb 2019

ASJC Scopus subject areas

  • Physics and Astronomy (General)

