An optimal entropy estimator for discrete random variables

Motoki Shiga, Yasunari Yokota

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

This paper presents analytical formulations of the two most important estimation errors, the averaged squared bias error and the mean squared error, for the class of entropy estimators that can be expressed as a sum of single-variable functions. This class includes almost all important entropy estimators proposed to date. The paper further presents an optimal entropy estimator that minimizes the mean squared error of the estimate under the constraint that its averaged squared bias error stays below an arbitrarily chosen value. A numerical experiment demonstrates that the proposed estimator yields a lower mean squared error than conventional entropy estimators when entropy is estimated as an ensemble mean over multiple entropy estimates obtained from independent data sets. Such estimation is often used for biological signals, e.g., neural signals, because of biological fatigue and adaptation properties.
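The record does not reproduce the estimators themselves, but the class described above is conventionally written as H_hat = sum_i phi(n_i), where n_i is the count of symbol i among N samples and phi is a single-variable function. Below is a minimal Python sketch using the standard plug-in and Miller-Madow estimators as illustrative members of such a class; the toy distribution, block sizes, and function names are assumptions for illustration, not taken from the paper. It also reproduces the motivating setting of ensemble-mean estimation over independent data blocks, where averaging shrinks variance but not bias, so the squared bias that the paper constrains comes to dominate the mean squared error.

import numpy as np

def plugin_entropy(counts):
    # Plug-in (maximum-likelihood) estimate in nats. It has the
    # "sum of single-variable functions" form H = sum_i phi(n_i)
    # with phi(n) = -(n/N) * log(n/N) and phi(0) = 0.
    counts = np.asarray(counts, dtype=float)
    n_total = counts.sum()
    p = counts[counts > 0] / n_total
    return -np.sum(p * np.log(p))

def miller_madow_entropy(counts):
    # Miller-Madow bias correction: add (M_observed - 1) / (2N),
    # where M_observed is the number of symbols actually seen.
    counts = np.asarray(counts, dtype=float)
    correction = (np.count_nonzero(counts) - 1) / (2.0 * counts.sum())
    return plugin_entropy(counts) + correction

# Ensemble-mean estimation over short independent data blocks, the
# setting the abstract describes (e.g., repeated neural recordings).
rng = np.random.default_rng(0)
p_true = np.array([0.5, 0.25, 0.125, 0.125])   # assumed toy distribution
h_true = -np.sum(p_true * np.log(p_true))

n_blocks, n_per_block = 200, 50
est_ml, est_mm = [], []
for _ in range(n_blocks):
    data = rng.choice(len(p_true), size=n_per_block, p=p_true)
    counts = np.bincount(data, minlength=len(p_true))
    est_ml.append(plugin_entropy(counts))
    est_mm.append(miller_madow_entropy(counts))

# Averaging over blocks shrinks variance but leaves bias untouched,
# so the per-block bias carries over to the ensemble mean.
print(f"true entropy         : {h_true:.4f} nats")
print(f"plug-in ensemble     : {np.mean(est_ml):.4f} (bias {np.mean(est_ml) - h_true:+.4f})")
print(f"Miller-Madow ensemble: {np.mean(est_mm):.4f} (bias {np.mean(est_mm) - h_true:+.4f})")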

Original language: English
Title of host publication: Proceedings of the International Joint Conference on Neural Networks, IJCNN 2005
Pages: 1280-1285
Number of pages: 6
Publication status: Published - 2005
Externally published: Yes
Event: International Joint Conference on Neural Networks, IJCNN 2005 - Montreal, QC, Canada
Duration: 2005 Jul 31 - 2005 Aug 4

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 2

Other

Other: International Joint Conference on Neural Networks, IJCNN 2005
Country/Territory: Canada
City: Montreal, QC
Period: 05/7/31 - 05/8/4

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
