An alternative to entropy in the measurement of information

Research output: Contribution to journal › Article › peer-review

33 Citations (Scopus)


Entropy has been the main tool in the analysis of the concept of information since information theory was conceived in the work of Shannon more than fifty years ago. There have been some attempts to find a more general measure of information, but their outcomes were mostly of formal, theoretical interest, and none has provided better insight into the nature of information. The strengths of entropy seemed so obvious that little effort has been made to find an alternative to entropy which gives different values, but which is consistent with entropy in the sense that the results obtained in information theory thus far can be reproduced with the new measure. In this article the need for such an alternative measure is demonstrated on the basis of a historical review of the problems with the conceptualization of information. Then, an alternative measure is presented in the context of a modified definition of information applicable outside of the conduit metaphor of Shannon's approach, and formulated without reference to uncertainty. It has several features superior to those of entropy. For instance, unlike entropy it can be easily and consistently extended to continuous probability distributions, and unlike differential entropy this extension is always positive and invariant with respect to linear transformations of coordinates.
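The alternative measure itself is not given in the abstract, but the drawback of differential entropy that it cites is a standard fact and easy to verify numerically. A minimal sketch (helper names are illustrative, not from the article) evaluates the differential entropy h(X) = -∫ p(x) log p(x) dx by a midpoint sum, showing that it can be negative and that it shifts by log|c| under the linear rescaling Y = cX rather than staying invariant:

```python
import numpy as np

def differential_entropy(pdf, a, b, n=100_000):
    """Midpoint-rule estimate of h = -integral of p(x) log p(x) over [a, b]."""
    dx = (b - a) / n
    x = a + (np.arange(n) + 0.5) * dx       # midpoints of n subintervals
    p = pdf(x)
    return -np.sum(p * np.log(p)) * dx

# Uniform(0, 0.5): density 2 on [0, 0.5]; h = log(0.5) ≈ -0.693 — negative.
h_narrow = differential_entropy(lambda x: np.full_like(x, 2.0), 0.0, 0.5)

# After the linear rescaling Y = 2X: Uniform(0, 1), density 1; h = 0.
h_scaled = differential_entropy(lambda x: np.ones_like(x), 0.0, 1.0)

print(h_narrow)               # ≈ -0.693 (< 0)
print(h_scaled - h_narrow)    # ≈ log 2: h changes under Y = cX by log|c|
```

A measure that is "always positive and invariant with respect to linear transformations of coordinates", as the abstract claims for the proposed alternative, would avoid both behaviors seen here.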

Original language: English
Pages (from-to): 388-412
Number of pages: 25
Issue number: 5
Publication status: Published - 2004 Dec
Externally published: Yes


Keywords

  • Entropy
  • Information Theory
  • Measures of Information
  • Semantics of Information

ASJC Scopus subject areas

  • Information Systems
  • Mathematical Physics
  • Physics and Astronomy (miscellaneous)
  • Electrical and Electronic Engineering


