A Characterization of Entropy in Terms of Information Loss

John C. Baez, Tobias Fritz, Tom Leinster

Research output: Contribution to journal › Article › peer-review

Abstract

There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the "information loss", or change in entropy, associated with a measure-preserving function. Information loss is a special case of conditional entropy: namely, it is the entropy of a random variable conditioned on some function of that variable. We show that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous. This characterization naturally generalizes to Tsallis entropy as well.
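
To make the abstract's definitions concrete, here is a brief sketch in notation of our own choosing (the symbols f, p, q, F, and c below do not appear in the abstract itself):

% A measure-preserving function between finite probability spaces
% pushes the measure p forward to q:
\[
  f \colon (X, p) \to (Y, q), \qquad q_j = \sum_{i \in f^{-1}(j)} p_i .
\]
% Its information loss is the drop in Shannon entropy:
\[
  F(f) = H(p) - H(q), \qquad H(p) = -\sum_{i} p_i \log p_i .
\]
% The characterization: any assignment F that is functorial,
%   F(g \circ f) = F(f) + F(g),
% convex-linear,
%   F(\lambda f \oplus (1 - \lambda) g) = \lambda F(f) + (1 - \lambda) F(g),
% and continuous must equal c (H(p) - H(q)) for a single constant c >= 0.

For the Tsallis generalization, the convex-linearity axiom is deformed analogously while functoriality and continuity are kept unchanged.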

Original language: English
Pages (from-to): 1945-1957
Number of pages: 13
Journal: Entropy
Volume: 13
Issue number: 11
DOI: 10.3390/e13111945
Publication status: Published - Nov 2011

Keywords

  • Shannon entropy
  • Tsallis entropy
  • information theory
  • measure-preserving function
