Abstract
There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the "information loss", or change in entropy, associated with a measure-preserving function. Information loss is a special case of conditional entropy: namely, it is the entropy of a random variable conditioned on some function of that variable. We show that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous. This characterization naturally generalizes to Tsallis entropy as well.
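To make the central notion concrete, here is a minimal Python sketch (not from the paper; the function names and the example measure are illustrative) of the information loss of a measure-preserving function between finite probability spaces: the drop in entropy H(p) - H(q) when f pushes a measure p on X forward to a measure q on Y, with the Tsallis variant included for comparison.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i of a finite probability measure p."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def tsallis_entropy(p, alpha):
    """Tsallis entropy of order alpha != 1: (1 - sum_i p_i^alpha) / (alpha - 1)."""
    return (1 - sum(pi ** alpha for pi in p)) / (alpha - 1)

def pushforward(p, f, n_out):
    """Pushforward measure q = f_*(p), where f maps range(len(p)) into range(n_out)."""
    q = [0.0] * n_out
    for i, pi in enumerate(p):
        q[f(i)] += pi
    return q

# Information loss of a measure-preserving f: (X, p) -> (Y, q) is H(p) - H(q).
# It is nonnegative, since merging points under f can only coarsen the measure.
p = [0.5, 0.25, 0.25]
f = lambda i: 0 if i < 2 else 1   # merges the first two points of X
q = pushforward(p, f, 2)          # q = [0.75, 0.25]
loss = shannon_entropy(p) - shannon_entropy(q)
print(f"H(p) = {shannon_entropy(p):.4f}, H(q) = {shannon_entropy(q):.4f}, loss = {loss:.4f}")
```

The same construction with `tsallis_entropy(p, alpha) - tsallis_entropy(q, alpha)` gives the Tsallis information loss that the paper's generalized characterization addresses.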
| Original language | English |
|---|---|
| Pages (from-to) | 1945-1957 |
| Number of pages | 13 |
| Journal | Entropy |
| Volume | 13 |
| Issue number | 11 |
| DOIs | |
| Publication status | Published - Nov 2011 |
Keywords
- Shannon entropy
- Tsallis entropy
- information theory
- measure-preserving function