The maximum entropy of a metric space

Tom Leinster, Emily Roff

Research output: Contribution to journal › Article › peer-review


We define a one-parameter family of entropies, each assigning a real number to any probability measure on a compact metric space (or, more generally, a compact Hausdorff space with a notion of similarity between points). These entropies generalise the Shannon and Rényi entropies of information theory.
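For orientation, here is a sketch of the finite prototype of these quantities, in notation borrowed from the finite-case literature rather than from this paper's general construction: given a finite metric space X = {1, ..., n}, form the similarity matrix Z with entries Z_{ij} = e^{-d(i,j)}; then for a probability distribution p on X, the entropy of order q is

\[
  {}^{q}H^{Z}(p) \;=\; \frac{1}{1-q}\,\log \sum_{i\,:\,p_i > 0} p_i \bigl((Zp)_i\bigr)^{q-1}
  \qquad (q \neq 1, \infty),
\]

with the cases q = 1 and q = ∞ defined as limits. Taking Z = I (all distinct points regarded as wholly dissimilar) recovers the Rényi entropy of order q, and the limit q → 1 the Shannon entropy.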
We prove that on any space X, there is a single probability measure maximising all these entropies simultaneously. Moreover, all the entropies have the same maximum value: the maximum entropy of X. As X is scaled up, the maximum entropy grows; its asymptotics determine geometric information about X, including the volume and dimension. We also study the large-scale limit of the maximising measure itself, arguing that it should be regarded as the canonical or uniform measure on X.
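One statement of this kind, sketched here with the precise hypotheses deferred to the paper: writing tX for X with its metric scaled by a factor t > 0, and D_max(tX) for the exponential of its maximum entropy, the growth rate recovers the (Minkowski) dimension,

\[
  \dim X \;=\; \lim_{t \to \infty} \frac{\log D_{\max}(tX)}{\log t},
\]

when that dimension exists; finer asymptotics of D_max(tX) carry volume information.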
Primarily we work not with entropy itself but with its exponential, called diversity and (in its finite form) used as a measure of biodiversity. Our main theorem was first proved in the finite case by Leinster and Meckes.
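As a concrete illustration of the finite case, the following minimal Python sketch (the three-point space, the optimiser, and all names here are ours, not the paper's) computes the diversity of order q and maximises it numerically for several values of q; as the Leinster–Meckes theorem predicts, one should see the same maximising distribution and the same maximum value at every order.

import numpy as np
from scipy.optimize import minimize

# A toy three-point metric space; similarity matrix Z_ij = exp(-d(i,j)).
D = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.5],
              [2.0, 1.5, 0.0]])
Z = np.exp(-D)

def diversity(p, q):
    """Diversity of order q of distribution p, relative to similarity Z."""
    p = np.asarray(p, dtype=float)
    s = p > 0                        # restrict to the support of p
    zp = (Z @ p)[s]
    if np.isclose(q, 1.0):           # q = 1: exponential of the entropy
        return np.exp(-np.sum(p[s] * np.log(zp)))
    return np.sum(p[s] * zp ** (q - 1)) ** (1.0 / (1.0 - q))

def maximise(q):
    """Numerically maximise order-q diversity over the probability simplex."""
    cons = ({'type': 'eq', 'fun': lambda p: p.sum() - 1.0},)
    bounds = [(0.0, 1.0)] * len(Z)
    res = minimize(lambda p: -diversity(p, q),
                   np.full(len(Z), 1.0 / len(Z)),
                   bounds=bounds, constraints=cons)
    return res.x, -res.fun

# The theorem predicts one measure maximising every order simultaneously,
# with a common maximum value (the maximum diversity of the space).
for q in [0.0, 1.0, 2.0, 10.0]:
    p_star, d_max = maximise(q)
    print(q, np.round(p_star, 4), round(d_max, 4))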
Original language: English
Number of pages: 43
Journal: The Quarterly Journal of Mathematics
Publication status: Accepted/In press - 8 Dec 2020

