The Lieb conjecture on the monotonicity of Shannon’s entropy is true

You need to know: basic probability theory, namely random variables, their supports, probability density functions, expectation (denoted E[\cdot]), and independent, identically distributed random variables.

Background: For a random variable X with probability density function f and support S \subset \mathbb{R}, its Shannon entropy is \text{Ent}(X) = -\int_S f(x) \log f(x)\, dx. A random variable X is square-integrable if E[X^2]<\infty.
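For example, the standard normal density f(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2} satisfies -\log f(x) = \frac{x^2}{2} + \frac{1}{2}\log(2\pi), so \text{Ent}(X) = E\left[\frac{X^2}{2}\right] + \frac{1}{2}\log(2\pi) = \frac{1}{2}\log(2\pi e), using E[X^2]=1.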

The Theorem: On 4th September 2003, Shiri Artstein, Keith Ball, Franck Barthe, and Assaf Naor submitted to the Journal of the AMS a paper in which they proved that, for any sequence X_1, X_2, \dots of independent and identically distributed square-integrable random variables, the entropy of the normalised sum \text{Ent} \left(\frac{X_1 + \dots + X_n}{\sqrt{n}}\right) is a non-decreasing function of n.
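A quick way to see the theorem in action is to estimate \text{Ent}(Y_n) numerically for a concrete distribution. Below is a minimal Python sketch, assuming X_i uniform on [-\sqrt{3}, \sqrt{3}] (mean 0, variance 1); the grid step and the plain Riemann sums are illustrative choices, and it uses the fact that the density of a sum of independent variables is the convolution of their densities.

import numpy as np

# Illustrative check of monotonicity: X_i ~ Uniform(-sqrt(3), sqrt(3)),
# so E[X_i] = 0 and E[X_i^2] = 1. The density of S_n = X_1 + ... + X_n
# is the n-fold convolution of the uniform density, computed on a grid.

dx = 0.001
a = np.sqrt(3.0)
grid = np.arange(-a, a, dx)
f_unif = np.full_like(grid, 1.0 / (2.0 * a))  # density of one X_i

def entropy(f, dx):
    # Differential entropy -\int f log f dx, approximated by a Riemann sum.
    p = f[f > 1e-300]
    return -np.sum(p * np.log(p)) * dx

f_sum = f_unif.copy()  # density of S_n, starting with n = 1
for n in range(1, 6):
    # Scaling rule Ent(cX) = Ent(X) + log c gives Ent(Y_n) = Ent(S_n) - (1/2) log n.
    ent_Yn = entropy(f_sum, dx) - 0.5 * np.log(n)
    print(f"n = {n}: Ent(Y_n) ~ {ent_Yn:.4f}")
    f_sum = np.convolve(f_sum, f_unif, mode="full") * dx  # density of S_{n+1}

The printed values increase with n and approach \frac{1}{2}\log(2\pi e) \approx 1.4189, the entropy of the standard normal, consistent with both the theorem and the central limit theorem.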

Short context: For a sequence X_1, X_2, \dots as above with E[X_1] = 0, the central limit theorem says that the normalised sums Y_n = \frac{1}{\sqrt{n}}\sum\limits_{i=1}^n X_i converge in distribution to a normal law. In 1949, Shannon proved that \text{Ent}(Y_2) \geq \text{Ent}(Y_1). In 1978, Lieb conjectured that in fact \text{Ent}(Y_{n+1}) \geq \text{Ent}(Y_n) for all n, which can be interpreted as saying that Y_n moves closer to the normal distribution (the entropy maximiser among all distributions with the same variance) at every step. This conjecture was open even for n=2. The Theorem above proves it in full.

Links: The original paper is available here.

