You need to know: Basic probability theory: random variable, its support, probability density function, expectation (denoted $\mathbb{E}$), independent random variables, identically distributed random variables.
Background: For a random variable $X$ with probability density function $f$ and support $S$, its Shannon entropy is
$$\mathrm{Ent}(X) = -\int_S f(x)\log f(x)\,dx.$$
A random variable $X$ is square-integrable if
$$\mathbb{E}[X^2] < \infty.$$
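As a quick numerical illustration (a sketch, not part of the original note), the entropy integral above can be approximated by a Riemann sum. For the standard normal density the entropy has the closed form $\frac{1}{2}\log(2\pi e) \approx 1.4189$, which the approximation should reproduce:

```python
import math

def normal_pdf(x):
    """Density of the standard normal distribution."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def entropy(pdf, lo, hi, steps=200_000):
    """Midpoint-rule approximation of Ent = -∫ f(x) log f(x) dx over [lo, hi]."""
    dx = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        x = lo + (i + 0.5) * dx
        f = pdf(x)
        if f > 0:  # the integrand is taken to be 0 where f vanishes
            total -= f * math.log(f) * dx
    return total

approx = entropy(normal_pdf, -10, 10)
exact = 0.5 * math.log(2 * math.pi * math.e)
print(approx, exact)  # both ≈ 1.4189
```

Truncating the integral to $[-10, 10]$ is harmless here since the normal density is vanishingly small outside that range.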
The Theorem: On 4th September 2003, Shiri Artstein, Keith Ball, Franck Barthe, and Assaf Naor submitted to the Journal of the AMS a paper in which they proved that, for any sequence $X_1, X_2, \ldots$ of independent and identically distributed square-integrable random variables, the entropy of the normalised sum
$$\mathrm{Ent}\!\left(\frac{X_1 + \cdots + X_n}{\sqrt{n}}\right)$$
is a non-decreasing function of $n$.
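The monotonicity can be observed numerically (a sketch under chosen assumptions, not a computation from the paper): take $X_i \sim \mathrm{Uniform}(0,1)$, build the density of $S_n = X_1 + \cdots + X_n$ by repeated convolution on a grid, and use the scaling rule $\mathrm{Ent}(aX) = \mathrm{Ent}(X) + \log|a|$, which gives $\mathrm{Ent}(S_n/\sqrt{n}) = \mathrm{Ent}(S_n) - \tfrac{1}{2}\log n$:

```python
import math

h = 0.002           # grid step
base = [1.0] * 500  # Uniform(0, 1) density sampled on a grid of step h

def convolve(f, g):
    """Discrete approximation of the convolution integral (f*g)(x)."""
    out = [0.0] * (len(f) + len(g) - 1)
    for i, fi in enumerate(f):
        w = fi * h
        for j, gj in enumerate(g):
            out[i + j] += w * gj
    return out

def entropy(f):
    """Riemann-sum approximation of Ent = -∫ f log f."""
    return -sum(v * math.log(v) * h for v in f if v > 0)

density = base[:]
entropies = []
for n in range(1, 5):
    # scaling rule: Ent(S_n / sqrt(n)) = Ent(S_n) - (1/2) log n
    entropies.append(entropy(density) - 0.5 * math.log(n))
    density = convolve(density, base)

print(entropies)  # a non-decreasing sequence
```

The computed values start at $0$ (the entropy of $\mathrm{Uniform}(0,1)$), increase at every step, and stay below $\tfrac{1}{2}\log(2\pi e/12) \approx 0.1765$, the entropy of the limiting normal distribution with variance $1/12$.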
Short context: For a sequence $X_1, X_2, \ldots$ as above, it is known that the normalised sums
$$\frac{X_1 + \cdots + X_n}{\sqrt{n}}$$
converge to a normal distribution (the Central Limit Theorem). In 1949 Shannon proved that
$$\mathrm{Ent}\!\left(\frac{X_1 + X_2}{\sqrt{2}}\right) \ge \mathrm{Ent}(X_1).$$
In 1978, Lieb conjectured that in fact
$$\mathrm{Ent}\!\left(\frac{X_1 + \cdots + X_{n+1}}{\sqrt{n+1}}\right) \ge \mathrm{Ent}\!\left(\frac{X_1 + \cdots + X_n}{\sqrt{n}}\right)$$
for all $n$, which can be interpreted as saying that the normalised sum becomes closer and closer to the normal distribution (the one with maximal entropy, for a given variance) at every step. This conjecture was open even for $n = 2$. The Theorem proves it in full.
Links: The original paper is available here.