HiveBrain v1.2.0

Shannon Entropy to Min-Entropy

Submitted by: @import:stackexchange-cs

Problem

In many papers I've read that it is well known that the Shannon entropy of a random variable can be converted to min-entropy (up to small statistical distance) by taking independent copies of the variable.
Can anyone explain to me what exactly this means?

Solution

This is a consequence of the asymptotic equipartition property (AEP), which is a form of the law of large numbers. The AEP states that if a random variable has (binary) entropy $H$ and you take $n$ independent copies of it, then most of the probability mass falls on outcomes whose probability is roughly $2^{-nH}$. Conditioning on this "typical" set yields a distribution in which every outcome has probability at most roughly $2^{-nH}$, i.e. min-entropy close to $nH$; the conditioning only changes the distribution by the small statistical distance that you mention, since the atypical outcomes carry little total mass.
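As a quick numerical illustration of the AEP (a minimal Python sketch; the toy distribution and the function names are my own choices, not from the question), one can sample long i.i.d. sequences and check that the per-symbol surprisal $-\frac{1}{n}\log_2 \Pr[\text{sequence}]$ concentrates near $H$:

```python
import math
import random

# A toy distribution with Shannon entropy H = 1.5 bits
# (an assumed example; any finite distribution works).
probs = {"a": 0.5, "b": 0.25, "c": 0.25}
H = -sum(q * math.log2(q) for q in probs.values())

def sample(n, rng):
    """Draw n i.i.d. symbols from the toy distribution."""
    return rng.choices(list(probs), weights=list(probs.values()), k=n)

def surprisal_rate(seq):
    """-(1/n) log2 Pr[seq]; by the AEP this is close to H for most seq."""
    return -sum(math.log2(probs[s]) for s in seq) / len(seq)

rng = random.Random(0)
for _ in range(5):
    seq = sample(10_000, rng)
    print(f"per-symbol surprisal = {surprisal_rate(seq):.4f}  (H = {H:.4f})")
```

Every sampled sequence has probability close to $2^{-nH}$, even though no single outcome dominates; this concentration is exactly what makes the conversion to min-entropy work.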

As an example, consider a $p$-biased coin. If you toss $n$ such coins, then most likely you will see roughly $pn$ heads, and each such outcome has probability roughly
$$ p^{pn} (1-p)^{(1-p)n} = 2^{-nh(p)} , $$
where $h$ is the binary entropy function.
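The displayed identity can be checked numerically in log space (a small sketch; `binary_entropy` is my name for $h$):

```python
import math

def binary_entropy(p):
    """h(p) = -p log2 p - (1-p) log2 (1-p), the binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p, n = 0.3, 1000
# log2 of the typical-outcome probability p^{pn} (1-p)^{(1-p)n},
# computed in log space to avoid floating-point underflow.
log_typical = p * n * math.log2(p) + (1 - p) * n * math.log2(1 - p)
print(log_typical, -n * binary_entropy(p))  # both equal -n h(p)
```

Working with logarithms is essential here: for $n = 1000$ the probability itself, around $2^{-881}$, underflows a double-precision float.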

Context

StackExchange Computer Science Q#10862, answer score: 6
