HiveBrain v1.2.0

Standard information-theoretic lower bound?

Submitted by: @import:stackexchange-cs
Tags: standard, information-theoretic, lower-bound

Problem

There should be a simple argument, but I'm struggling to see it.

Suppose Alice has a string $x \in \{0, 1\}^n$ and sends a message $s = s(x)$ to Bob. And suppose that given $s$, Bob can reconstruct each bit of $x$ with probability $2/3$ (say). (Note that this is weaker than the claim "with probability $2/3$, Bob can recover the entire string $x$." A clean proof of this case would be appreciated too.)

Then we should be able to conclude $|s| = \Omega(n)$, and this seems obvious, but can someone formalize this for me?

Solution

I understand your question this way:


Assume Alice has some $X\in\{0,1\}^n$. Alice encodes (compresses) $X$ into $s = s(X)$, where $|s|\le n$. Bob decodes $s$ to obtain $X'=\mathrm{dec}(s)$.
Show that unless $|s|=\Omega(n)$, it cannot hold that $\mathrm{Hamming}(X,X') < n/3$.

If this is what you meant, then it follows from the fact that most strings cannot be compressed. Assume, toward a contradiction, that $|s(X)|=o(n)$ for every $X$. Then there are at most $2^{|s|}\ll 2^n$ possible messages; more precisely, $2^{|s|} / 2^n \to 0$ as $n\to\infty$.
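As a quick sanity check on this counting step, here is a small Python sketch (the truncating encoder `s` below is a hypothetical example, not anything from the question): any encoder whose output is $m$ bits has at most $2^m$ distinct messages, so for $m \ll n$ many strings must share a message.

```python
n, m = 8, 3  # toy sizes: 8-bit strings, 3-bit messages

def s(x):
    """Hypothetical encoder: keep only the top m bits of x (any m-bit encoder works here)."""
    return x >> (n - m)

# An m-bit message takes at most 2^m distinct values...
messages = {s(x) for x in range(2 ** n)}
assert len(messages) <= 2 ** m
# ...so by pigeonhole, some message is shared by at least 2^(n-m) strings.
```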

Then, for each possible $s$ construct $\mathrm{dec}(s)$ and define $Y_s$ to be the Hamming ball of radius $n/3$ centered at $\mathrm{dec}(s)$. Verify that $|Y_s| = \sum_{d=0}^{n/3}\binom{n}{d} < 2^{n\cdot h(1/3)}$, where $h(\cdot)$ is the binary entropy function and $h(1/3) \approx 0.918$.
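The volume bound on $Y_s$ is easy to verify numerically; a short Python sketch (the value $n = 30$ is just an illustrative choice):

```python
from math import comb, log2

def ball_volume(n, radius):
    """Number of n-bit strings within Hamming distance `radius` of a fixed center."""
    return sum(comb(n, d) for d in range(radius + 1))

def h(p):
    """Binary entropy function, in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

n = 30  # illustrative size
vol = ball_volume(n, n // 3)
assert vol < 2 ** (n * h(1 / 3))  # |Y_s| < 2^{n h(1/3)}, with h(1/3) ≈ 0.918
```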
Thus $\left|\bigcup_s Y_s\right| \le 2^{o(n)}\cdot 2^{0.92n} \ll 2^n$. This means that there exists a string $X$ not covered by any $Y_s$, i.e., not within Hamming distance $n/3$ of the decoding of any $s$. For such an $X$, Bob's output $X' = \mathrm{dec}(s(X))$ disagrees with $X$ on more than $n/3$ of the bits, a contradiction.
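Under toy parameters small enough to brute-force ($n = 18$ and $m = 3$ message bits, both hypothetical choices), the same counting shows the $2^m$ balls cannot cover $\{0,1\}^n$, and one can exhibit an uncovered string explicitly:

```python
from math import comb
import random

def ball_volume(n, radius):
    return sum(comb(n, d) for d in range(radius + 1))

n, m = 18, 3      # toy sizes chosen so 2^m * |ball| < 2^n and brute force is feasible
radius = n // 3

# Counting step: 2^m Hamming balls of radius n/3 are too small to cover {0,1}^n.
assert (2 ** m) * ball_volume(n, radius) < 2 ** n

# So whatever centers dec() produces, some string is uncovered; find one explicitly
# for a random choice of centers (strings encoded as ints, XOR + popcount = Hamming).
random.seed(0)
centers = [random.getrandbits(n) for _ in range(2 ** m)]

def covered(x):
    return any(bin(x ^ c).count("1") <= radius for c in centers)

uncovered = next(x for x in range(2 ** n) if not covered(x))
assert not covered(uncovered)
```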

If this is not what you meant, you will need to state your claim more precisely. For instance, you say "with probability $2/3$", but no probability space is defined in your statement.

Context

StackExchange Computer Science Q#117107, answer score: 3
