How to show that in noiseless coding theorem, the bound $\mathrm{MinACL}<H(P)+1$ is tight?
Problem
The theorem states that
$$
H(P)\leq\mathrm{MinACL}(P)<H(P)+1.
$$
How do I show that the upper bound is tight, i.e. that for every $\epsilon>0$ there is a probability distribution $P$ s.t. $\mathrm{MinACL}(P) - H(P)\geq1-\epsilon$?
(I was given a hint that I can start with a source s.t. $H(P)=\mathrm{MinACL}(P)$ and try to change the probabilities in order to skew the code.)
Solution
I seem to have figured out how to construct a probability distribution that pushes $\mathrm{MinACL}$ arbitrarily close to $H(P)+1$. Consider the information source $(S,P)$ with $|S| = 2^l + 1$, $p_0 = 1 - \epsilon$, and $p_i = \epsilon/2^l$ for $1\leq i\leq2^l$. A valid Huffman tree can be constructed as follows: first build a full binary tree of depth $l$ with $s_i$, $1\leq i\leq2^l$, as leaves, then create a new root whose two children are $s_0$ and the root of that subtree. This scheme is easily justified for $\epsilon\in(0,0.5)$. Then $H(P) = -(1-\epsilon)\log(1-\epsilon)-\epsilon\log(\epsilon/2^l)$ (summing the $2^l$ identical terms of the low-probability symbols) and $\mathrm{MinACL}(P) = (l+1)\cdot\epsilon+1\cdot(1-\epsilon)$.
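The construction can be checked numerically. Below is a small sketch (the function name `entropy_and_minacl` and the choice $l=4$ are mine) that evaluates the two expressions above and shows the gap $\mathrm{MinACL}(P)-H(P)$ creeping toward $1$ as $\epsilon\to0$:

```python
import math

def entropy_and_minacl(eps, l):
    """H(P) and MinACL(P) for the skewed source:
    p_0 = 1 - eps, plus 2**l symbols each of probability eps / 2**l."""
    n = 2 ** l
    p0 = 1 - eps
    pi = eps / n
    # Shannon entropy in bits, summing over all 2**l low-probability symbols
    H = -p0 * math.log2(p0) - n * pi * math.log2(pi)
    # Average codeword length of the tree described above:
    # s_0 gets a 1-bit codeword, each s_i (i >= 1) gets l + 1 bits.
    minacl = 1 * p0 + (l + 1) * eps
    return H, minacl

for eps in (0.1, 0.01, 0.001):
    H, m = entropy_and_minacl(eps, l=4)
    print(f"eps={eps}: MinACL - H = {m - H:.4f}")
```

The printed gap increases toward $1$ as `eps` shrinks, matching the limit claimed next.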
$$
\lim_{\epsilon\to0}\mathrm{MinACL}(P)-H(P) = 1
$$
(I haven't calculated the limit rigorously, but a plot of $\mathrm{MinACL}(P)-H(P)$ against $\epsilon$ is strong evidence; graph omitted.)
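For completeness, the limit can also be computed directly. A short sketch, using $H(P)=-(1-\epsilon)\log(1-\epsilon)-\epsilon\log(\epsilon/2^l)$ (the entropy summed over all $2^l$ low-probability symbols) and logs taken base 2:

```latex
\begin{align*}
H(P) &= -(1-\epsilon)\log(1-\epsilon) - \epsilon\log\frac{\epsilon}{2^l}
      = -(1-\epsilon)\log(1-\epsilon) - \epsilon\log\epsilon + l\epsilon,\\
\mathrm{MinACL}(P) &= (l+1)\epsilon + (1-\epsilon) = 1 + l\epsilon,\\
\mathrm{MinACL}(P) - H(P) &= 1 + (1-\epsilon)\log(1-\epsilon) + \epsilon\log\epsilon.
\end{align*}
```

Since both $(1-\epsilon)\log(1-\epsilon)\to0$ and $\epsilon\log\epsilon\to0$ as $\epsilon\to0$, the difference indeed tends to $1$.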
Context
StackExchange Computer Science Q#104037, answer score: 2