HiveBrain v1.2.0

Chaitin's constant is normal?

Submitted by: @import:stackexchange-cs
Tags: chaitin, constant, normal

Problem

According to this source, Chaitin's constant $\Omega$ is normal.


Each halting probability is a normal and transcendental real number that is not computable, which means that there is no algorithm to compute its digits. Indeed, each halting probability is Martin-Löf random, meaning there is not even any algorithm which can reliably guess its digits.

Source (Wikipedia)

Furthermore, the definition of normal (in base $b$) is that each digit occurs with limiting frequency $1/b$, each pair of digits with frequency $1/b^2$, each triplet with frequency $1/b^3$, and so on.
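To make the frequency condition concrete, here is a small sketch that counts single-bit and bit-pair frequencies in the binary Champernowne constant (the concatenation 1 10 11 100 101 ..., which is known to be normal in base 2); the frequencies approach $1/2$ and $1/4$ as more bits are taken. The helper name and cutoff are illustrative choices, not from the original text:

```python
from collections import Counter

def champernowne_bits(n_ints):
    # Concatenate the binary representations of 1, 2, 3, ...
    # (base-2 Champernowne constant, a known normal number in base 2)
    return "".join(format(k, "b") for k in range(1, n_ints + 1))

bits = champernowne_bits(5000)
n = len(bits)

singles = Counter(bits)                                   # 1-bit blocks
pairs = Counter(bits[i:i + 2] for i in range(n - 1))      # 2-bit blocks

# Empirical frequencies; convergence to 1/2 and 1/4 is slow.
print(singles["1"] / n)        # drifts toward 1/2
print(pairs["01"] / (n - 1))   # drifts toward 1/4
```

Convergence is notoriously slow (every integer's binary expansion starts with a 1, biasing short prefixes), which is why the printed values are only near, not at, $1/2$ and $1/4$.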

Chaitin's omega is calculated via

$\Omega = \sum_{p \text{ halts}} 2^{-|p|}$

Writing $\Omega$ in binary, we obtain a list of 0 and 1. For example,

2^-1 = 0.1     +
2^-2 = 0.01    +
2^-3 = 0.001   +
(skip 2^-4, as that program does not halt)
2^-5 = 0.00001 +
...
= $\Omega$ = 0.11101...
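As a sanity check, the toy sum above can be evaluated with exact rational arithmetic (this is only the question's illustrative example, not the real $\Omega$; the helper function is made up for the demonstration):

```python
from fractions import Fraction

# Toy example from the text: the programs of lengths 1, 2, 3, 5 halt,
# the program of length 4 does not.
omega = Fraction(1, 2) + Fraction(1, 4) + Fraction(1, 8) + Fraction(1, 32)

def binary_expansion(x, n_bits):
    # First n_bits of the binary expansion of a rational x in [0, 1)
    bits = []
    for _ in range(n_bits):
        x *= 2
        bit = int(x)      # 1 if doubling pushed x past 1, else 0
        bits.append(str(bit))
        x -= bit
    return "".join(bits)

print(binary_expansion(omega, 5))  # → 11101
```

The exact value is $29/32$, whose binary expansion $0.11101$ matches the listing above bit for bit.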


Clearly, we can see that the $n$th bit corresponds to the halting status of the program of length $n$.

Here is what I am struggling with:

If $\Omega$ is indeed normal, then it means that, in the limit, exactly 50% of programs halt and exactly 50% do not. This seems very counterintuitive.

For example, suppose I generate Java programs by randomly concatenating single characters. The majority of them (I would guess more than 99.99%) would not even compile. Would this not imply that at least 99.99% of them will not halt? How do we justify that exactly half will halt and exactly half will not, by virtue of $\Omega$ being normal?

Or is Wikipedia incorrect about $\Omega$ being normal?

Solution

Contrary to your calculation, Chaitin's constant is not defined as

$$ \Omega = \sum_{n\colon \text{$n$th program halts}} 2^{-n}. $$

Instead, there is a set $\Pi \subseteq \{0,1\}^*$ of allowed programs which is prefix-free (no string is a prefix of another string). Each of the programs in $\Pi$ is legal (this addresses your Java example: syntactically invalid strings are simply not in $\Pi$). If programs are encoded in unary then it is indeed the case that the $n$th program has length $n$, and then your definition of $\Omega$ works. But for other encodings, the definition of $\Omega$ is

$$ \Omega = \sum_{ p \in \Pi\colon p \text{ halts} } 2^{-|p|}, $$
where $|p|$ is the length of the binary string $p$. Kraft's inequality shows that $\sum_{p \in \Pi} 2^{-|p|} \leq 1$.
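Kraft's inequality is easy to verify numerically for a small, hypothetical prefix-free code (the set `Pi` below is made up for illustration):

```python
from fractions import Fraction

# A hypothetical prefix-free set of programs: no string is a prefix of another.
Pi = ["0", "10", "110", "1110", "1111"]

# Verify the prefix-free property directly.
assert not any(a != b and b.startswith(a) for a in Pi for b in Pi)

# Kraft's inequality: the weights 2^{-|p|} sum to at most 1.
kraft_sum = sum(Fraction(1, 2 ** len(p)) for p in Pi)
print(kraft_sum)  # 1/2 + 1/4 + 1/8 + 1/16 + 1/16 = 1
```

Because the weights of any prefix-free code sum to at most 1, the sum defining $\Omega$ (which drops the non-halting programs) is a well-defined real in $[0,1]$.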

Chaitin's constant is algorithmically random: the (prefix) Kolmogorov complexity of the first $n$ bits is $n - O(1)$. To show this, note first that $\Omega_n$, the first $n$ bits of $\Omega$, suffices to determine whether a program of length $n$ (under the encoding $\Pi$) halts or not. Indeed, as a fraction, $\Omega_n \leq \Omega < \Omega_n + 2^{-n}$. Run all programs in parallel, and whenever a program $p$ halts, add $2^{-|p|}$ to a running tally. Once the tally reaches at least $\Omega_n$, every program of length at most $n$ that ever halts has already halted, since any further halting program of length at most $n$ would contribute at least $2^{-n}$ and push $\Omega$ to at least $\Omega_n + 2^{-n}$. Now suppose that for some constant $K > 0$ and infinitely many $n$, you could compute $\Omega_n$ using $n - K$ bits. For each such $n$, you could find a string whose Kolmogorov complexity is larger than $n$, by considering the outputs of all halting programs of length at most $n$ and picking a string not among them. For large enough $K$, the result is a program of length at most $n$ for computing a string whose Kolmogorov complexity is more than $n$. This contradiction proves that for some $K$, the Kolmogorov complexity of $\Omega_n$ is at least $n-K$.
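The extraction step (recovering the halting status of every program of length at most $n$ from $\Omega_n$ by dovetailing) can be sketched in a toy model. The "machine" below is entirely made up, a finite table of stipulated halting times rather than a real universal prefix-free machine, so it only illustrates the bookkeeping of the argument:

```python
from fractions import Fraction

# Toy prefix-free "machine": we simply stipulate which programs halt
# and after how many steps. (A real Chaitin machine is universal.)
halting_steps = {"0": 3, "10": 7, "111": 2}   # program -> steps until it halts
non_halting = {"110"}                          # never halts

# Exact toy Omega: sum of 2^{-|p|} over halting programs.
omega = sum(Fraction(1, 2 ** len(p)) for p in halting_steps)

def halts_up_to(n, omega_n):
    """Given omega_n (Omega truncated to n bits, so omega_n <= Omega < omega_n + 2^-n),
    find all halting programs of length <= n by dovetailing."""
    tally = Fraction(0)
    halted = set()
    for step in range(1, 10 ** 4):
        for p, t in halting_steps.items():     # "run" each program one more step
            if t == step:                      # program p halts at this step
                tally += Fraction(1, 2 ** len(p))
                halted.add(p)
        if tally >= omega_n:
            # Any further halting program of length <= n would add >= 2^-n,
            # pushing Omega to at least omega_n + 2^-n -- impossible.
            return {p for p in halted if len(p) <= n}
    raise RuntimeError("tally never reached omega_n")

n = 3
omega_n = omega - omega % Fraction(1, 2 ** n)  # first n bits of omega, as a fraction
print(halts_up_to(n, omega_n))
```

In this toy run all three stipulated halting programs have length at most 3, so the dovetailing recovers exactly that set once the tally meets $\Omega_n$.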

Algorithmic randomness of $\Omega$ implies, in particular, that the frequency of 0s and 1s in its binary expansion tends to $1/2$. Indeed, suppose that for some (rational) $\epsilon > 0$ there exist infinitely many $n$ such that the fraction of 1s in $\Omega_n$ is at most $1/2-\epsilon$. Since there are only at most $2^{h(1/2-\epsilon)n}$ strings of length $n$ with at most a $(1/2-\epsilon)$-fraction of 1s, where $h$ is the binary entropy function, we can compress $\Omega_n$ to size $h(1/2-\epsilon)n + 2\log n + C_\epsilon$ (the constant $C_\epsilon$ depends on $\epsilon$, since the program needs to know $\epsilon$). However, this is $n - \omega(1)$, contradicting the algorithmic randomness of $\Omega$; the case of a fraction of 1s at least $1/2+\epsilon$ is symmetric.
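To get a feel for the compression bound, here is a quick numeric check that the binary entropy $h(1/2-\epsilon)$ is strictly below 1, so $h(1/2-\epsilon)n + 2\log n$ falls short of $n$ by a margin that grows linearly in $n$ (the choices of $\epsilon$ and $n$ are illustrative):

```python
from math import log2

def h(p):
    # Binary entropy function, in bits
    return -p * log2(p) - (1 - p) * log2(1 - p)

eps = 0.1
assert h(0.5 - eps) < 1   # entropy is strictly below 1 away from p = 1/2

for n in [1000, 10000, 100000]:
    bound = h(0.5 - eps) * n + 2 * log2(n)   # compressed size from the argument
    print(n, bound < n, n - bound)           # the savings n - bound grow with n
```

For very small $n$ the $2\log n$ overhead can dominate, which is why the argument is phrased asymptotically: only for infinitely many large $n$ does the compression beat $n - K$ for every constant $K$.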

Context

StackExchange Computer Science Q#67695, answer score: 10
