HiveBrain v1.2.0

Why can we assume an algorithm can be represented as a bit string?

Submitted by: @import:stackexchange-cs

Problem

I am starting to read a book about computational complexity and Turing machines. Here is a quote:


An algorithm (i.e., a machine) can be represented as a bit string once
we decide on some canonical encoding.

This assertion is provided as a simple fact, but I can't understand it.

For example, if I have an algorithm which takes $x$ as input and computes $(x+1)^2$ or:

int function(int x) {
    x = x + 1;     /* x now holds the successor of the input */
    return x * x;  /* (x+1)^2; note C has no ** operator */
}


How can this be represented as a string over the alphabet $\{0, 1\}$, i.e., as an element of $\{0, 1\}^*$?

Solution

You already have a representation of that function as text. Convert each character to a one-byte value using the ASCII encoding. The result is a sequence of bytes, i.e., a sequence of bits, i.e., a string over the alphabet $\{0,1\}$ (an element of $\{0,1\}^*$). That's one possible encoding.
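As a concrete sketch of that encoding, here is a small Python snippet (the `source` string is a hypothetical stand-in for the function's text) that maps each ASCII character to its 8-bit binary representation and concatenates the results:

```python
# Sketch: encode the source text of a function as a bit string,
# one byte (8 bits) per character, using ASCII.
source = "int function(int x) { x = x + 1; return x * x; }"

# Each byte b in 0..127 is rendered as 8 binary digits, zero-padded.
bits = "".join(format(byte, "08b") for byte in source.encode("ascii"))

# Every character contributes exactly 8 bits.
assert len(bits) == 8 * len(source)

# The first character 'i' has ASCII code 105 = 0b01101001.
print(bits[:8])  # → 01101001
```

Any injective (reversible) mapping from program text to bit strings would work equally well; ASCII-per-character is just the most familiar choice of "canonical encoding".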

Context

StackExchange Computer Science Q#90259, answer score: 52

Revisions (0)

No revisions yet.