patternMajor
Cardinality of the set of algorithms
Problem
Someone in a discussion claimed that there can be at least continuum many strategies for approaching a specific problem. The specific problem was trading strategies (not algorithms, but strategies), but I think that's beside the point for my question.
This got me thinking about the cardinality of the set of algorithms. I have been searching around a bit but have come up with nothing. My thinking is that, since Turing machines operate with a finite alphabet and the tape has to be indexable and thus countable, it's impossible to have uncountably many algorithms. My set theory is admittedly rusty, so I am not at all certain my reasoning is valid, and I probably wouldn't be able to prove it, but it's an interesting thought.
What is the cardinality of the set of algorithms?
Solution
An algorithm is informally described as a finite sequence of written instructions for accomplishing some task. More formally, algorithms are identified with Turing machines, though you could equally well describe them as computer programs.
The precise formalism you use doesn't much matter; the fundamental point is that each algorithm can be written down as a finite sequence of characters, where the characters are chosen from some finite set, e.g., Roman letters, ASCII characters, or zeroes and ones. For simplicity, let's assume zeroes and ones. Any such sequence is just a natural number written in binary. That means there are at most countably many algorithms, since every algorithm can be represented as a natural number.
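The encoding step can be made concrete. The sketch below (mine, not part of the original answer) reads a program's source text as one large natural number in base 256; a leading marker byte is prepended so that leading zero bytes survive the round trip, which keeps the mapping injective:

```python
def source_to_nat(source: str) -> int:
    """Interpret the UTF-8 bytes of a program's source as a base-256 number.

    A leading 0x01 byte is prepended so leading zero bytes aren't lost,
    keeping the mapping injective: distinct sources map to distinct numbers.
    """
    return int.from_bytes(b"\x01" + source.encode("utf-8"), "big")


def nat_to_source(n: int) -> str:
    """Invert source_to_nat, recovering the original source text."""
    data = n.to_bytes((n.bit_length() + 7) // 8, "big")
    return data[1:].decode("utf-8")  # strip the marker byte


# Round trip: every program corresponds to a unique natural number.
program = "print(1 + 1)"
assert nat_to_source(source_to_nat(program)) == program
```

Since the map is injective, the set of all program texts injects into the natural numbers, which is exactly the "at most countably many" half of the argument.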
For full credit, you should be worried that some natural numbers might not code valid programs, so there might be fewer algorithms than natural numbers. (For bonus credit, you might also be wondering whether two different natural numbers can represent the same algorithm.) However, print 1, print 2, print 3 and so on are all algorithms, and all different, so there are at least countably infinitely many algorithms.

So we conclude that the set of algorithms is countably infinite.
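The lower-bound half of the argument is just as easy to make concrete. A hypothetical generator like the one below produces one syntactically distinct program per natural number, giving an injection from the naturals into the set of algorithms:

```python
def make_print_program(n: int) -> str:
    """Return the source of the n-th program in the family
    'print 1', 'print 2', 'print 3', ... (Python spelling)."""
    return f"print({n})"


# Any finite prefix of the family consists of pairwise distinct programs,
# and they compute different functions (each prints a different number).
programs = [make_print_program(n) for n in range(5)]
assert len(set(programs)) == len(programs)
```

Combined with the encoding argument above (at most countably many), this pins the cardinality down to exactly countably infinite.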
Context
StackExchange Computer Science Q#96679, answer score: 28