
Why do larger input sizes imply harder instances?

Submitted by: @import:stackexchange-cs

Problem

Below, assume we're working with an infinite-tape Turing machine.

When explaining the notion of time complexity to someone, and why it is measured relative to the input size of an instance, I stumbled across the following claim:


[..] For example, it's natural that you'd need more steps to multiply two integers with 100000 bits than, say, to multiply two integers with 3 bits.

The claim is convincing, but somewhat hand-wavy. In all the algorithms I have come across, the larger the input size, the more steps are needed. In more precise words, the time complexity is a monotonically increasing function of the input size.


Is it the case that time complexity is always an increasing function of the input size? If so, why is that the case? Is there a proof for it beyond hand-waving?

Solution

Is it the case that time complexity is always an increasing function of the input size? If so, why is that the case?

No. Consider a Turing machine that halts after $n$ steps when the input size $n$ is even, and halts after $n^2$ steps when $n$ is odd.
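
The following is a minimal sketch, not the answer's actual Turing machine: a toy procedure whose step count is $n$ for even input sizes and $n^2$ for odd ones, illustrating that running time need not be a monotonically increasing function of the input size.

```python
def steps_taken(n: int) -> int:
    """Count the 'steps' of a toy procedure on an input of size n."""
    steps = 0
    if n % 2 == 0:
        for _ in range(n):       # even input size: linear work
            steps += 1
    else:
        for _ in range(n * n):   # odd input size: quadratic work
            steps += 1
    return steps

if __name__ == "__main__":
    for n in range(1, 8):
        print(n, steps_taken(n))
    # e.g. size 5 (odd) takes 25 steps, while the larger size 6 (even) takes only 6.
```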

If you mean the complexity of a problem, the answer is still no. The complexity of primality testing is much smaller for even numbers than for odd numbers.
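
As a hedged illustration (naive trial division, not any particular fast primality test): an even number greater than 2 is composite, which follows from its last bit alone, while odd inputs require genuine testing.

```python
def is_prime(n: int) -> bool:
    if n < 2:
        return False
    if n == 2:
        return True
    if n % 2 == 0:        # even n > 2: decided in O(1), regardless of size
        return False
    d = 3
    while d * d <= n:     # odd n: fall back to trial division by odd candidates
        if n % d == 0:
            return False
        d += 2
    return True
```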

Context

StackExchange Computer Science Q#3161, answer score: 13
