HiveBrain v1.2.0

How to measure the complexity of the discrete logarithm problem?

Submitted by: @import:stackexchange-cs

Problem

The answers to this question on Crypto Stack Exchange basically say that, to measure the complexity of the discrete logarithm problem, we have to take into account the length of the number representing the size of the group. This seems arbitrary: why don't we choose the size of the group itself as the argument? Is there a criterion for knowing which argument to choose? I suspect I'm overlooking something important, since the complexity changes hugely depending on which of the two we use.

Solution

It doesn't matter in principle whether you choose the size of the group $|G|$ or the length $n$ of the integer representing it as the parameter, since $n \approx \log_2 |G|$ and the two descriptions can be converted into each other. There are two reasons the complexity is usually described in terms of $n$ rather than $|G|$:

- $n$ is the length of the input (more accurately, the input has length $\Theta(n)$), and we usually measure the complexity of algorithms as a function of the input length.

- Usually $n$ is a small number such as $1024$, whereas $|G|$ is a huge number such as (roughly) $2^{1024}$.
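The distinction matters because an algorithm that is polynomial in $|G|$ is exponential in the input length $n$. A concrete illustration is baby-step giant-step, which solves the discrete log in $O(\sqrt{|G|}) = O(2^{n/2})$ group operations: much better than brute force's $O(|G|)$, yet still exponential in $n$. A minimal sketch (the tiny group mod $101$ is a made-up example instance):

```python
import math

def discrete_log_bsgs(g, h, p):
    """Baby-step giant-step: find x with g^x = h (mod p), p prime, g a generator.

    Uses O(sqrt(|G|)) group operations and memory, where |G| = p - 1.
    In terms of the input length n = bits of p, that is O(2^(n/2)):
    exponential in n even though it is only the square root of |G|.
    """
    m = math.isqrt(p - 1) + 1                    # ceil(sqrt(|G|))
    baby = {pow(g, j, p): j for j in range(m)}   # baby steps: g^0 .. g^(m-1)
    factor = pow(g, -m, p)                       # g^(-m) mod p (Python 3.8+)
    gamma = h % p
    for i in range(m):                           # giant steps: h * g^(-i*m)
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = (gamma * factor) % p
    return None                                  # no solution

# Toy instance: group of size |G| = 100, so n is only 7 bits.
x = discrete_log_bsgs(2, pow(2, 37, 101), 101)
print(x)  # 37
```

At $n = 1024$ bits, $2^{n/2} = 2^{512}$ steps is hopeless, which is exactly why saying "polynomial in $|G|$" would be misleading: the honest statement is "exponential in the input length $n$".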

Context

StackExchange Computer Science Q#64961, answer score: 5

Revisions (0)

No revisions yet.