Runtime of Euclidean Algorithm
algorithm, runtime, euclidean
Problem
Given two $n$-bit numbers $a$ and $b$, I am not sure how to find the runtime of the Euclidean algorithm for computing $\gcd(a,b)$. The problem (for me) is that, apart from the size of $a$ and $b$, I don't feel like I have any other information that would help me determine the runtime of the algorithm. I saw somewhere that the number of rounds is $\log_2 a = n$, but, assuming this is correct, I have no intuition for why that is.
My question is for an explanation to how much time it takes to calculate the $gcd$ (as a function of $n$)?
Is it linear in $n$?
Solution
Let's analyze two consecutive steps of the GCD algorithm: given $a > b$, we compute $c = a \bmod b$ (replacing $a$) and $d = b \bmod c$ (replacing $b$). The two new numbers $c,d$ we are left with satisfy
$$ c+d \leq b \leq \frac{1}{2} (a+b). $$
The first inequality holds because $b = qc + d$ for some quotient $q \geq 1$, so $c + d \leq qc + d = b$; the second holds because $b \leq a$ implies $2b \leq a+b$.
So the sum of the two numbers at least halves every two iterations. If $a,b$ are at most $n$ bits long, then at the start of the algorithm $a+b \leq 2^{n+1}$. After $2(n+1)$ steps, the two remaining numbers have sum at most $1$, so by that time the algorithm must have terminated. The number of iterations is therefore $O(n)$, i.e., linear in $n$ (counting mod operations; each such operation itself works on $n$-bit numbers).
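For a concrete illustration (my own small example, not from the original answer), take $a = 21$ and $b = 13$, two consecutive Fibonacci numbers, which are the classic worst-case inputs. Grouping the steps in pairs:
$$ (21, 13) \to (8, 5) \to (3, 2) \to (1, 0), $$
with pair sums $34 \to 13 \to 5 \to 1$. Each pair of steps at least halves the sum, and the algorithm halts with $\gcd(21,13) = 1$.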
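If you want to check the bound empirically, here is a minimal Python sketch (my own illustration, not part of the original answer; the bit length $n = 64$ is an arbitrary choice) that counts the mod steps and compares them against $2(n+1)$:

```python
import math
import random

def gcd_with_steps(a, b):
    """Return (gcd(a, b), number of mod steps taken)."""
    steps = 0
    while b != 0:
        a, b = b, a % b  # one iteration: replace (a, b) with (b, a mod b)
        steps += 1
    return a, steps

n = 64  # bit length of the inputs (arbitrary choice for the demo)
a = random.getrandbits(n) | (1 << (n - 1))  # force exactly n bits
b = random.getrandbits(n) | (1 << (n - 1))

g, steps = gcd_with_steps(a, b)
assert g == math.gcd(a, b)
# By the argument above, steps should never exceed 2(n + 1).
print(f"n = {n} bits, steps = {steps}, bound 2(n+1) = {2 * (n + 1)}")
```

In practice the step count is usually well below the bound; the $2(n+1)$ figure is a worst-case guarantee, approached only by Fibonacci-like inputs.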
Context
StackExchange Computer Science Q#33344, answer score: 4