This is from the number theory chapter in CLRS.
We are asked to prove that binary "paper and pencil" long division of $a$ by $b$, with quotient $q$ and remainder $r$, takes $O((1+\lg q)\lg b)$ bit operations.
The way I see it, we perform one subtraction of $b$ for each bit in $q$. So assuming that subtracting $b$ takes $\lg b$ operations (one for each bit of $b$), we have a total of $O(\lg b \lg q)$ operations, which is not what is requested.
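To make the counting concrete, here is a minimal Python sketch of the schoolbook shift-and-subtract procedure (the function name `binary_long_division` is mine, not from CLRS):

```python
def binary_long_division(a, b):
    """Schoolbook shift-and-subtract division: returns (q, r) with a = q*b + r.

    Each iteration shifts one bit of a into the running remainder; a
    subtraction of b happens only when the remainder reaches b, and each
    comparison/subtraction touches O(lg b) bit positions.
    """
    assert b > 0
    q, r = 0, 0
    for i in range(a.bit_length() - 1, -1, -1):
        r = (r << 1) | ((a >> i) & 1)  # bring down the next bit of a
        q <<= 1                        # next quotient bit, 0 by default
        if r >= b:                     # O(lg b)-bit comparison
            r -= b                     # O(lg b)-bit subtraction
            q |= 1
    return q, r
```

For example, `binary_long_division(4, 3)` (that is, 100b divided by 11b) returns `(1, 1)` after a single subtraction.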
If you take into account that the first subtraction you do might produce a 0 bit (for example, when dividing 100b by 11b), then, OK, you can add 1 to $\lg q$ to compensate for that subtraction. But the same could be said of the subtraction itself: it can take $\lg b$ operations or $\lg b + 1$ operations depending on the numbers (in the 100b and 11b example, the second subtraction is 100b - 11b, which takes 3 operations to complete). So if we are factoring in these cases, the number of operations should be $O((1+\lg b)(1+\lg q))$.
So the question is: how can you show that the division takes $O((1+\lg q)\lg b)$ operations?
When you subtract 100b - 11b, you can actually ignore the leading bit of the first number, because you already know that the corresponding bit in the result will be 0: had it been 1, you would have done a subtraction instead of a shift in the previous step. So each subtraction considers exactly $\lg b$ bits.
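One way to see this concretely: just after shifting in a bit, the remainder satisfies $r < 2b$, so whenever a subtraction happens the result $r - b$ is below $b$ and fits in the low $\lg b$ bit positions; the leading bit of the minuend always cancels. A small Python check of that invariant (the helper name `check_subtract_width` is made up for illustration):

```python
def check_subtract_width(a, b):
    """Verify the invariant behind the answer: before each step r < b,
    so after shifting in one bit r < 2b, and any subtraction leaves
    r - b < b, which fits within b.bit_length() bits."""
    assert b > 0
    r = 0
    for i in range(a.bit_length() - 1, -1, -1):
        assert r < b                        # invariant entering the step
        r = (r << 1) | ((a >> i) & 1)       # shift in next bit: now r < 2b
        assert r < 2 * b
        if r >= b:
            r -= b
            # the leading bit cancelled: result fits in lg b bit positions
            assert r.bit_length() <= b.bit_length()
    return True
```

Running it on the question's example, `check_subtract_width(4, 3)`, confirms the invariant holds at every step.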