Calculating the Recurrence Relation T(n)=T(n / log n) + Θ(1)


The question comes from Introduction to Algorithms, 3rd edition, p. 63, Problem 3-6, where it is introduced as "iterated functions". I rewrite it as follows:

#include <math.h>

int T(int n){
   int count = 0;              // must be declared outside the loop, since it is returned
   for(; n > 2; ++count)
   {
      n = n / log2(n);         // the quotient is truncated back to an int
   }
   return count;
}

Then give as tight a bound as possible on T(n).

I can make it O(log n) and Ω(log n / log log n), but can it be tighter?


PS: Using Mathematica, I've found that for n = 1*10^3281039, T(n) = 500000,

and at the same time T(n) = 1.072435·log n / log log n,

and the coefficient declines with n, from 1.22943 (n = 2.07126*10^235) down to 1.072435 (n = 1*10^3281039).

Maybe this information is helpful.
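If you want to reproduce the trend without Mathematica, here is a minimal C sketch (my addition, not part of the original experiment). It stores n in a double, so it only reaches n ≈ 10^308, far below the values above, but the declining coefficient is already visible:

#include <stdio.h>
#include <math.h>

int main(void) {
    /* Iterate n -> n/log2(n) and compare the step count T(n)
       with log2(n)/log2(log2(n)). */
    for (double n0 = 1e10; n0 < 1e301; n0 *= 1e58) {
        int count = 0;
        for (double n = n0; n > 2; ++count)
            n = n / log2(n);
        printf("n = %.0e  T(n) = %d  coefficient = %f\n",
               n0, count, count / (log2(n0) / log2(log2(n0))));
    }
    return 0;
}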


There are 3 answers

AbcAeffchen (best answer)

It looks like the lower bound is pretty good, so I tried to prove that the upper bound is also O(log n / log log n). But let me first explain the other bounds (just for a better understanding).

TL;DR

T(n) is in Θ(log n / log log n).

T(n) is in O(log n)

This can be seen by modifying n := n/log₂(n) to n := n/2. Since log₂(n) ≥ 2 for n ≥ 4, the original loop shrinks n at least as fast as this halving loop, which needs O(log₂ n) steps until n ≤ 2 holds.
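As code, the modified loop looks like this (a minimal sketch; the helper name T_halving is mine):

int T_halving(int n) {
    int count = 0;
    for (; n > 2; ++count)
        n = n / 2;      // about log2(n) iterations until n <= 2
    return count;
}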

T(n) is in Ω(log n / log log n)

This can be seen by modifying n := n/log₂(n) to n := n/m, where m is the initial value of log₂ n. Since m is at least as large as every later value of log₂ n, this modified loop terminates no later than the original one, so its iteration count is a lower bound for T(n).
Solving the inequality n / m^x < 2 for x leads us to

               log n - x·log log n < log 2
    ⇔               log n - log 2 < x·log log n
    ⇔  (log n - log 2) / log log n < x

    ⇒ x ∈ Ω(log n / log log n)
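In code (a minimal sketch; the helper name T_fixed_divisor and the double-based loop are mine):

#include <math.h>

int T_fixed_divisor(int n) {
    double m = log2(n);     // log2 of the *initial* n, never updated
    double x = n;
    int count = 0;
    for (; x > 2; ++count)
        x = x / m;          // about log(n)/log(log(n)) iterations
    return count;
}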

Improving the upper bound: O(log n) → O(log n / log log n)

Now let us try to improve the upper bound. Instead of dividing n by a fixed constant (namely 2 in the proof above), we divide n by half the initial value of log₂(n) for as long as the current value of log₂(n) is bigger than that. To make this clearer, have a look at the modified code:

#include <math.h>

int T2(int n){
     int n_old = n;
     int count = 0;
     for(; n > 2; ++count)
     {
         n = n / (log2(n_old) / 2);

         if(log2(n) <= log2(n_old) / 2)
         {
            n_old = n;     // log2(n) has halved: pick a new divisor
         }
     }
     return count;
}

The complexity of the function T₂ is clearly an upper bound for the function T, since log₂(n_old)/2 < log₂(n) holds at all times: T₂ always divides by a smaller value than T does, and therefore needs at least as many iterations.

Now we need to know how many times we divide by each value ½·log(n_old):

    n / (log(sqrt(n)))^x ≤ sqrt(n)
⇔          n / sqrt(n) ≤ (log(sqrt(n)))^x
⇔        log(sqrt(n)) ≤ x·log(log(sqrt(n)))

⇔  log(sqrt(n)) / log(log(sqrt(n))) ≤ x

So we get the recurrence formula T₂(n) = T₂(sqrt(n)) + O(log(sqrt(n)) / log(log(sqrt(n)))).

Now we need to know how often this formula has to be expanded until n < 2 holds.

      n^(2^(-x)) < 2
⇔  2^(-x)·log n < 1        (taking log₂; note log 2 = 1 in base 2)
⇔          log n < 2^x
⇔      log log n < x        (taking log₂ again)

So we need to expand the formula about log log n times.
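A quick numeric check (my addition): repeatedly taking square roots, one per expansion of the formula, reaches 2 after about log₂(log₂ n) steps.

#include <stdio.h>
#include <math.h>

int main(void) {
    double n = 1e30;
    int x = 0;
    while (n > 2) { n = sqrt(n); ++x; }   // one step per expansion
    printf("x = %d, log2(log2(1e30)) = %.2f\n", x, log2(log2(1e30)));
    return 0;   // prints x = 7, log2(log2(1e30)) = 6.64
}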

Now it gets a little bit harder. (Also have a look at Mike_Dog's answer.)

T₂(n) = T₂(sqrt(n)) + log(sqrt(n)) / log(log(sqrt(n)))
      = Σ_{k=1}^{log log n - 1} 2^(-k)·log(n) / log(2^(-k)·log n)
      = log(n) · Σ_{k=1}^{log log n - 1} 2^(-k) / (log log n - k)
(1)   = log(n) · Σ_{k=1}^{log log n - 1} 2^(k - log log n) / k
      = log(n) · Σ_{k=1}^{log log n - 1} 2^k · 2^(-log log n) / k
      = log(n) · Σ_{k=1}^{log log n - 1} 2^k / (k · log n)
      = Σ_{k=1}^{log log n - 1} 2^k / k

In the line marked with (1) I reordered the sum by substituting k → log log n - k.

So, at the end we "only" have to calculate Σ_{k=1}^{t} 2^k / k for t = log log n - 1. At this point Maple solves this to

Σ_{k=1}^{t} 2^k / k = -I·π - 2^t·LerchPhi(2, 1, t) + 2^t/t

where I is the imaginary unit and LerchPhi is the Lerch transcendent. Since the result for the sum above is a real number for all relevant cases, we can just ignore all imaginary parts. The Lerch transcendent LerchPhi(2,1,t) seems to be in O(-1/t), but I'm not 100% sure about it. Maybe someone will prove this.
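Independent of the Lerch transcendent, a small numeric experiment (my addition) also supports Σ_{k=1}^{t} 2^k/k ∈ O(2^t/t): the ratio of the partial sum to 2^t/t settles near the constant 2.

#include <stdio.h>
#include <math.h>

int main(void) {
    double sum = 0.0;
    for (int t = 1; t <= 50; ++t) {
        sum += pow(2.0, t) / t;            // partial sum of 2^k/k
        if (t % 10 == 0)
            printf("t = %2d  sum / (2^t/t) = %.4f\n",
                   t, sum / (pow(2.0, t) / t));
    }
    return 0;   // the ratio tends to 2, so the sum is O(2^t/t)
}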

Finally, using t = log log n - 1 (so that 2^t = log(n)/2), this results in

T₂(n) = -2^t·O(-1/t) + 2^t/t = O(2^t/t) = O(log n / log log n)

All together we have T(n) ∈ Ω(log n / log log n) and T(n) ∈ O(log n/ log log n),
so T(n) ∈ Θ(log n/ log log n) holds. This result is also supported by your example data.

I hope this is understandable and it helps a little.

Mike_Dog

Thanks to @AbcAeffchen for the answer.

I'm the author of the question. Using the master method, which I learned yesterday, the "a little bit harder" part of the proof can be done simply, as follows.

I will start here:

T(n) = T(sqrt(n)) + O(log(sqrt(n)) / log(log(sqrt(n))))
⇔ T(n) = T(sqrt(n)) + O(log n / log log n)

(the O-terms agree because log(sqrt(n)) = log(n)/2 and log(log(sqrt(n))) = log log n - 1)

Let

n = 2^k,  S(k) = T(2^k)

then we have

T(2^k) = T(2^(k/2)) + O(log 2^k / log log 2^k)  ⇔  S(k) = S(k/2) + O(k / log k)

with the master method

S(k) = a·S(k/b) + f(k),  where a = 1, b = 2, f(k) = k/log k = Ω(k^(log₂ 1 + ε)) = Ω(k^ε),

as long as ε ∈ (0, 1),

so we can apply case 3 (the regularity condition a·f(k/b) ≤ c·f(k) holds here with, e.g., c = 3/4 for large k). Then

S(k) = Θ(k/log k)

T(n) = S(k) = O(k/log k) = O(log n/ log log n)
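As a numeric sanity check (my addition, not part of the original answer), iterating the recurrence S(k) = S(k/2) + k/log₂(k) directly shows the ratio S(k)/(k/log₂ k) approaching a constant, consistent with case 3:

#include <stdio.h>
#include <math.h>

double S(double k) {
    if (k <= 2) return 0;              // base case
    return S(k / 2) + k / log2(k);     // S(k) = S(k/2) + k/log k
}

int main(void) {
    for (double k = 1e3; k <= 1e15; k *= 1e3)
        printf("k = %.0e  S(k)/(k/log2 k) = %.4f\n",
               k, S(k) / (k / log2(k)));
    return 0;
}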

AudioBubble

The crux of verifying the conjectured estimate is to get a good estimate of what you get by plugging the value

n / log(n)

into the function

n --> log(n) / log(log(n))

Theorem:

log( n/log(n) ) / log(log( n/log(n) )) = log(n)/log(log(n)) - 1 + o(1)

(in case of font readability issues, that's little-oh, not big-oh)

Proof:

To save on notation, write

A = n
B = log(n)
C = log(log(n))

The work is based on the first-order approximation to the (natural) logarithm: when 0 < y < x,

log(x) - y/(x - y) < log(x - y) < log(x) - y/x < log(x)

(both estimates follow from the mean value theorem: log(x) - log(x - y) = y/ξ for some ξ between x - y and x).

The value we're trying to estimate is

log(A/B) / log(log(A/B)) = (B - C) / log(B - C)

Applying the bounds for the logarithm of a difference (with x = B, y = C) gives

(B-C) / log(B) < (B-C) / log(B-C) < (B-C) / (log(B) - C/(B-C))

that is, since log(B) = C,

(B-C) / C < (B-C) / log(B-C) < (B-C)² / (C (B-C-1))

Both the recursion we're trying to satisfy and the lower bound suggest we should estimate this with B/C - 1. Pulling that off of both sides gives

B/C - 1 < (B-C) / log(B-C) < B/C - 1 + (B-C)/(C(B-C-1))

and thus, because (B-C)/(C(B-C-1)) ≈ 1/C tends to 0 as n grows, we conclude

(B-C) / log(B-C) = B/C - 1 + o(1)

If you take away one idea from this analysis to use on your own, let it be the use of differential approximations (or even higher-order Taylor series) to replace complicated functions with simpler ones. E.g., once you have the idea to use

log(x - y) = log(x) - Θ(y/x) when y = o(x)

then all of the algebraic calculations you need for your problem simply follow directly.
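For a numeric illustration of the theorem (my addition): with natural logarithms and f(n) = log(n)/log(log(n)), the difference f(n/log n) − (f(n) − 1) visibly shrinks toward 0.

#include <stdio.h>
#include <math.h>

double f(double n) { return log(n) / log(log(n)); }

int main(void) {
    for (double n = 1e6; n < 1e301; n *= 1e49)
        printf("n = %.0e  f(n/log n) - (f(n) - 1) = %.5f\n",
               n, f(n / log(n)) - (f(n) - 1.0));
    return 0;   // the difference tends to 0, matching the o(1) term
}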