Asymptotic complexity for typical expressions


The increasing order of the following functions (shown in the picture below), in terms of asymptotic complexity, is:

(A) f1(n); f4(n); f2(n); f3(n)

(B) f1(n); f2(n); f3(n); f4(n);

(C) f2(n); f1(n); f4(n); f3(n)

(D) f1(n); f2(n); f4(n); f3(n)

[image: definitions of f1(n), f2(n), f3(n), f4(n); f1(n) = (n^0.9999)(log n)]

a) The time complexity ordering for this easy question was given as (n^0.9999)(log n) < n. How? log might be a slow-growing function, but it still grows faster than a constant.

b) Consider function f1: suppose it is f1(n) = (n^1.0001)(log n). Then what would be the answer?

Whenever an expression involves a product of a logarithmic and a polynomial factor, does the logarithmic factor outweigh the polynomial factor?

c) How to check in such cases? Suppose:

1) (n^2)(log n) vs (n^1.5): which has higher time complexity?
2) (n^1.5)(log n) vs (n^2): which has higher time complexity?

(A quick numeric check is sketched below.)
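A quick numeric sanity check for c) (a rough sketch in Python; the sample values of n are arbitrary) is to watch how the two ratios behave as n grows:

```python
import math

# If (n^2 * log n) / n^1.5 keeps growing, the first expression dominates;
# if (n^1.5 * log n) / n^2 shrinks toward 0, the second expression dominates.
for n in (10**3, 10**6, 10**9, 10**12):
    r1 = (n**2 * math.log(n)) / n**1.5
    r2 = (n**1.5 * math.log(n)) / n**2
    print(f"n=10^{round(math.log10(n)):>2}:  (n^2 log n)/n^1.5 = {r1:.3e}   (n^1.5 log n)/n^2 = {r2:.3e}")
```

A few sample points are evidence rather than proof, but the trend matches the asymptotic argument in the accepted answer: the first ratio diverges and the second one vanishes.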


There are 3 answers

John M (best answer):

If we consider C_1 and C_2 such that C_1 < C_2, then we can say the following with certainty

(n^C_2)*log(n) grows faster than (n^C_1)

This is because (n^C_1) grows slower than (n^C_2) (obviously)

also, for values of n larger than 2 (with log in base 2), log(n) is greater than 1.

in fact, log(n) is asymptotically greater than any constant C,
because log(n) -> inf as n -> inf

if (n^C_2) is asymptotically greater than (n^C_1) AND log(n) is asymptotically greater
than 1, then we can certainly say that
(n^2)log(n) has greater complexity than (n^1.5)

We think of log(n) as a "slowly growing" function, but it still eventually exceeds any constant, which is the key here.
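A symbolic sanity check of that comparison (a sketch assuming SymPy is installed; 1.5 and 2 are just the exponents from the question):

```python
import sympy as sp

n = sp.symbols('n', positive=True)

# (n^2 * log n) / n^1.5 -> oo, so n^2 * log n strictly dominates n^1.5 ...
print(sp.limit(n**2 * sp.log(n) / n**sp.Rational(3, 2), n, sp.oo))   # oo

# ... while (n^1.5 * log n) / n^2 -> 0, so n^2 dominates n^1.5 * log n.
print(sp.limit(n**sp.Rational(3, 2) * sp.log(n) / n**2, n, sp.oo))   # 0
```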


coder101 asked an interesting question in the comments, essentially,

is n^e = Ω((n^c)*log_d(n))?
where e = c + ϵ for arbitrarily small ϵ

Let's do some algebra.

n^e = (n^c)*(n^ϵ)
so the question boils down to
is n^ϵ = Ω(log_d(n))
or is it the other way around, namely:
is log_d(n) = Ω(n^ϵ)

In order to do this, let us find the value of ϵ that satisfies n^ϵ > log_d(n).

n^ϵ > log_d(n)
ϵ*ln(n) > ln(log_d(n))
ϵ > ln(log_d(n)) / ln(n)

Because we know for a fact that, for any constant c > 0 and all sufficiently large n,

c * ln(n) > ln(ln(n))        (1)

we can say that, for an arbitrarily small ϵ, there exists an n large enough to
satisfy ϵ > ln(log_d(n)) / ln(n),

because ln(log_d(n)) = ln(ln(n)) - ln(ln(d)) differs from ln(ln(n)) only by a constant, so by (1), ln(log_d(n)) / ln(n) ---> 0 as n -> infinity.

With this knowledge, we can say that

n^ϵ = Ω(log_d(n))
for arbitrarily small ϵ

which means that
n^(c + ϵ) = Ω((n^c)*log_d(n))
for arbitrarily small ϵ.
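To watch that key limit numerically, it is easier to work with L = ln(n) directly, because the interesting values of n are far beyond floating-point range (a rough sketch in Python, assuming log base d = 2 and ϵ = 10^-4):

```python
import math

d = 2        # assumed log base
eps = 1e-4   # an "arbitrarily small" epsilon

# n^eps > log_d(n)  <=>  eps * L > ln(L / ln(d)),  where L = ln(n).
# The ratio ln(log_d(n)) / ln(n) should tend to 0 as L grows.
for L in (10, 10**2, 10**4, 10**6, 10**8):
    ratio = math.log(L / math.log(d)) / L
    holds = eps * L > math.log(L / math.log(d))
    print(f"ln(n) = {L:>9}:  ln(log_d n)/ln(n) = {ratio:.2e}   n^eps > log_d(n)? {holds}")
```

In this run the inequality only starts holding once ln(n) is somewhere between 10^4 and 10^6, which is exactly the "much, much bigger n" effect described below.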

in layperson's terms:

n^1.1 > n * ln(n)
for all n beyond some point

also

n^1.001 > n * ln(n)
for all n beyond some much, much bigger point

and even

n^1.0000000000000001 > n * ln(n)
for all n beyond some very, very big point.
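To get a feel for how big "much, much bigger" is, here is a rough sketch (in Python, working with L = ln(n) so the numbers stay representable) that locates the crossover of the middle example, n^1.001 versus n * ln(n):

```python
import math

# n^1.001 > n * ln(n)  <=>  0.001 * ln(n) > ln(ln(n)).  With L = ln(n), find where
# f(L) = 0.001*L - ln(L) turns positive for good, by bisection on [10, 10^6]
# (f(10) < 0, f(10^6) > 0, and f crosses zero exactly once on that interval).
def f(L):
    return 0.001 * L - math.log(L)

lo, hi = 10.0, 1e6
while hi - lo > 1e-6:
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if f(mid) < 0 else (lo, mid)

L = (lo + hi) / 2
print(f"crossover at ln(n) ~ {L:.0f}, i.e. n ~ 10^{L / math.log(10):.0f}")
# -> ln(n) is roughly 9100, i.e. n is roughly 10^3960: a much, much bigger n indeed.
```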
A.S.H:

Replacing f1 = (n^0.9999)(logn) by f1 = (n^1.0001)(logn) will yield answer (C): n, (n^1.0001)(logn), n^2, 1.00001^n

The reasoning is as follows:

- (n^1.0001)(log n) has higher complexity than n: obvious.

- n^2 is higher than (n^1.0001)(log n) because the polynomial part asymptotically dominates the logarithmic part, so the higher-degree polynomial n^2 wins.

- 1.00001^n dominates n^2 because 1.00001^n has exponential growth while n^2 has polynomial growth, and exponential growth asymptotically wins.

BTW, 1.00001^n looks a little like the members of a family called "sub-exponential" growth, usually denoted (1+ε)^n. Still, however small ε is, such growth still dominates any polynomial growth.
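To illustrate that last point numerically (a sketch in Python; the comparison is done in log-space so nothing overflows, and the search bounds are arbitrary), 1.00001^n trails n^2 for a long time and then overtakes it for good:

```python
import math

# 1.00001^n > n^2  <=>  n * ln(1.00001) > 2 * ln(n)
def exp_wins(n):
    return n * math.log(1.00001) > 2 * math.log(n)

print(exp_wins(10**5))   # False: n^2 is still ahead
print(exp_wins(10**7))   # True:  the exponential has taken over

# Bisect between those two points to locate the crossover.
lo, hi = 10**5, 10**7
while hi - lo > 1:
    mid = (lo + hi) // 2
    lo, hi = (lo, mid) if exp_wins(mid) else (mid, hi)
print(hi)                # roughly 3 million; beyond this n, 1.00001^n > n^2 forever
```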

Downhillski:

The crux of this problem lies in comparing f1(n) with f2(n).

For f(n) = n^a where 0 < a < 1, the curve's growth will eventually become so slow that it is trivial compared with a linear growth curve.

For f(n) = log_b(n) where b > 1, the curve's growth will likewise eventually become trivial compared with a linear growth curve.

The product of these two functions also eventually becomes trivial compared with a linear growth curve: n^a * log_b(n) grows slower than n^a' for any a' with a < a' < 1, and hence slower than n.

Hence, Theta(n^a * log_b(n)) is asymptotically less complex than Theta(n).
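For the concrete f1(n) = (n^0.9999)(log n) from the question, the crossover point is astronomically far out, which is why the ordering feels counter-intuitive. A rough sketch (in Python, assuming log base 2 for f1 and working with L = ln(n), since such n are far beyond floating-point range):

```python
import math

# n^0.9999 * log2(n) < n  <=>  0.9999*L + ln(L / ln(2)) < L  <=>  ln(L / ln(2)) < 0.0001*L,
# where L = ln(n).  Bisect to find where the inequality starts to hold.
def f1_smaller_than_n(L):
    return math.log(L / math.log(2)) < 0.0001 * L

lo, hi = 10.0, 1e7   # f1 is still ahead at ln(n) = 10, behind by ln(n) = 10^7
while hi - lo > 1e-3:
    mid = (lo + hi) / 2
    lo, hi = (lo, mid) if f1_smaller_than_n(mid) else (mid, hi)

print(f"f1(n) drops below n once ln(n) ~ {hi:.0f}, i.e. n ~ 10^{hi / math.log(10):.0f}")
# -> roughly n ~ 10^52000: asymptotically smaller than n, but only for unimaginably large n.
```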