Time Complexity of Algorithm


Here is my code for sorting a list in ascending order. I have used a function within a function, and now I want to calculate the time complexity of this function. My reasoning so far: the function "unite" is called every time the function "sort" finishes splitting the list, so both functions run on every call. From that I concluded that the complexity of this function is O(nlog(n)). I am new to this chapter, so I want to know how to calculate this type of complexity. The answer above is just my approximation; I have neither the real answer nor any hints, so please explain your reasoning in your answer. Thanks. Here is my code.

def sort(lst):
    def unite(l1, l2):
        if len(l1) == 0:
            return l2
        elif len(l2) == 0:
            return l1
        elif l1[0] < l2[0]:
            return [l1[0]] + unite(l1[1:], l2)
        else:
            return [l2[0]] + unite(l1, l2[1:])

    if len(lst) == 0 or len(lst) == 1:
        return lst
    else:
        front = sort(lst[:len(lst) // 2])  # integer division, so this also runs on Python 3
        back = sort(lst[len(lst) // 2:])

        return unite(front, back)
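For a quick sanity check, here is a self-contained Python 3 copy of the function above (with integer division written as `//`), together with an example call. This is only illustrative; it adds nothing beyond the code in the question:

```python
# Python 3 copy of the question's merge-sort-style function.
def sort(lst):
    def unite(l1, l2):
        if len(l1) == 0:
            return l2
        elif len(l2) == 0:
            return l1
        elif l1[0] < l2[0]:
            return [l1[0]] + unite(l1[1:], l2)
        else:
            return [l2[0]] + unite(l1, l2[1:])

    if len(lst) <= 1:
        return lst
    front = sort(lst[:len(lst) // 2])  # // so the slice index is an int
    back = sort(lst[len(lst) // 2:])
    return unite(front, back)

print(sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]
```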

There are 3 answers

AAA

The first step is to note that the real work is being done in the unite step of your code, which does n^2 work on lists of total length n, because slicing and concatenation create new lists at every recursive call.

So you can actually write a quick recurrence for the amount of work your function is doing:

W(n) = 2W(n/2) + n^2

because you're recursing twice on lists that are of length n/2 and doing n^2 work to rejoin them.

Now, consider a recursion tree - at a certain level of the tree (call it level i), you're doing 2^i * (n/2^i)^2 = n^2 / 2^i work. The per-level work halves at each deeper level, so the sum over all log(n) levels is a geometric series dominated by the root, and the total is O(n^2) work. (This is also what the master theorem gives for W(n) = 2W(n/2) + n^2.)
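As a quick numeric sketch (not part of the original answer), the per-level sums of the recursion tree can be tabulated to see the geometric decrease:

```python
# Tabulate per-level work for W(n) = 2*W(n/2) + n^2 on a power-of-two input.
# Level i has 2**i subproblems of size n // 2**i, each costing (n // 2**i)**2,
# so level i contributes n**2 // 2**i in total.
n = 1024
levels = n.bit_length() - 1  # log2(n) when n is a power of two
per_level = [2**i * (n // 2**i) ** 2 for i in range(levels + 1)]
total = sum(per_level)
print(per_level[:4])    # each level does half the work of the one above
print(total, 2 * n**2)  # the total stays below 2*n**2
```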

However, there is a way to write your unite function so it runs much faster, in O(n) time, by merging with indices instead of slicing and concatenating. In that case, you'd be doing (by a similar analysis as above) O(nlog(n)) work.
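As a sketch of that idea (the name `unite_linear` is invented for this example; it is not from the question), an index-based merge that avoids the per-call copying looks like this:

```python
# O(n) merge: walk both lists with indices and append, instead of
# rebuilding lists with slicing and concatenation on every call.
def unite_linear(l1, l2):
    merged = []
    i = j = 0
    while i < len(l1) and j < len(l2):
        if l1[i] < l2[j]:
            merged.append(l1[i])
            i += 1
        else:
            merged.append(l2[j])
            j += 1
    merged.extend(l1[i:])  # at most one of these two tails is non-empty
    merged.extend(l2[j:])
    return merged

print(unite_linear([1, 3, 5], [2, 4, 6]))  # [1, 2, 3, 4, 5, 6]
```

With this merge, each element is appended exactly once, so one call on lists of total length n costs O(n).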

Rajesh Surana

The time complexity (not the space complexity) of the above code is O(nlog(n)).

There are n elements in the list at the start, and we recursively halve the list into front and back each time, which gives O(log(n)) levels of division. At each of those O(log(n)) levels, the unite function compares each element of l1 and l2 only once, which makes the merging work at each level O(n).

Hence, O(log(n)) levels of division with O(n) of merging per level give this algorithm a time complexity of O(nlog(n)).

Other answers discuss the space complexity of the unite function, which is O(n^2), but the question title clearly asks about time complexity, not space complexity.

Q Yang

The time for sort to execute on an n-element vector is T(n).

    T(n) = 2 T(n/2) + n

         = 2 [2 T(n/4) + n/2] + n

         = 4 T(n/4) + 2n

         = 4 [2 T(n/8) + n/4] + 2n

         = 8 T(n/8) + 3n
         ........

         = 2^k T(n/2^k) + k n

         = 2^(log2 n) T(1) + (log2 n) n    taking k = log2 n, and T(1) = O(1)

         = n + n log2 n

         = O(n log n)
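The unrolled recurrence can be checked numerically. The following sketch (not part of the original answer) assumes T(1) = 1 and compares T(n) with the closed form n + n*log2(n) for powers of two:

```python
# Evaluate T(n) = 2*T(n/2) + n with base case T(1) = 1, and compare it
# with the closed form n + n*log2(n) derived above.
def T(n):
    return 1 if n == 1 else 2 * T(n // 2) + n

for k in range(1, 11):
    n = 2 ** k
    closed_form = n + n * k  # n + n*log2(n), since log2(n) == k here
    assert T(n) == closed_form
print("T(n) matches n + n*log2(n) for n up to 1024")
```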

There is an easy way to remember the complexity of recursive sorting functions: T(selection sort) = O(n^2), and T(merge sort) = O(nlogn). Your code is clearly of the merge sort type.