Finding the recurrence relation of an algorithm, then using the master theorem to find its runtime complexity


I am trying to find the recurrence relation of the following function (pseudocode):

    def algorithm(list):
        if list.length <= 1:                     // base case: stop the recursion
            return list
        left_list = []
        right_list = []
        for i = 0 to list.length - 1:
            if i < list.length / 2:
                left_list.add(list.get(i))       // get runs in O(n) time
            else:
                right_list.add(list.get(i))      // get runs in O(n) time
        left = algorithm(left_list)
        right = algorithm(right_list)
        return combine(left, right)              // combine runs in O(1) time
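To convince myself the structure is right, here is a runnable Python sketch of the pseudocode. Python list indexing is O(1), so the `steps` counter is my own bookkeeping (not part of the pseudocode) to tally the work an O(n) `get` would do, assuming `get(i)` scans `i + 1` elements.

```python
steps = 0  # hypothetical counter for the cost of each O(n) get

def algorithm(lst):
    global steps
    if len(lst) <= 1:
        return lst                      # base case: stop the recursion
    left_list, right_list = [], []
    for i in range(len(lst)):
        steps += i + 1                  # an O(n) get(i) would scan i + 1 elements
        if i < len(lst) / 2:
            left_list.append(lst[i])
        else:
            right_list.append(lst[i])
    left = algorithm(left_list)
    right = algorithm(right_list)
    return left + right                 # '+' stands in for the O(1) combine

print(algorithm([3, 1, 4, 1, 5]))       # → [3, 1, 4, 1, 5]
```

Since `combine` here just concatenates, the output equals the input; the point is only to exercise the recursion, not to compute anything useful.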

The goal is to find the running time of the function, in Theta notation, using the master theorem, so I've gone through the following process:

Number of subproblems per recursive call: 2 --> refer to as a

Size of each subproblem relative to the input list: n/2 --> refer to as b

Work done per call outside the recursion: Theta(n^2) // n iterations of the for loop, times the O(n) get operation --> refer to the exponent as k
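A quick sanity check of that per-call work: if `get(i)` scans `i + 1` elements (an assumption about the list implementation), the exact count over the loop is 1 + 2 + ... + n = n(n+1)/2, which is Theta(n^2).

```python
# Exact per-call work under the assumption that get(i) costs i + 1 steps.
n = 1000
exact = sum(i + 1 for i in range(n))    # 1 + 2 + ... + n
assert exact == n * (n + 1) // 2        # closed form: n(n+1)/2
print(exact)                            # → 500500, roughly n^2 / 2
```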

This leads me to the following equation:

T(n) = aT(n/b) + Theta(n^k)

T(n) = 2T(n/2) + Theta(n^2)

Using the master theorem:

We want to compare log_b(a) to k

log_2(2) = 1

k = 2

1 < 2

This means we fall under the case of the master theorem that states: if log_b(a) < k, then T(n) = Theta(n^k). In my situation, that gives T(n) = Theta(n^2).
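One way I checked this conclusion numerically: evaluate the recurrence T(n) = 2T(n/2) + n^2 for powers of two (with T(1) = 1, an assumed base value), and watch T(n)/n^2 settle toward a constant, as Theta(n^2) predicts. For this recurrence the ratio approaches 2.

```python
def T(n):
    """Evaluate T(n) = 2*T(n/2) + n^2 with T(1) = 1, for n a power of two."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n * n

for n in (16, 256, 4096):
    print(n, T(n) / n**2)   # ratio approaches 2, consistent with Theta(n^2)
```

If T(n) were, say, Theta(n^2 log n), this ratio would keep growing instead of flattening out.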

This answer seems to make sense to me, but I am really looking for confirmation that I have identified all the pieces correctly and have applied the master theorem correctly.
