Running time of Kruskal's algorithm as the sorting time varies


I was doing an analysis of minimum spanning trees and was wondering how the sorting time would affect the overall time complexity of Kruskal's algorithm.

Example:

  1. If sorting can be done in O(n log log n)
  2. If sorting can be done in O(n)

Would the answer still remain O(e log n) in both cases, or would it change?


There are 2 answers

Ahmad

Kruskal's algorithm runs in O(e log e) time, and that is the time spent sorting the edges. If you can sort in O(e), then, since the rest of the algorithm (building the minimum spanning tree) takes O(e log n), you get O(e) + O(e log n) = O(e log n); since e = O(n^2), that is at most O(n^2 log n). If sorting takes O(e log log e), the simple analysis below still gives O(e log n) overall, because the loop dominates; only with the faster disjoint-set structure mentioned at the end does the total drop to O(e log log e).

Details: the time for finding the minimum spanning tree is the time needed to sort the edges plus a loop of e iterations, in which you take each edge from the sorted list and check whether it connects two disjoint regions. With a straightforward implementation this check takes O(log n), so the loop takes O(e log n), as mentioned above.

Using a more sophisticated disjoint-set data structure to find and merge the disjoint regions, the loop takes O(E α(V)) time, where α is the extremely slowly growing inverse of the single-valued Ackermann function (Wikipedia). In that case the sorting step becomes the dominant term.
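For concreteness, here is a minimal Python sketch (my own illustration, not code from this answer; the edge format and function names are assumptions) that separates the two phases whose costs are being compared: the edge sort and the union-find loop with union by rank and path compression.

```python
def kruskal(num_vertices, edges):
    """edges: list of (weight, u, v) tuples; returns the MST edge list."""
    # Phase 1: sort the edges -- O(E log E) with a comparison sort,
    # or faster if the weights allow a non-comparison sort.
    edges = sorted(edges)

    # Disjoint-set forest with union by rank and path compression,
    # so each find/union is amortized O(alpha(V)).
    parent = list(range(num_vertices))
    rank = [0] * num_vertices

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(x, y):
        rx, ry = find(x), find(y)
        if rx == ry:
            return False              # same component: edge would form a cycle
        if rank[rx] < rank[ry]:
            rx, ry = ry, rx
        parent[ry] = rx               # union by rank
        if rank[rx] == rank[ry]:
            rank[rx] += 1
        return True

    # Phase 2: the loop over edges -- E iterations of near-constant
    # union-find work, O(E alpha(V)) in total.
    mst = []
    for weight, u, v in edges:
        if union(u, v):
            mst.append((u, v, weight))
    return mst
```

With this structure each find/union is amortized nearly constant, which is why the sorting phase becomes the deciding factor once it is the slower of the two.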

uchar

If you use a disjoint-set (union-find) data structure to implement Kruskal's algorithm, the complexity will be SortComplexity + O(E α(E)).

(E is the number of edges, and α is a very slowly growing function; according to Wikipedia it is less than 5 for any practical value of n.)

So if sorting can be done in O(n), the complexity of Kruskal's algorithm will be O(E α(E)).

And if the sorting complexity is O(n log log n), the complexity of Kruskal's algorithm will be O(E log log E); for a dense graph that is O(V^2 log log V), where V is the number of vertices.
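Purely as an illustration of the "sorting can be done in O(n)" premise (this is an assumption I am adding, not something stated in the answer): if the edge weights happen to be small integers in a range 0..W, a counting sort orders them in O(E + W), after which the union-find loop's O(E α(E)) dominates.

```python
def counting_sort_edges(edges, max_weight):
    """Sort (weight, u, v) edges with integer weights in [0, max_weight]
    in O(E + max_weight) time using counting sort (bucket per weight)."""
    buckets = [[] for _ in range(max_weight + 1)]
    for edge in edges:
        buckets[edge[0]].append(edge)
    return [edge for bucket in buckets for edge in bucket]
```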