Asymptotic lower bounding does not work. Why?

OK, so this is not a homework question, obviously, but here is the thing: the bubble sort algorithm is said to be O(n^2) and Ω(n). However, if we plot its running time (average case) as a graph and try to lower-bound it, a more accurate lower bound would seem to be Ω(n^2). Yet contextually we are told that Ω(n) is correct. So why doesn't lower-bounding the algorithm's runtime this way work?

There are 3 answers

Matt Timmermans (BEST ANSWER)

I think you're mixing up concepts:

  1. Lower bound vs upper bound: Ω(f(n)) is a lower bound, and O(f(n)) is an upper bound.

  2. Best vs worst case: Unless otherwise stated, both Ω(f(n)) and O(f(n)) refer to the worst case running time, as a function of the input size.

For bubble sort, the worst case input of size n is guaranteed to take quadratic time, so bubble sort is O(n^2) and Ω(n^2).

The reason "bubble sort is said to be Ω(n)" is that a lot of people mess this up.

Remember that Ω(f(n)) is the set of functions that are asymptotically bounded below by f(n), and when you see "Algorithm X is Ω(f(n))", read it as "the worst case running time of Algorithm X is in Ω(f(n))".

(Note, however, that Dmitriy's answer is also correct: because Ω is a lower bound, Ω(1), Ω(log n), Ω(n), Ω(n log n), and Ω(n^2) all apply.)
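
A quick way to see all of this concretely is to count bubble sort's basic operations on a best-case and a worst-case input. This is only a minimal sketch (the name bubble_sort_counts is just for illustration):

    def bubble_sort_counts(a):
        """Bubble sort with early exit; returns (comparisons, swaps)."""
        a = list(a)
        comparisons = swaps = 0
        n = len(a)
        for i in range(n - 1):
            swapped = False
            for j in range(n - 1 - i):
                comparisons += 1
                if a[j] > a[j + 1]:
                    a[j], a[j + 1] = a[j + 1], a[j]
                    swaps += 1
                    swapped = True
            if not swapped:   # no swaps in a full pass: the list is sorted
                break
        return comparisons, swaps

    n = 1000
    print(bubble_sort_counts(range(n)))         # best case:  (999, 0), about n
    print(bubble_sort_counts(range(n, 0, -1)))  # worst case: (499500, 499500), about n^2/2

The worst case is guaranteed to grow quadratically, which is why the worst-case running time is in both O(n^2) and Ω(n^2); the Ω(n) figure describes only the best case.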

Dmitriy Dobrotvorskiy

Ω(n) is by definition a lower bound, and a lower bound does not have to be tight. It just guarantees that the algorithm performs no better than linearly.

Ω(1) and Ω(log(n)) are also valid lower bounds for bubble sort's execution time. Not very informative, but still valid.
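
For reference, this follows directly from the usual definition of Ω; here is a sketch in standard notation, with c and n_0 the conventional witness constants:

    % f is eventually bounded below by a constant multiple of g:
    f(n) \in \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : f(n) \ge c \cdot g(n) \ \text{for all } n \ge n_0

    % so a quadratic lower bound implies every weaker one:
    T(n) \in \Omega(n^2) \implies T(n) \in \Omega(n \log n) \implies T(n) \in \Omega(n) \implies T(n) \in \Omega(\log n) \implies T(n) \in \Omega(1)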

Stef

The following statements are all true:

  • For every number n, bubble sort never performs more than n(n-1)/2 comparisons and n(n-1)/2 swaps on a list of length n
  • For every number n, there exists a list of length n on which bubble sort performs exactly n(n-1)/2 comparisons and n(n-1)/2 swaps (example: a list sorted in reverse order)
  • For every number n, bubble sort never performs fewer than (n-1) comparisons on a list of length n
  • For every number n, there exists a list of length n on which bubble sort performs exactly (n-1) comparisons and 0 swaps (example: a sorted list)
  • For every number n, if we take the list [1,2,...,n] and shuffle it uniformly at random, then the expected number of swaps of bubble sort is n(n-1)/4 (explanation: this answer on CS.stackexchange; see also the sketch right after this list)
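
The last bullet is easy to check empirically: bubble sort performs exactly one swap per inversion (a pair of elements in the wrong relative order), so the expected number of swaps equals the expected number of inversions of a random permutation. A minimal sketch (the name inversions is just for illustration):

    import random
    from itertools import combinations

    def inversions(a):
        # number of out-of-order pairs; bubble sort does one swap per inversion
        return sum(a[i] > a[j] for i, j in combinations(range(len(a)), 2))

    n, trials = 100, 500
    avg = sum(inversions(random.sample(range(n), n)) for _ in range(trials)) / trials
    print(avg, n * (n - 1) / 4)   # both close to 2475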

Conclusion: if we forget about the multiplicative constants, the worst-case running time is n^2, the best-case running time of bubble sort is n, and the average running time is n^2.

Note that "the average running time" is always a little ambiguous. I stated it differently in my last bullet point to remove the ambiguity. If your random lists follow a different probability distribution, then you will get a different result. In particular, bubble sort will need fewer swaps (and maybe fewer comparisons) on a list that has lots of duplicate elements. But if you don't care about the exact multiplicative constant, then most "reasonable" ways to generate "random lists" will lead to an expected complexity of n^2.

Now, your question might become: Why is the expected running time of bubble sort so close to its worst-case running time, and so far from its best-case running time?

The answer to this question is: the vast majority of permutations are "reasonably well shuffled", and only a small number are "almost sorted". In other words, if you shuffle a list at random, the result will "look random". If you shuffle a deck of cards, the probability that you accidentally sort the cards is vanishingly small.
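
For a sense of scale with the deck-of-cards example (n = 52, a standard counting fact):

    \Pr[\text{a uniformly random permutation of } n \text{ items is sorted}] = \frac{1}{n!}, \qquad \frac{1}{52!} \approx 1.2 \times 10^{-68}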

The question is also a bit deceptive. Look at the swap counts: the worst case is n(n-1)/2 swaps, the expected number is n(n-1)/4 swaps, and the best case is 0 swaps (with n-1 comparisons). So actually, the expected number of swaps is exactly half-way between the worst case and the best case.

The answer on CS.stackexchange gives a good explanation as well: of all pairs of elements in the list, we expect half the pairs to be in the correct relative order, and half the pairs to be in reverse relative order. So the expected number of swaps, and with it the expected running time, is exactly half of the worst case.
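
In symbols, this is the standard linearity-of-expectation argument, with I the number of inversions (= swaps) of a uniformly random permutation:

    % each of the \binom{n}{2} pairs is inverted with probability 1/2:
    \mathbb{E}[I] = \sum_{1 \le i < j \le n} \Pr[\text{pair } (i,j) \text{ is inverted}] = \binom{n}{2} \cdot \frac{1}{2} = \frac{n(n-1)}{4}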