Return to the Section 101 Homepage.


Lab 6 (07-13)

Here is my solution to lab 6:

I began by looking at the number of moves each algorithm required to sort an already sorted list (of size 128). I knew that neither Selection sort nor Insertion sort moves any elements when sorting such a list. Going through each option, I found that only Gamma reported 0 moves, so it had to be either Selection or Insertion sort. Unfortunately (apparently due to a bug in how the program counts moves), none of the others reported 0 moves. I increased the size of the sorted list to 256 and then 512 and noticed that Epsilon consistently required two less than twice the list size (2n - 2) and Alpha one less than the list size (n - 1), while the others (Beta and Delta) required much more than this. I concluded that Alpha, Epsilon, and Gamma were the best candidates for Insertion and Selection sort.
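Counts like 0 and 2n - 2 fall out naturally from how a textbook implementation tallies moves. Here is a sketch (my own instrumented code with made-up names, not the lab program's source): an insertion sort that counts every array write makes exactly 2(n - 1) moves on a sorted list (one save and one write-back per pass), while a selection sort that skips unnecessary swaps makes none.

```java
import java.util.stream.IntStream;

public class MoveCountDemo {
    static long moves;

    // Insertion sort, counting every array write as a "move".
    // On an already sorted list the inner while-loop never runs, so
    // each pass costs exactly two moves (save the key, write it back),
    // for 2(n - 1) total -- two less than twice the list size.
    static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];  moves++;          // save the key
            int j = i - 1;
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j];  moves++;     // shift an element right
                j--;
            }
            a[j + 1] = key;  moves++;          // write the key back
        }
    }

    // Selection sort that swaps only when the minimum is out of place.
    // On an already sorted list it never swaps: 0 moves.
    static void selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++)
                if (a[j] < a[min]) min = j;
            if (min != i) {                    // skip the no-op swap
                int t = a[i]; a[i] = a[min]; a[min] = t;
                moves += 3;                    // a swap is three writes
            }
        }
    }

    public static void main(String[] args) {
        int[] sorted = IntStream.range(0, 128).toArray();

        moves = 0;
        insertionSort(sorted.clone());
        System.out.println("insertion moves: " + moves);  // 254 = 2*128 - 2

        moves = 0;
        selectionSort(sorted.clone());
        System.out.println("selection moves: " + moves);  // 0
    }
}
```

Whether a given program reports 0 or 2(n - 1) here just depends on whether it counts the key save/restore as moves, which would explain why only one of the pair showed 0.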

Knowing that Insertion and Selection sort are both O(n^2) algorithms on random input, I decided to look at run times on random lists. With a list of size 16384, I found run times (in the units the program reports) of Alpha: 5, Beta: 8, Gamma: 831, Delta: 11, and Epsilon: 642. Clearly, Gamma and Epsilon (which also had the low move counts on sorted input) were Selection and Insertion sort.

I then needed to figure out which was which. Selection sort always requires O(n^2) time, no matter the input, while Insertion sort runs in O(n) on sorted input. So I looked at the run times of Gamma and Epsilon on a sorted list of size 16384: Gamma required a time of approximately 900, while Epsilon ran in 1. I concluded that Epsilon was Insertion sort and Gamma was Selection sort.
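The same distinction shows up if you count comparisons instead of timing. A sketch (again my own code, not the lab's): on a sorted list an instrumented insertion sort does only n - 1 comparisons, while selection sort does n(n - 1)/2 no matter what the input looks like.

```java
import java.util.stream.IntStream;

public class CompareCountDemo {
    static long comparisons;

    static boolean less(int x, int y) {
        comparisons++;
        return x < y;
    }

    // Insertion sort: on sorted input the while-test fails immediately
    // for every i, so only n - 1 comparisons are made -- O(n).
    static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            while (j >= 0 && less(key, a[j])) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
        }
    }

    // Selection sort: the scan for the minimum never shortens, so it
    // makes n(n - 1)/2 comparisons on every input -- always O(n^2).
    static void selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++)
                if (less(a[j], a[min])) min = j;
            int t = a[i]; a[i] = a[min]; a[min] = t;
        }
    }

    public static void main(String[] args) {
        int n = 16384;
        int[] sorted = IntStream.range(0, n).toArray();

        comparisons = 0;
        insertionSort(sorted.clone());
        System.out.println("insertion: " + comparisons);  // 16383 = n - 1

        comparisons = 0;
        selectionSort(sorted.clone());
        System.out.println("selection: " + comparisons);  // 134209536 = n(n-1)/2
    }
}
```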

From there, I decided to distinguish the rest by run time. I knew that, on random input, Quick sort should be the quickest, Merge sort should run in O(n log n), and the most efficient version of Shell sort (which divides its gap size by 2.2 each iteration) should run in about O(n^(7/6)) (this is described in the book). Testing with random lists of size 2^k for k = 13 to 18 (that's 8192 to 262144), it became clear that Alpha was Quick sort. I also found that Beta ran faster than Delta, making it a good candidate for Shell sort. But, not being sure whether the program used the best version of Shell sort (a poor gap sequence can drive its run time up to O(n^2)), I decided to look deeper.
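For reference, the divide-by-2.2 variant of Shell sort looks like the sketch below (my own code; the only subtlety is forcing the gap sequence to hit 1, since (int)(2 / 2.2) would be 0 and skip the final pass).

```java
import java.util.Arrays;
import java.util.Random;

public class ShellSortDemo {
    // Shell sort whose gap shrinks by a factor of 2.2 each pass --
    // the variant the book reports as about O(n^(7/6)) on average.
    static void shellSort(int[] a) {
        for (int gap = a.length / 2; gap > 0;
             gap = (gap == 2) ? 1 : (int) (gap / 2.2)) {
            // One gap-insertion-sort pass over each gap-slice.
            for (int i = gap; i < a.length; i++) {
                int key = a[i];
                int j = i;
                while (j >= gap && a[j - gap] > key) {
                    a[j] = a[j - gap];
                    j -= gap;
                }
                a[j] = key;
            }
        }
    }

    public static void main(String[] args) {
        int[] a = new Random(1).ints(16384).toArray();
        int[] expected = a.clone();
        Arrays.sort(expected);
        shellSort(a);
        System.out.println(Arrays.equals(a, expected));  // true
    }
}
```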

The number of moves Merge sort performs depends only on the size of its input, not the contents (and in implementations that merge with sentinels, the same is true of the comparisons). Therefore, I created a number of random lists, each of size 16384, and ran Beta and Delta on them. The number of comparisons and moves required by Beta changed with each list, while the numbers for Delta stayed exactly the same. I concluded that Delta was, in fact, Merge sort, and Beta was Shell sort.
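The move count is input-independent because every merge copies its entire range into a temp array and back, regardless of the data. A sketch (my own instrumented top-down merge sort, hypothetical names) that demonstrates this on two different random lists of the same size:

```java
import java.util.Random;

public class MergeMoveDemo {
    static long moves;

    // Top-down merge sort.  Every merge copies its whole range into
    // the temp array and back, so the total move count is a function
    // of n alone -- the recursion shape depends only on the size.
    static void mergeSort(int[] a, int[] tmp, int lo, int hi) {
        if (hi - lo < 2) return;
        int mid = (lo + hi) >>> 1;
        mergeSort(a, tmp, lo, mid);
        mergeSort(a, tmp, mid, hi);
        int i = lo, j = mid;
        for (int k = lo; k < hi; k++) {
            if (i < mid && (j >= hi || a[i] <= a[j])) tmp[k] = a[i++];
            else tmp[k] = a[j++];
            moves++;
        }
        for (int k = lo; k < hi; k++) { a[k] = tmp[k]; moves++; }
    }

    static long countMoves(int[] a) {
        moves = 0;
        mergeSort(a, new int[a.length], 0, a.length);
        return moves;
    }

    public static void main(String[] args) {
        Random rng = new Random(42);             // arbitrary seed
        long m1 = countMoves(rng.ints(16384).toArray());
        long m2 = countMoves(rng.ints(16384).toArray());
        System.out.println(m1 == m2);            // true: same size, same moves
    }
}
```

Note that in this particular implementation the comparison count can still vary slightly between lists (a run can exhaust early); a sentinel-style merge pins that down too, which would match what the lab program showed for Delta.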

You can also elicit the identity of one of the sorts by triggering a bug in the program: its Quick sort implementation throws a StackOverflowError when asked to sort a large, already sorted list (size 16384 worked for me). A StackOverflowError results from excessive recursion, so it is reasonable to assume that this particular Quick sort makes the poor choice of using the first element of the list as the pivot. On an already sorted list, that choice leads to O(n^2) run time and, worse for the stack, a recursion depth of O(n).
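To see why, here is a sketch under that first-element-pivot assumption (my own code and names; the lab program's source is unknown) that tracks recursion depth. On a sorted list every partition strips off just one element, so the depth reaches n - 1; tens of thousands of nested frames can plausibly exhaust the JVM's default stack.

```java
import java.util.stream.IntStream;

public class QuickSortDepthDemo {
    static int depth, maxDepth;

    // Quick sort that always pivots on the first element.
    static void quickSort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        depth++;
        maxDepth = Math.max(maxDepth, depth);
        int p = partition(a, lo, hi);
        quickSort(a, lo, p - 1);
        quickSort(a, p + 1, hi);
        depth--;
    }

    // Lomuto partition around a[lo].  On sorted input the pivot is the
    // minimum of the range, so the left side is always empty and one
    // recursive call handles n - 1 elements: depth grows linearly.
    static int partition(int[] a, int lo, int hi) {
        int pivot = a[lo];
        int i = lo;
        for (int j = lo + 1; j <= hi; j++) {
            if (a[j] < pivot) {
                i++;
                int t = a[i]; a[i] = a[j]; a[j] = t;
            }
        }
        int t = a[lo]; a[lo] = a[i]; a[i] = t;
        return i;
    }

    public static void main(String[] args) {
        int n = 1000;   // kept small so this demo itself doesn't overflow
        int[] sorted = IntStream.range(0, n).toArray();
        quickSort(sorted, 0, n - 1);
        System.out.println(maxDepth);   // 999: depth is n - 1
    }
}
```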

Had the program's author chosen the pivot from the middle of the list, or used the median-of-three method (both described in the book), the algorithm would have run in a quick O(n log n) on sorted input.
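For contrast, a sketch (again my own hypothetical code) of the middle-element fix: on a sorted list the middle element is the true median, so every partition is balanced and the recursion only goes about log2(n) frames deep, nowhere near a stack overflow.

```java
import java.util.stream.IntStream;

public class MiddlePivotDemo {
    static int depth, maxDepth;

    // Quick sort that moves the middle element to the front and pivots
    // on it.  On a sorted list this pivot is the exact median, so the
    // partition splits the range in half every time.
    static void quickSort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        depth++;
        maxDepth = Math.max(maxDepth, depth);
        int mid = (lo + hi) >>> 1;
        int t = a[lo]; a[lo] = a[mid]; a[mid] = t;   // middle -> pivot slot
        int p = partition(a, lo, hi);
        quickSort(a, lo, p - 1);
        quickSort(a, p + 1, hi);
        depth--;
    }

    // Lomuto partition around a[lo].
    static int partition(int[] a, int lo, int hi) {
        int pivot = a[lo];
        int i = lo;
        for (int j = lo + 1; j <= hi; j++) {
            if (a[j] < pivot) {
                i++;
                int t = a[i]; a[i] = a[j]; a[j] = t;
            }
        }
        int t = a[lo]; a[lo] = a[i]; a[i] = t;
        return i;
    }

    public static void main(String[] args) {
        int n = 16384;   // the very size that crashed the lab program
        int[] sorted = IntStream.range(0, n).toArray();
        quickSort(sorted, 0, n - 1);
        System.out.println(maxDepth);   // stays near log2(16384) = 14
    }
}
```

Median-of-three (pivoting on the median of the first, middle, and last elements) behaves the same way on sorted input and also guards against a few other bad patterns.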

Isn't blackbox analysis fun?


