Design and Analysis of Algorithms: Quicksort
Haidong Xue, Summer 2012, at GSU
Review of insertion sort and merge sort
• Insertion sort – algorithm; worst-case number of comparisons = O(?)
• Merge sort – algorithm; worst-case number of comparisons = O(?)
Sorting Algorithms

| Algorithm      | Worst Time | Expected Time | Extra Memory |
|----------------|------------|---------------|--------------|
| Insertion sort | O(n^2)     | O(n^2)        | O(1)         |
| Merge sort     | O(n lg n)  | O(n lg n)     | O(n)         |
| Quick sort     | O(n^2)     | O(n lg n)     | O(1)         |
| Heap sort      | O(n lg n)  | O(n lg n)     | O(1)         |
Quicksort Algorithm
• Input: A[1, …, n]
• Output: A[1, …, n], where A[1] <= A[2] <= … <= A[n]
• Quicksort:
1. if (n <= 1) return;
2. Choose the pivot p = A[n]
3. Put all elements less than p on the left; put all elements larger than p on the right; put p in the middle. (Partition)
4. Quicksort(the array on the left of p)
5. Quicksort(the array on the right of p)
Quicksort Algorithm
• Quicksort example
[Figure: quicksort run on the array 2 8 7 1 3 5 6 4, showing the current pivot and the previous pivots at each level of recursion]
Quicksort Algorithm
• More detail about partition
• Input: A[1, …, n] (p = A[n])
• Output: A[1, …, k-1], A[k], A[k+1, …, n], where A[1, …, k-1] < A[k], A[k+1, …, n] > A[k], and A[k] = p
• Partition:
1. t = the tail of the "smaller" array (initially empty)
2. for i = 1 to n-1 {
       if (A[i] < p) { exchange A[t+1] with A[i]; update t to the new tail; }
   }
3. exchange A[t+1] with A[n]
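The partition steps above can be sketched in Java (a minimal sketch, not the course's official code; it uses 0-based indexing while the slides use A[1..n], and the class name is illustrative):

```java
import java.util.Arrays;

public class PartitionDemo {
    // Partition with pivot = a[hi]; returns the pivot's final index.
    // t tracks the tail of the "smaller than pivot" region, as in the slides.
    static int partition(int[] a, int lo, int hi) {
        int p = a[hi];          // pivot
        int t = lo - 1;         // tail of the smaller array (empty at first)
        for (int i = lo; i < hi; i++) {
            if (a[i] < p) {     // grow the smaller region: exchange A[t+1] with A[i]
                t++;
                int tmp = a[t]; a[t] = a[i]; a[i] = tmp;
            }
        }
        // final step: exchange A[t+1] with A[n], placing the pivot
        int tmp = a[t + 1]; a[t + 1] = a[hi]; a[hi] = tmp;
        return t + 1;
    }

    public static void main(String[] args) {
        int[] a = {2, 8, 7, 1, 3, 5, 6, 4};           // the slides' example
        int q = partition(a, 0, a.length - 1);
        System.out.println(q);                        // → 3 (pivot 4's index)
        System.out.println(Arrays.toString(a));       // → [2, 1, 3, 4, 7, 5, 6, 8]
    }
}
```

Everything left of index q is smaller than the pivot and everything right of it is larger, which is exactly what the recursive calls need.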
Quicksort Algorithm
• Partition example: A = 2 8 7 1 3 5 6 4, pivot p = 4
- i=1: A[1]=2 < 4 → exchange 2 with A[tail+1] (a self-exchange); tail = 1
- i=2: A[2]=8 → do nothing
- i=3: A[3]=7 → do nothing
- i=4: A[4]=1 < 4 → exchange 1 with A[tail+1]=8 → 2 1 7 8 3 5 6 4; tail = 2
- i=5: A[5]=3 < 4 → exchange 3 with A[tail+1]=7 → 2 1 3 8 7 5 6 4; tail = 3
- i=6: A[6]=5 → do nothing
- i=7: A[7]=6 → do nothing
- Final step: exchange A[n]=4 with A[tail+1]=8 → 2 1 3 4 7 5 6 8
Quicksort Algorithm
The final version of quicksort:

Quicksort(A, p, r) {
    if (p < r) {                  // if(n<=1) return;
        q = partition(A, p, r);   // smaller ones on the left, larger ones on the right
        Quicksort(A, p, q-1);
        Quicksort(A, q+1, r);
    }
}
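The final version translates directly into Java (a minimal self-contained sketch; the class name is illustrative and indexing is 0-based):

```java
import java.util.Arrays;

public class QuicksortDemo {
    // Quicksort(A, p, r) as on the slide.
    static void quicksort(int[] a, int p, int r) {
        if (p < r) {                      // if(n<=1) return;
            int q = partition(a, p, r);   // smaller ones on left, larger on right
            quicksort(a, p, q - 1);
            quicksort(a, q + 1, r);
        }
    }

    // Partition with pivot = a[hi], as described on the earlier slides.
    static int partition(int[] a, int lo, int hi) {
        int pivot = a[hi], t = lo - 1;
        for (int i = lo; i < hi; i++)
            if (a[i] < pivot) { t++; int tmp = a[t]; a[t] = a[i]; a[i] = tmp; }
        int tmp = a[t + 1]; a[t + 1] = a[hi]; a[hi] = tmp;
        return t + 1;
    }

    public static void main(String[] args) {
        int[] a = {2, 8, 7, 1, 3, 5, 6, 4};
        quicksort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a)); // → [1, 2, 3, 4, 5, 6, 7, 8]
    }
}
```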
Analysis of Quicksort
• Time complexity – worst case; expected
• Space complexity (extra memory) – 0 = O(1)
Analysis of Quicksort
• Worst case: the most unbalanced partition – every pivot is the largest (or smallest) element, so T(n) = T(n-1) + Θ(n) = Θ(n^2)
• Is it the worst case? Strict proof
• Expected time complexity: Θ(n lg n)
• Strict proof
Strict proof of the worst-case time complexity
Claim: if T(n) = T(n-1) + cn with T(1) = c, then T(n) = O(n^2).
Proof by induction:
1. Base case: when n = 1, T(1) = c <= c * 1^2.
2. Hypothesis: T(n-1) <= c(n-1)^2.
3. Induction step: T(n) = T(n-1) + cn <= c(n-1)^2 + cn = cn^2 - 2cn + c + cn = cn^2 - c(n-1) <= cn^2, since n >= 1.
Therefore T(n) = O(n^2).
Strict proof of the expected time complexity
• Given A[1, …, n], after sorting them to , the chance for 2 elements, A[i] and A[j], to be compared is .
• The total comparison is calculated as:
Java implementation of Quicksort
• Professional programmers DO NOT implement sorting algorithms by themselves
• We do it to practice algorithm implementation techniques and to understand those algorithms
• Code quicksort
– Sort.java // the abstract class
– Quicksort_Haydon.java // my quicksort implementation
– SortingTester.java // correctness tester
Design and Analysis of Algorithms: Review of time complexity, "in place", "stable"
Time complexity of algorithms
• Execution time? – Pros: easy to obtain; Cons: not accurate
• Number of instructions? – Pros: very accurate; Cons: calculation is not straightforward
• Number of certain operations? – Pros: easy to calculate, generally accurate; Cons: not very accurate
Asymptotic Notations
Pros? Cons?
Time complexity of algorithms – in the worst case (linear search for v in A)

for (int i = 0; i < A.length; i++)
    if (A[i] == v) return i;
return -1;

C1: cost of "assign a value"; C2: cost of "<"; C3: cost of "increment by 1"; C4: cost of "=="; C5: cost of "return"

• Loop header: C1 + (length+1)*C2 + length*C3
• Loop body: length*C4
• Return inside the loop: 0*C5 (never taken in the worst case)
• Final return: C5

Assuming C4 >> C1, C2, C3, C5:
Worst case T(n) ≈ n*C4 = Θ(n)
Worst case T(n) = C1 + C2 + C5 + n(C2 + C3 + C4) = Θ(n)
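Counting the dominant "==" operations can be sketched in Java (an illustrative class, not course code; the C4 comparisons are tallied in a counter):

```java
public class OpCountDemo {
    // Counts how many "==" comparisons (the C4 cost above) a linear search makes.
    static int comparisons;

    static int linearSearch(int[] a, int v) {
        comparisons = 0;
        for (int i = 0; i < a.length; i++) {
            comparisons++;              // one C4 per iteration
            if (a[i] == v) return i;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] a = new int[1000];        // all zeros
        // worst case: v is absent, so all n comparisons execute
        linearSearch(a, 42);
        System.out.println(comparisons); // → 1000, i.e. T(n) grows as n*C4
    }
}
```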
"in place" and "stable" in sorting algorithms

| Algorithm      | Worst Time | Expected Time | Extra Memory    | Stable |
|----------------|------------|---------------|-----------------|--------|
| Insertion sort | O(n^2)     | O(n^2)        | O(1) (in place) | Can be |
| Merge sort     | O(n lg n)  | O(n lg n)     | O(n)            | Can be |
| Quick sort     | O(n^2)     | O(n lg n)     | O(1) (in place) | Can be |
| Heap sort      | O(n lg n)  | O(n lg n)     | O(1) (in place) | No     |

Example: sort 2 3a 3b 1 3c 5 (the equal keys 3 are tagged with their original order)
• Stable: equal elements keep their relative order, e.g. 1 2 3a 3b 3c 5
• Not stable: equal elements may be reordered, e.g. 1 2 3b 3a 3c 5
Design and Analysis of Algorithms: Heapsort
Max-Heap
• A complete binary tree, and …
[Figure: three example trees – two complete ("Yes", "Yes"), one not ("No")]
• Complete: every level is completely filled, except possibly the last, which is filled from left to right
Max-Heap
• Satisfies the max-heap property: parent >= children
[Figure: max-heap with root 16; children 14 and 10; next level 8, 7, 9, 3; last node 2]
• Since it is a complete tree, it can be put into an array without losing its structure information.
Max-Heap
• Use an array as a heap
[Figure: the heap above stored in an array, indices 1–8 holding 16 14 10 8 7 9 3 2]

For the element at index i:
• Parent index = parent(i) = floor(i/2)
• Left child index = left(i) = 2*i
• Right child index = right(i) = 2*i + 1
• Last non-leaf node index = floor(length/2)

Example, i = 3: left(3) = 2*3 = 6; right(3) = 2*3+1 = 7; parent(3) = floor(1.5) = 1; last non-leaf node = floor(8/2) = 4
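The index arithmetic can be checked with a few one-line Java helpers (an illustrative sketch; it keeps the slides' 1-based indexing, so array slot 0 is unused):

```java
public class HeapIndexDemo {
    // 1-based heap indexing, matching the slides (index 0 unused).
    static int parent(int i) { return i / 2; }       // integer division = floor(i/2)
    static int left(int i)   { return 2 * i; }
    static int right(int i)  { return 2 * i + 1; }

    public static void main(String[] args) {
        int length = 8;                   // the heap 16 14 10 8 7 9 3 2
        System.out.println(left(3));      // → 6
        System.out.println(right(3));     // → 7
        System.out.println(parent(3));    // → 1  (floor(1.5))
        System.out.println(length / 2);   // → 4  (last non-leaf node)
    }
}
```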
Max-Heapify
• Input: a complete binary tree rooted at i, whose left and right subtrees are max-heaps; the last node index
• Output: a max-heap rooted at i
• Algorithm:
1. Choose the largest node among node i, left(i), right(i).
2. if (the largest node is not i) {
   – Exchange i with the largest node
   – Max-Heapify the affected subtree
}
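The two steps above can be sketched in Java (a minimal illustrative class, 1-based indexing with slot 0 unused, as in the slides):

```java
import java.util.Arrays;

public class MaxHeapifyDemo {
    // Max-Heapify on a 1-based array a[1..last]; index 0 is unused.
    static void maxHeapify(int[] a, int i, int last) {
        int l = 2 * i, r = 2 * i + 1;
        int largest = i;                               // step 1: find the largest
        if (l <= last && a[l] > a[largest]) largest = l;
        if (r <= last && a[r] > a[largest]) largest = r;
        if (largest != i) {                            // step 2: exchange and recurse
            int tmp = a[i]; a[i] = a[largest]; a[largest] = tmp;
            maxHeapify(a, largest, last);              // fix the affected subtree
        }
    }

    public static void main(String[] args) {
        // the slides' example: 2 at the root of two max-heap subtrees
        int[] a = {0, 2, 16, 10, 14, 7, 9, 3, 8};
        maxHeapify(a, 1, 8);
        System.out.println(Arrays.toString(a)); // → [0, 16, 14, 10, 8, 7, 9, 3, 2]
    }
}
```

The 2 sinks past 16, then 14, then 8, restoring the max-heap property everywhere.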
Max-Heapify Example
[Figure: 2 at the root of the tree 2 / 16 10 / 14 7 9 3 / 8; Max-Heapify sinks the 2 past 16, then 14, then 8, producing the max-heap 16 14 10 8 7 9 3 2]
Heapsort for a heap
• Input: a heap A
• Output: a sorted array A
• Algorithm:
1. Last node index l = A's last node index
2. From the last element to the second {
       exchange(current, root);
       l--;
       Max-Heapify(A, root, l);
   }
Heapsort example (heap contents, with the sorted suffix after "|"):
16 14 10 8 7 9 3 2
14 8 10 2 7 9 3 | 16
10 8 9 2 7 3 | 14 16
9 8 3 2 7 | 10 14 16
8 7 3 2 | 9 10 14 16
7 2 3 | 8 9 10 14 16
3 2 | 7 8 9 10 14 16
2 | 3 7 8 9 10 14 16
Array -> Max-Heap
• Input: an array A
• Output: a max-heap A
• Algorithm: Considering A as a complete binary tree, from the last non-leaf node to the first one {
       Max-Heapify(A, current index, last index);
   }
Build heap example
[Figure: building a max-heap from the tree 14 / 8 16 / 10 7 by max-heapifying each non-leaf node from the bottom up]
Heapsort
• Input: array A
• Output: sorted array A
• Algorithm:
1. Build-Max-Heap(A)
2. Last node index l = A's last node index
3. From the last element to the second {
       exchange(current, root);
       l--;
       Max-Heapify(A, root, l);
   }
Let's try it
Analysis of Heapsort
• Input: array A
• Output: sorted array A
• Algorithm, with costs:
1. Build-Max-Heap(A) — O(n) (or, loosely, O(n lg n))
2. Last node index l = A's last node index — O(1)
3. The loop runs n-1 times, each iteration doing an O(lg n) Max-Heapify — O(n lg n)
Total: O(n) + O(n lg n) = O(n lg n)
Design and Analysis of Algorithms: Non-comparison sort (sorting in linear time)
Comparison-based sorting
• Algorithms that determine sorted order based only on comparisons between the input elements

| Algorithm      | Worst Time | Expected Time | Extra Memory    | Stable |
|----------------|------------|---------------|-----------------|--------|
| Insertion sort | O(n^2)     | O(n^2)        | O(1) (in place) | Can be |
| Merge sort     | O(n lg n)  | O(n lg n)     | O(n)            | Can be |
| Quick sort     | O(n^2)     | O(n lg n)     | O(1) (in place) | Can be |
| Heap sort      | O(n lg n)  | O(n lg n)     | O(1) (in place) | No     |
What is the lower bound?
Lower bounds for comparison-based sorting
[Figure: a decision tree – each internal node is a comparison "<?" branching Y/N; each leaf is "Done" with one possible sorted order]
• For an n-element array, how many possible orderings (leaves) are there? The factorial of n – n!
• What is the shortest binary tree that can have n! leaves? A perfect tree of height h has 2^h leaves, so we need 2^h >= n!
• As a result: h >= lg(n!) = Ω(n lg n)
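The bound can be made concrete with a short Java computation of lg(n!) (an illustrative sketch; the class name is an assumption):

```java
public class LowerBoundDemo {
    // lg(n!) computed as a sum of logs; any comparison sort needs at least
    // ceil(lg(n!)) comparisons in the worst case.
    static double lgFactorial(int n) {
        double s = 0;
        for (int i = 2; i <= n; i++) s += Math.log(i) / Math.log(2);
        return s;
    }

    public static void main(String[] args) {
        // sorting 5 elements: 5! = 120 leaves, so at least ceil(lg 120) comparisons
        System.out.println((int) Math.ceil(lgFactorial(5)));   // → 7
        // lg(n!) grows like n lg n: the ratio tends to 1 as n grows
        int n = 1000;
        System.out.println(lgFactorial(n) / (n * (Math.log(n) / Math.log(2))));
    }
}
```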
Sorting in linear time
• Can we sort an array in linear time? Yes, but not for free
• E.g. sort cards into 13 slots
• What if there is more than one element in the same slot?
Counting Sort
• Input: array A[1, …, n]; k (elements in A have values from 1 to k)
• Output: sorted array A
• Algorithm:
1. Create a counter array C[1, …, k]
2. Create an auxiliary array B[1, …, n]
3. Scan A once, recording element frequencies in C
4. Calculate prefix sums in C
5. Scan A in reverse order, copying each element to B at the correct position according to C
6. Copy B to A
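The six steps can be sketched in Java (a minimal illustrative class; 0-based output array, and the reverse scan in step 5 is what keeps the sort stable):

```java
import java.util.Arrays;

public class CountingSortDemo {
    // Counting sort for values in 1..k, following the slides' steps.
    static int[] countingSort(int[] a, int k) {
        int[] c = new int[k + 1];                       // steps 1: counter C[1..k]
        int[] b = new int[a.length];                    // step 2: auxiliary B
        for (int x : a) c[x]++;                         // step 3: frequencies
        for (int v = 2; v <= k; v++) c[v] += c[v - 1];  // step 4: prefix sums
        for (int i = a.length - 1; i >= 0; i--) {       // step 5: reverse scan
            b[c[a[i]] - 1] = a[i];                      // place at its final slot
            c[a[i]]--;
        }
        return b;                                       // step 6 would copy B back to A
    }

    public static void main(String[] args) {
        int[] a = {2, 5, 3, 6, 2, 3, 7, 3};             // the slides' example, k = 7
        System.out.println(Arrays.toString(countingSort(a, 7)));
        // → [2, 2, 3, 3, 3, 5, 6, 7]
    }
}
```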
Counting sort example
A: 2 5 3 6 2 3 7 3 (k = 7)
C after counting (values 1–7): 0 2 3 0 1 1 1
C after prefix sums: 0 2 5 5 6 7 8
Scanning A in reverse, each element x is copied to B at position C[x], and C[x] is decremented (the position indicator):
B: 2 2 3 3 3 5 6 7
Analysis of Counting Sort
• Input: array A[1, …, n]; k (elements in A have values from 1 to k)
• Output: sorted array A
Time:
1. Create C[1, …, k] — O(k)
2. Create B[1, …, n] — O(n)
3. Count frequencies — O(n)
4. Prefix sums — O(k)
5. Reverse scan, place into B — O(n)
6. Copy B to A — O(n)
Total time: O(n + k) = O(n) (if k = O(n))
Space: C takes O(k), B takes O(n); total O(n + k) = O(n) (if k = O(n))
Radix-Sort
• Input: array A[1, …, n]; d (the number of digits each element has)
• Output: sorted array A
• Algorithm: for each digit, from least significant to most significant { use a stable sort to sort A on that digit }
T(n) = O(d(n + k))
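LSD radix sort, using a stable counting sort per decimal digit, can be sketched in Java (the class name and sample values are illustrative):

```java
import java.util.Arrays;

public class RadixSortDemo {
    // LSD radix sort: a stable counting sort on each decimal digit.
    static void radixSort(int[] a, int d) {       // d = number of digits
        int exp = 1;
        for (int pass = 0; pass < d; pass++, exp *= 10) {
            int[] count = new int[10];
            int[] b = new int[a.length];
            for (int x : a) count[(x / exp) % 10]++;          // digit frequencies
            for (int v = 1; v < 10; v++) count[v] += count[v - 1]; // prefix sums
            for (int i = a.length - 1; i >= 0; i--) {         // reverse scan => stable
                int digit = (a[i] / exp) % 10;
                b[--count[digit]] = a[i];
            }
            System.arraycopy(b, 0, a, 0, a.length);
        }
    }

    public static void main(String[] args) {
        int[] a = {329, 457, 657, 839, 436, 720, 355};        // 3-digit sample values
        radixSort(a, 3);
        System.out.println(Arrays.toString(a));
        // → [329, 355, 436, 457, 657, 720, 839]
    }
}
```

Stability of the per-digit sort is essential: ties on the current digit must preserve the order established by the earlier, less significant passes.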
Summary

| Algorithm      | Worst Time | Expected Time | Extra Memory    | Stable |
|----------------|------------|---------------|-----------------|--------|
| Insertion sort | O(n^2)     | O(n^2)        | O(1) (in place) | Yes    |
| Merge sort     | O(n lg n)  | O(n lg n)     | O(n)            | Yes    |
| Quick sort     | O(n^2)     | O(n lg n)     | O(1) (in place) | Yes    |
| Heap sort      | O(n lg n)  | O(n lg n)     | O(1) (in place) | No     |
| Counting sort  | O(n + k)   | O(n + k)      | O(n + k)        | Yes    |

Design strategies:
• Divide and conquer
• Employ certain special data structures
• Trade off between time and space
Knowledge tree
• Algorithms
  – Analysis
    • Asymptotic notations: O(), o(), Θ(), Ω(), ω()
    • Probabilistic analysis
  – Design
    • Divide & conquer
    • Greedy
    • Dynamic programming
  – Algorithms for classic problems
    • Sorting: Quicksort, Heapsort, Mergesort, …
    • Shortest path
    • Matrix multiplication
    • …
  – Classic data structures
    • Heap, Hashing, Binary Tree, RBT, …