Sorting2

Data Structures – Sorting (Part 2)
Page 4: Sorting2

Heap: array implementation

[Figure: a complete binary tree with nodes labeled 1–10 in level order (root 1; level 2: 2, 3; level 3: 4, 5, 6, 7; level 4: 8, 9, 10), stored level by level in an array]

Is it a good idea to store arbitrary binary trees as arrays? May have many empty spaces!

Page 5: Sorting2

Array implementation

The root node is A[1].

The left child of A[j] is A[2j]

The right child of A[j] is A[2j + 1]

The parent of A[j] is A[j/2] (note: integer divide)

Example: the tree with root 1, children 2 and 5, children of 2 being 4 and 3, and left child of 5 being 6 is stored as the array 1 2 5 4 3 6.

Need to estimate the maximum size of the heap.
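To make the index arithmetic concrete, here is a minimal C++ sketch (the helper names are mine, not from the slides); the array is 1-based, so slot 0 is left unused.

```cpp
#include <cassert>
#include <vector>

// 1-based array storage for a complete binary tree / binary heap.
// Slot 0 is left unused so the index arithmetic stays simple.
inline int leftChild(int j)  { return 2 * j; }       // A[2j]
inline int rightChild(int j) { return 2 * j + 1; }   // A[2j + 1]
inline int parent(int j)     { return j / 2; }       // integer divide

int main() {
    // The example tree from the slide: root 1, children 2 and 5,
    // children of 2 are 4 and 3, left child of 5 is 6.
    std::vector<int> A = {0 /*unused*/, 1, 2, 5, 4, 3, 6};
    assert(A[leftChild(1)] == 2 && A[rightChild(1)] == 5);
    assert(A[parent(6)] == 5);   // the node at index 6 has its parent at index 3
    return 0;
}
```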

Page 6: Sorting2

Heapsort

(1) Build a binary heap of N elements – the minimum element is at the top of the heap

(2) Perform N DeleteMin operations – the elements are extracted in sorted order

(3) Record these elements in a second array and then copy the array back

Page 7: Sorting2

Heapsort – running time analysis

(1) Build a binary heap of N elements – repeatedly inserting N elements takes O(N log N) time (a more efficient O(N) buildHeap exists)

(2) Perform N DeleteMin operations – each DeleteMin takes O(log N), so O(N log N) in total

(3) Record these elements in a second array and then copy the array back – O(N)

• Total: O(N log N)
• Uses an extra array

Page 8: Sorting2

Heapsort: no extra storage

• After each deleteMin, the size of the heap shrinks by 1
– We can use the last cell just freed up to store the element that was just deleted
– After the last deleteMin, the array will contain the elements in decreasing sorted order

• To sort the elements in decreasing order, use a min heap
• To sort the elements in increasing order, use a max heap
– in a max heap, the parent holds a larger element than its children
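A minimal sketch of this in-place scheme, using a max heap and 0-based indices (unlike the 1-based array on the earlier slide); percolateDown and the bottom-up buildHeap are standard, but the exact code below is an illustration, not the course's.

```cpp
#include <utility>
#include <vector>

// Push A[i] down until the max-heap property holds in A[0..n-1].
void percolateDown(std::vector<int>& A, int i, int n) {
    while (2 * i + 1 < n) {                  // while i has at least a left child
        int child = 2 * i + 1;
        if (child + 1 < n && A[child + 1] > A[child]) ++child;  // pick the larger child
        if (A[i] >= A[child]) break;
        std::swap(A[i], A[child]);
        i = child;
    }
}

void heapsort(std::vector<int>& A) {
    int n = static_cast<int>(A.size());
    // Build a max heap bottom-up: O(N).
    for (int i = n / 2 - 1; i >= 0; --i) percolateDown(A, i, n);
    // Repeatedly move the maximum into the cell just freed at the end: O(N log N).
    for (int last = n - 1; last > 0; --last) {
        std::swap(A[0], A[last]);            // deleteMax; store it in the freed slot
        percolateDown(A, 0, last);           // restore the heap on A[0..last-1]
    }
    // A is now sorted in increasing order, with no extra array.
}
```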

Page 9: Sorting2

Heapsort

Sort in increasing order: use max heap

Delete 97

Page 10: Sorting2

Heapsort: A complete example

Delete 16 Delete 14

Page 11: Sorting2

Example (cont’d)

Delete 10 Delete 9 Delete 8

Page 12: Sorting2

Example (cont’d)

Page 13: Sorting2

Lower bound for sorting, radix sort

Page 14: Sorting2

Lower Bound for Sorting

• Mergesort and heapsort – worst-case running time is O(N log N)

• Are there better algorithms?
• Goal: prove that any sorting algorithm based on only comparisons takes Ω(N log N) comparisons in the worst case (worst-case input) to sort N elements.

Page 15: Sorting2

Lower Bound for Sorting

• Suppose we want to sort N distinct elements
• How many possible orderings do we have for N elements?
• We can have N! possible orderings (e.g., the three elements a, b, c have 3! = 6 orderings: a b c, b a c, a c b, c a b, c b a, b c a)

Page 16: Sorting2

Lower Bound for Sorting

• Any comparison-based sorting process can be represented as a binary decision tree.
– Each node represents a set of possible orderings, consistent with all the comparisons that have been made
– The tree edges are the results of the comparisons

Page 17: Sorting2

Decision tree for Algorithm X for sorting three elements a, b, c

Page 18: Sorting2

Lower Bound for Sorting

• A different algorithm would have a different decision tree
• Decision tree for Insertion Sort on 3 elements:

Page 19: Sorting2

Lower Bound for Sorting

• The worst-case number of comparisons used by the sorting algorithm is equal to the depth of the deepest leaf
– The average number of comparisons used is equal to the average depth of the leaves

• A decision tree to sort N elements must have N! leaves
– a binary tree of depth d has at most 2^d leaves, so the tree must have depth at least log2(N!)

• Therefore, any sorting algorithm based only on comparisons between elements requires at least log2(N!) comparisons in the worst case.

Page 20: Sorting2

Lower Bound for Sorting

• Any sorting algorithm based on comparisons between elements requires Ω(N log N) comparisons.
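One standard way to see that log2(N!) is Ω(N log N), keeping only the larger half of the terms (not shown on the slides):

```latex
\log_2(N!) \;=\; \sum_{i=1}^{N} \log_2 i
          \;\ge\; \sum_{i=N/2}^{N} \log_2 i
          \;\ge\; \frac{N}{2}\,\log_2\frac{N}{2}
          \;=\; \Omega(N \log N).
```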

Page 21: Sorting2

Linear time sorting

• Can we do better (a linear-time algorithm) if the input has special structure (e.g., uniformly distributed, every number can be represented with d digits)? Yes.

• Counting sort, radix sort

Page 22: Sorting2

Counting Sort

• Assume N integers to be sorted, each in the range 1 to M.
• Define an array B[1..M], initialize all entries to 0 – O(M)
• Scan through the input list A; insert A[i] into B[A[i]] – O(N)
• Scan B once, read out the nonzero positions – O(M)

Total time: O(M + N)
– if M is O(N), then the total time is O(N)
– can be bad if the range is very big, e.g. M = O(N^2)

Example: N = 7, M = 9, want to sort 8 1 9 5 2 6 3.

[Figure: array B with positions 1, 2, 3, 5, 6, 8, 9 marked]

Output: 1 2 3 5 6 8 9
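A minimal C++ sketch of this version (distinct keys in 1..M, with B used as a marker array); the function name is illustrative.

```cpp
#include <vector>

// Counting sort for N distinct integers in the range 1..M.  O(M + N) time.
std::vector<int> countingSortDistinct(const std::vector<int>& A, int M) {
    std::vector<int> B(M + 1, 0);            // O(M): initialise all counts to 0
    for (int x : A) B[x] = 1;                // O(N): mark each input value
    std::vector<int> out;
    for (int v = 1; v <= M; ++v)             // O(M): read out the marked values
        if (B[v]) out.push_back(v);
    return out;
}
// Example from the slide: A = {8, 1, 9, 5, 2, 6, 3}, M = 9 gives {1, 2, 3, 5, 6, 8, 9}.
```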

Page 23: Sorting2

Counting sort

• What if we have duplicates?
• B becomes an array of pointers.
• Each position in the array has 2 pointers: head and tail. Tail points to the end of a linked list, and head points to the beginning.
• A[j] is inserted at the end of the list B[A[j]]
• Again, array B is traversed sequentially and each nonempty list is printed out.
• Time: O(M + N)

Page 24: Sorting2

Counting sort

Example: M = 9, wish to sort 8 5 1 5 9 5 6 2 7.

[Figure: array B of linked lists; the list at position 5 holds the three copies of 5]

Output: 1 2 5 5 5 6 7 8 9
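A sketch of the duplicate-handling version. The slides keep a linked list with head and tail pointers per slot; this sketch uses std::vector buckets instead, which gives the same FIFO behaviour.

```cpp
#include <vector>

// Counting sort with duplicates: B[v] is a FIFO bucket of the inputs equal to v.
std::vector<int> countingSortWithDuplicates(const std::vector<int>& A, int M) {
    std::vector<std::vector<int>> B(M + 1);     // one bucket per value 1..M
    for (int x : A) B[x].push_back(x);          // append at the tail: O(N)
    std::vector<int> out;
    for (int v = 1; v <= M; ++v)                // traverse B sequentially: O(M)
        for (int x : B[v]) out.push_back(x);
    return out;                                  // total O(M + N)
}
// Example from the slide: A = {8, 5, 1, 5, 9, 5, 6, 2, 7}, M = 9
// gives {1, 2, 5, 5, 5, 6, 7, 8, 9}.
```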

Page 25: Sorting2

Radix Sort

• Extra information: every integer can be represented by at most k digits
– d1d2…dk where the di are digits in base r

– d1: most significant digit

– dk: least significant digit

Page 26: Sorting2

Radix Sort

• Algorithm
– sort by the least significant digit first (counting sort) => numbers with the same digit go to the same bin
– reorder all the numbers: the numbers in bin 0 precede the numbers in bin 1, which precede the numbers in bin 2, and so on
– sort by the next least significant digit
– continue this process until the numbers have been sorted on all k digits

Page 27: Sorting2

Radix Sort

• Least-significant-digit-first

Example: 275, 087, 426, 061, 509, 170, 677, 503
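Working through the example by hand (the pass-by-pass figures on the next two pages are not in the transcript, so these rows are reconstructed):

Pass 1 (sort on the last digit):   170  061  503  275  426  087  677  509
Pass 2 (sort on the middle digit): 503  509  426  061  170  275  677  087
Pass 3 (sort on the first digit):  061  087  170  275  426  503  509  677  (sorted)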

Page 28: Sorting2
Page 29: Sorting2

Radix Sort

• Does it work?

• Clearly, if the most significant digits of a and b are different and a < b, then in the end a comes before b

• If the most significant digits of a and b are the same, and the second most significant digit of b is less than that of a, then b comes before a.

Page 30: Sorting2

Radix Sort

Example 2: sorting cards
– 2 digits for each card: d1d2
– d1 = suit (♣, ♦, ♥, ♠): base 4
– d2 = rank (A, 2, 3, ..., J, Q, K): base 13

[Figure: a hand of cards (2, 2, 5, K) sorted first by rank, then by suit]

Page 31: Sorting2

[Figure: radix sort pseudocode, annotated with these comments: base 10; d times of counting sort; scan A[i], put into correct slot (FIFO); re-order back to original array]
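Only the comments of the slide's code survive in the transcript; below is a hedged reconstruction of LSD radix sort in base 10 that follows them: d passes of a FIFO counting sort, re-ordering the numbers back into the array after each pass.

```cpp
#include <vector>

// LSD radix sort in base 10 for non-negative integers.
void radixSort(std::vector<int>& A, int d) {            // d = max number of digits
    int divisor = 1;                                     // selects the current digit
    for (int pass = 0; pass < d; ++pass) {               // d times of counting sort
        std::vector<std::vector<int>> bucket(10);        // base 10
        for (int x : A)                                  // scan A[i], put into correct slot
            bucket[(x / divisor) % 10].push_back(x);     // FIFO: append at the tail
        int k = 0;
        for (int digit = 0; digit < 10; ++digit)         // re-order back to original array
            for (int x : bucket[digit]) A[k++] = x;
        divisor *= 10;                                   // next (more significant) digit
    }
}
// Example from the earlier slide: {275, 87, 426, 61, 509, 170, 677, 503} with d = 3
// ends up fully sorted.
```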

Page 32: Sorting2

Radix Sort

• Increasing the base r decreases the number of passes

• Running time
– k passes over the numbers (i.e. k counting sorts, with digit range 0..r–1)
– each pass takes O(N + r)
– total: O(Nk + rk)
– if r and k are constants: O(N)

Page 33: Sorting2

Quicksort

Page 34: Sorting2

Introduction

• Fastest known sorting algorithm in practice
• Average case: O(N log N)
• Worst case: O(N^2)
– But the worst case seldom happens.
• Another divide-and-conquer recursive algorithm, like mergesort

Page 35: Sorting2

Quicksort

• Divide step:
– Pick any element (pivot) v in S
– Partition S – {v} into two disjoint groups
  S1 = {x ∈ S – {v} | x ≤ v}
  S2 = {x ∈ S – {v} | x ≥ v}

• Conquer step: recursively sort S1 and S2

• Combine step: combine the sorted S1, followed by v, followed by the sorted S2

[Figure: S is split around v into S1 (elements ≤ v) and S2 (elements ≥ v)]

Page 36: Sorting2

Example: Quicksort

Page 37: Sorting2

Example: Quicksort...

Page 38: Sorting2

Pseudocode – input: an array A[p..r]

Quicksort(A, p, r) {
    if (p < r) {
        q = Partition(A, p, r)   // q is the position of the pivot element
        Quicksort(A, p, q-1)
        Quicksort(A, q+1, r)
    }
}

Page 39: Sorting2

Partitioning

• Partitioning
– Key step of the quicksort algorithm
– Goal: given the picked pivot, partition the remaining elements into two smaller sets
– Many ways to implement it
– Even the slightest deviations may cause surprisingly bad results.
• We will learn an easy and efficient partitioning strategy here.
• How to pick a pivot will be discussed later.

Page 40: Sorting2

Partitioning Strategy

• Want to partition an array A[left .. right]
• First, get the pivot element out of the way by swapping it with the last element (swap pivot and A[right])
• Let i start at the first element and j start at the next-to-last element (i = left, j = right – 1)

[Figure: example array 5 6 4 6 3 12 19 before and after the pivot is swapped with the last element; i and j marked at the two ends]

Page 41: Sorting2

Partitioning Strategy

• Want to have
– A[p] <= pivot, for p < i
– A[p] >= pivot, for p > j

• When i < j
– Move i right, skipping over elements smaller than the pivot
– Move j left, skipping over elements greater than the pivot
– When both i and j have stopped
  • A[i] >= pivot
  • A[j] <= pivot

[Figure: the example array with i and j advancing toward each other until each stops]

Page 42: Sorting2

Partitioning Strategy

• When i and j have stopped and i is to the left of j
– Swap A[i] and A[j]
• The large element is pushed to the right and the small element is pushed to the left
– After swapping
  • A[i] <= pivot
  • A[j] >= pivot
– Repeat the process until i and j cross

[Figure: the stopped elements A[i] and A[j] are swapped in the example array]

Page 43: Sorting2

Partitioning Strategy

• When i and j have crossed
– Swap A[i] and the pivot
• Result:
– A[p] <= pivot, for p < i
– A[p] >= pivot, for p > i

[Figure: after i and j cross, the pivot is swapped into position i; everything to its left is <= pivot and everything to its right is >= pivot]

Page 44: Sorting2

Small arrays

• For very small arrays, quicksort does not perform as well as insertion sort
– how small depends on many factors, such as the time spent making a recursive call, the compiler, etc.

• Do not use quicksort recursively for small arrays
– Instead, use a sorting algorithm that is efficient for small arrays, such as insertion sort

Page 45: Sorting2

Picking the Pivot

• Use the first element as pivot
– if the input is random: OK
– if the input is presorted (or in reverse order)
  • all the elements go into S2 (or S1)
  • this happens consistently throughout the recursive calls
  • results in O(N^2) behavior (we analyze this case later)

• Choose the pivot randomly
– generally safe
– random number generation can be expensive

Page 46: Sorting2

Picking the Pivot

• Use the median of the array
– Partitioning always cuts the array roughly in half
– An optimal quicksort (O(N log N))
– However, it is hard to find the exact median
  • e.g., sorting the array just to pick the value in the middle

Page 47: Sorting2

Pivot: median of three

• We will use the median of three
– Compare just three elements: the leftmost, rightmost and center
– Swap these elements if necessary so that
  • A[left] = smallest
  • A[right] = largest
  • A[center] = median of the three
– Pick A[center] as the pivot
– Swap A[center] and A[right – 1] so that the pivot is at the second-to-last position (why?)

Page 48: Sorting2

Pivot: median of three

[Figure: example array with the leftmost, center, and rightmost elements highlighted]

A[left] = 2, A[center] = 13, A[right] = 6

Swap A[center] and A[right], so that A[left] = 2, A[center] = 6, A[right] = 13.

Choose A[center] (now 6) as the pivot.

Swap the pivot and A[right – 1].

Note we only need to partition A[left + 1, …, right – 2]. Why?

Page 49: Sorting2

Main Quicksort Routine

[Figure: the main quicksort routine, with regions labeled "For small arrays", "Choose pivot", "Partitioning", and "Recursion"]
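The routine itself appears only as a figure; the following is a hedged C++ sketch that follows the labeled structure (cutoff to insertion sort for small arrays, median-of-three pivot, the partitioning loop from the earlier slides, then the two recursive calls). The CUTOFF value of 10 is illustrative.

```cpp
#include <utility>
#include <vector>

const int CUTOFF = 10;   // illustrative threshold for switching to insertion sort

void insertionSort(std::vector<int>& A, int left, int right) {
    for (int i = left + 1; i <= right; ++i) {
        int tmp = A[i], j = i;
        for (; j > left && A[j - 1] > tmp; --j) A[j] = A[j - 1];
        A[j] = tmp;
    }
}

// Order A[left], A[center], A[right]; hide the median (the pivot) at right - 1.
int median3(std::vector<int>& A, int left, int right) {
    int center = (left + right) / 2;
    if (A[center] < A[left])   std::swap(A[center], A[left]);
    if (A[right]  < A[left])   std::swap(A[right],  A[left]);
    if (A[right]  < A[center]) std::swap(A[right],  A[center]);
    std::swap(A[center], A[right - 1]);      // pivot goes to the second-to-last slot
    return A[right - 1];
}

void quicksort(std::vector<int>& A, int left, int right) {
    if (right - left + 1 <= CUTOFF) {        // for small arrays: insertion sort
        insertionSort(A, left, right);
        return;
    }
    int pivot = median3(A, left, right);     // choose pivot
    int i = left, j = right - 1;             // partition A[left+1 .. right-2]
    for (;;) {
        while (A[++i] < pivot) {}            // i stops at an element >= pivot
        while (A[--j] > pivot) {}            // j stops at an element <= pivot
        if (i < j) std::swap(A[i], A[j]);
        else break;
    }
    std::swap(A[i], A[right - 1]);           // restore the pivot; it is now in place
    quicksort(A, left, i - 1);               // recursion on S1
    quicksort(A, i + 1, right);              // recursion on S2
}
```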

Page 50: Sorting2

Partitioning Part

• Works only if the pivot is picked as the median of three.
– A[left] <= pivot and A[right] >= pivot
– Thus, we only need to partition A[left + 1, …, right – 2]

• j will not run past the left end
– because A[left] <= pivot

• i will not run past the right end
– because A[right – 1] = pivot

Page 51: Sorting2

Quicksort Faster than Mergesort

• Both quicksort and mergesort take O(N log N) in the average case.
• Why is quicksort faster than mergesort in practice?
– The inner loop consists of an increment/decrement (by 1, which is fast), a test, and a jump.
– There is no extra juggling of a second array as in mergesort.

Page 52: Sorting2

Analysis

• Assumptions:
– A random pivot (no median-of-three partitioning)
– No cutoff for small arrays

• Running time
– pivot selection: constant time, O(1)
– partitioning: linear time, O(N)
– plus the running time of the two recursive calls

• T(N) = T(i) + T(N – i – 1) + cN, where c is a constant
– i: number of elements in S1

Page 53: Sorting2

Worst-Case Analysis

• What will be the worst case?
– The pivot is the smallest element, all the time
– Partition is always unbalanced
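Plugging the worst case (i = 0 on every call) into the recurrence from the Analysis slide:

```latex
T(N) = T(N-1) + cN \;\Rightarrow\; T(N) = T(1) + c\sum_{k=2}^{N} k = O(N^2).
```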

Page 54: Sorting2

Best-case Analysis

• What will be the best case?
– Partition is perfectly balanced.
– Pivot is always in the middle (the median of the array)
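With a perfectly balanced partition, the recurrence becomes the mergesort recurrence:

```latex
T(N) = 2\,T(N/2) + cN \;\Rightarrow\; T(N) = O(N \log N).
```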

Page 55: Sorting2

Average-Case Analysis

• Assume
– Each of the possible sizes for S1 is equally likely

• This assumption is valid for our pivoting (median-of-three) and partitioning strategy

• On average, the running time is O(N log N) (covered in comp271)

Page 56: Sorting2

Topological Sort

• Topological sort is an algorithm for a directed acyclic graph (DAG)
• It can be thought of as a way to linearly order the vertices so that the linear order respects the ordering relations implied by the arcs

[Figure: a DAG with vertices 0–9]

For example:
0, 1, 2, 5, 9
0, 4, 5, 9
0, 6, 3, 7 ?

Page 57: Sorting2

Topological Sort

• Idea:
– The starting point must have zero indegree!
– If no such vertex exists, the graph would not be acyclic

1. A vertex with zero indegree is a task that can start right away. So we can output it first in the linear order.

2. If a vertex i is output, then its outgoing arcs (i, j) are no longer useful, since task j does not need to wait for i anymore – so remove all of i's outgoing arcs.

3. With vertex i removed, the new graph is still a directed acyclic graph. So, repeat steps 1–2 until no vertex is left.

Page 58: Sorting2

Topological Sort

[Figure: queue-based topological sort code, with regions labeled "Find all starting points", "Reduce indegree(w)", and "Place new start vertices on the Q"]
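The code on the slide survives only as its labels; here is a hedged C++ sketch of the queue-based algorithm, assuming the graph is given as adjacency lists (the representation is not specified in the transcript).

```cpp
#include <queue>
#include <vector>

// Topological sort of a DAG given as adjacency lists adj[0..n-1].
// Returns the vertices in a linear order consistent with every arc.
std::vector<int> topologicalSort(const std::vector<std::vector<int>>& adj) {
    int n = static_cast<int>(adj.size());
    std::vector<int> indegree(n, 0);
    for (int v = 0; v < n; ++v)
        for (int w : adj[v]) ++indegree[w];

    std::queue<int> Q;
    for (int v = 0; v < n; ++v)               // find all starting points
        if (indegree[v] == 0) Q.push(v);

    std::vector<int> order;
    while (!Q.empty()) {
        int v = Q.front(); Q.pop();
        order.push_back(v);                    // output v
        for (int w : adj[v]) {                 // "remove" v's outgoing arcs
            if (--indegree[w] == 0)            // reduce indegree(w)
                Q.push(w);                     // place new start vertices on the Q
        }
    }
    return order;   // if order.size() < n, the graph contained a cycle
}
```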

Page 59: Sorting2

Example

[Figure: the DAG with vertices 0–9 and its indegree table; vertex 0 is the only vertex with indegree 0]

start: Q = { 0 }

OUTPUT: 0

Page 60: Sorting2

Example

[Figure: the DAG and indegree table; the indegrees of 0's neighbors are each decremented by 1]

Dequeue 0, Q = { } – remove 0's arcs and adjust the indegrees of its neighbors

OUTPUT: 0

Page 61: Sorting2

Example

[Figure: the DAG and updated indegree table; vertices 6, 1, and 4 now have indegree 0]

Dequeue 0, Q = { 6, 1, 4 } – enqueue all new start points

OUTPUT: 0

Page 62: Sorting2

Example

[Figure: vertex 0 removed from the DAG; the indegrees of 6's neighbors are each decremented by 1]

Dequeue 6, Q = { 1, 4 } – remove 6's arcs, adjust the indegrees of its neighbors

OUTPUT: 0 6

Page 63: Sorting2

Example

[Figure: vertex 6 removed; vertex 3 now has indegree 0]

Dequeue 6, Q = { 1, 4, 3 } – enqueue 3 as a new start point

OUTPUT: 0 6

Page 64: Sorting2

Example

[Figure: the indegree of 1's neighbor is decremented by 1]

Dequeue 1, Q = { 4, 3 } – adjust the indegrees of 1's neighbors

OUTPUT: 0 6 1

Page 65: Sorting2

Example

[Figure: vertex 1 removed; vertex 2 now has indegree 0]

Dequeue 1, Q = { 4, 3, 2 } – enqueue 2 as a new start point

OUTPUT: 0 6 1

Page 66: Sorting2

Example

[Figure: the indegree of 4's neighbor is decremented by 1]

Dequeue 4, Q = { 3, 2 } – adjust the indegrees of 4's neighbors

OUTPUT: 0 6 1 4

Page 67: Sorting2

Example

[Figure: vertex 4 removed; no vertex reaches indegree 0]

Dequeue 4, Q = { 3, 2 } – no new start points found

OUTPUT: 0 6 1 4

Page 68: Sorting2

Example

[Figure: the indegree of 3's neighbor is decremented by 1]

Dequeue 3, Q = { 2 } – adjust the indegrees of 3's neighbors

OUTPUT: 0 6 1 4 3

Page 69: Sorting2

Example

[Figure: vertex 3 removed; no vertex reaches indegree 0]

Dequeue 3, Q = { 2 } – no new start points found

OUTPUT: 0 6 1 4 3

Page 70: Sorting2

Example

[Figure: the indegrees of 2's neighbors are each decremented by 1]

Dequeue 2, Q = { } – adjust the indegrees of 2's neighbors

OUTPUT: 0 6 1 4 3 2

Page 71: Sorting2

Example

[Figure: vertex 2 removed; vertices 5 and 7 now have indegree 0]

Dequeue 2, Q = { 5, 7 } – enqueue 5 and 7

OUTPUT: 0 6 1 4 3 2


Page 73: Sorting2

Example

[Figure: the indegree of 5's neighbor is decremented by 1]

Dequeue 5, Q = { 7 } – adjust the indegrees of 5's neighbors

OUTPUT: 0 6 1 4 3 2 5

Page 74: Sorting2

Example

[Figure: vertex 5 removed; no vertex reaches indegree 0]

Dequeue 5, Q = { 7 } – no new start points

OUTPUT: 0 6 1 4 3 2 5

Page 75: Sorting2

Example

[Figure: the indegree of 7's neighbor is decremented by 1]

Dequeue 7, Q = { } – adjust the indegrees of 7's neighbors

OUTPUT: 0 6 1 4 3 2 5 7

Page 76: Sorting2

Example

[Figure: vertex 7 removed; vertex 8 now has indegree 0]

Dequeue 7, Q = { 8 } – enqueue 8

OUTPUT: 0 6 1 4 3 2 5 7

Page 77: Sorting2

Example

[Figure: the indegree of 8's neighbor is decremented by 1]

Dequeue 8, Q = { } – adjust the indegrees of 8's neighbors

OUTPUT: 0 6 1 4 3 2 5 7 8

Page 78: Sorting2

Example

[Figure: vertex 8 removed; vertex 9 now has indegree 0]

Dequeue 8, Q = { 9 } – enqueue 9
Dequeue 9, Q = { } – STOP: 9 has no neighbors

OUTPUT: 0 6 1 4 3 2 5 7 8 9

Page 79: Sorting2

Example

OUTPUT: 0 6 1 4 3 2 5 7 8 9

[Figure: the original DAG with vertices 0–9]

Is output topologically correct?

Page 80: Sorting2

Topological Sort: Complexity

• We never visit a vertex more than once
• For each vertex, we examine all of its outgoing edges
– Σ_v outdegree(v) = m
– This is summed over all vertices, not per vertex
• So, our running time is exactly O(n + m)