Sorting Algorithms > Data Structures & Algorithms


Page 1: Sorting Algorithms > Data Structures & Algorithms
Page 2: Sorting Algorithms > Data Structures & Algorithms

Merge Sort

[Figure: merge-sort execution tree for the sequence 7 2 9 4; each node shows an unsorted subsequence and its sorted result, e.g. 7 2 9 4 → 2 4 7 9.]

Page 3: Sorting Algorithms > Data Structures & Algorithms

Merge Sort

Merge sort is based on the divide-and-conquer paradigm. It consists of three steps:

Divide: partition the input sequence S into two sequences S1 and S2 of about n/2 elements each

Recur: recursively sort S1 and S2

Conquer: merge S1 and S2 into a unique sorted sequence

Algorithm mergeSort(S, C)
   Input: sequence S, comparator C
   Output: sequence S sorted according to C
   if S.size() > 1 {
      (S1, S2) := partition(S, S.size()/2)
      S1 := mergeSort(S1, C)
      S2 := mergeSort(S2, C)
      S := merge(S1, S2)
   }
   return S
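A minimal C sketch of the same scheme on an array of ints (our own illustration, assuming ascending order rather than a general comparator C):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Conquer: merge the sorted halves a[lo..mid] and a[mid+1..hi]. */
static void merge(int a[], int tmp[], int lo, int mid, int hi) {
    int i = lo, j = mid + 1, k = lo;
    while (i <= mid && j <= hi)
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i <= mid) tmp[k++] = a[i++];
    while (j <= hi)  tmp[k++] = a[j++];
    memcpy(&a[lo], &tmp[lo], (size_t)(hi - lo + 1) * sizeof(int));
}

/* Divide and recur: sort a[lo..hi]. */
static void merge_sort(int a[], int tmp[], int lo, int hi) {
    if (lo < hi) {
        int mid = lo + (hi - lo) / 2;
        merge_sort(a, tmp, lo, mid);
        merge_sort(a, tmp, mid + 1, hi);
        merge(a, tmp, lo, mid, hi);
    }
}

int main(void) {
    int a[] = {7, 2, 9, 4, 3, 8, 6, 1};            /* the example traced on the next slides */
    int n = (int)(sizeof a / sizeof a[0]);
    int *tmp = malloc((size_t)n * sizeof(int));
    merge_sort(a, tmp, 0, n - 1);
    for (int i = 0; i < n; i++) printf("%d ", a[i]);   /* 1 2 3 4 6 7 8 9 */
    printf("\n");
    free(tmp);
    return 0;
}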

Page 4: Sorting Algorithms > Data Structures & Algorithms

Divide-and-Conquer

Merge Sort Execution Tree (recursive calls)

An execution of merge-sort is depicted by a binary tree:

each node represents a recursive call of merge-sort and stores the unsorted sequence before the execution and its partition, and the sorted sequence at the end of the execution

the root is the initial call

the leaves are calls on subsequences of size 0 or 1

[Figure: execution tree for the sequence 7 2 9 4 → 2 4 7 9.]

Page 5: Sorting Algorithms > Data Structures & Algorithms

Divide-and-Conquer

Execution Example: Partition

[Figure: execution tree for 7 2 9 4 3 8 6 1 → 1 2 3 4 6 7 8 9.]

Page 6: Sorting Algorithms > Data Structures & Algorithms

Divide-and-Conquer

Execution Example (cont.): Recursive call, partition

[Figure: execution tree for 7 2 9 4 3 8 6 1 → 1 2 3 4 6 7 8 9.]

Page 7: Sorting Algorithms > Data Structures & Algorithms

Execution Example (cont.): Recursive call, partition

[Figure: execution tree for 7 2 9 4 3 8 6 1 → 1 2 3 4 6 7 8 9.]

Page 8: Sorting Algorithms > Data Structures & Algorithms

Divide-and-Conquer

Execution Example (cont.): Recursive call, base case

[Figure: execution tree for 7 2 9 4 3 8 6 1 → 1 2 3 4 6 7 8 9.]

Page 9: Sorting Algorithms > Data Structures & Algorithms

Divide-and-Conquer

Execution Example (cont.): Recursive call, base case

[Figure: execution tree for 7 2 9 4 3 8 6 1 → 1 2 3 4 6 7 8 9.]

Page 10: Sorting Algorithms > Data Structures & Algorithms

Execution Example (cont.): Merge

[Figure: execution tree for 7 2 9 4 3 8 6 1 → 1 2 3 4 6 7 8 9.]

Page 11: Sorting Algorithms > Data Structures & Algorithms

Divide-and-Conquer

Execution Example (cont.): Recursive call, …, base case, merge

[Figure: execution tree for 7 2 9 4 3 8 6 1 → 1 2 3 4 6 7 8 9.]

Page 12: Sorting Algorithms > Data Structures & Algorithms

Execution Example (cont.): Merge

[Figure: execution tree for 7 2 9 4 3 8 6 1 → 1 2 3 4 6 7 8 9.]

Page 13: Sorting Algorithms > Data Structures & Algorithms

Divide-and-Conquer

Execution Example (cont.): Recursive call, …, merge, merge

[Figure: execution tree for 7 2 9 4 3 8 6 1 → 1 2 3 4 6 7 8 9.]

Page 14: Sorting Algorithms > Data Structures & Algorithms

Divide-and-Conquer

Execution Example (cont.): Merge

[Figure: execution tree for 7 2 9 4 3 8 6 1 → 1 2 3 4 6 7 8 9.]

Page 15: Sorting Algorithms > Data Structures & Algorithms

Static Method mergeSort()

public static void mergeSort(int[] a, int left, int right) {
   // sort a[left:right]
   if (left < right) {                // at least two elements
      int mid = (left + right) / 2;   // midpoint
      mergeSort(a, left, mid);
      mergeSort(a, mid + 1, right);
      merge(a, b, left, mid, right);  // merge from a to b (b is a scratch array assumed to be declared elsewhere)
      copy(b, a, left, right);        // copy result back to a
   }
}

Page 16: Sorting Algorithms > Data Structures & Algorithms

MERGE-SORT(A, lo, hi)
1. if lo < hi
2.    then mid ← (lo+hi)/2
3.         MERGE-SORT(A, lo, mid)
4.         MERGE-SORT(A, mid+1, hi)
5.         MERGE(A, lo, mid, hi)

Call MERGE-SORT(A, 1, n) (assume n = length of list A)

A = {10, 5, 7, 6, 1, 4, 8, 3, 2, 9}
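For this example (1-based indexing, as in the pseudocode), the first call computes mid = (1+10)/2 = 5, so MERGE-SORT recursively sorts A[1..5] = {10, 5, 7, 6, 1} and A[6..10] = {4, 8, 3, 2, 9}, and MERGE then combines the two sorted halves into {1, 2, 3, 4, 5, 6, 7, 8, 9, 10}.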

Page 17: Sorting Algorithms > Data Structures & Algorithms

Divide-and-Conquer

Another Analysis of Merge-Sort

The height h of the merge-sort tree is O(log n): at each recursive call we divide the sequence in half.

The work done at each level is O(n): at level i, we partition and merge 2^i sequences of size n/2^i.

Thus, the total running time of merge-sort is O(n log n).

depth   #seqs           size              cost for level
0       1               n                 n
1       2               n/2               n
i       2^i             n/2^i             n
…       …               …                 …
log n   2^(log n) = n   n/2^(log n) = 1   n
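Equivalently, if partitioning and merging a sequence of size n takes at most c*n time for some constant c, the running time T(n) of merge-sort satisfies the recurrence below (a sketch, taking n to be a power of 2):

T(n) \le 2\,T(n/2) + cn, \qquad T(1) \le c
\quad\Longrightarrow\quad
T(n) \le cn\log_2 n + cn = O(n \log n)

since cn work is done on each of the log_2 n levels above the leaves, plus cn for the leaves themselves.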

Page 18: Sorting Algorithms > Data Structures & Algorithms

Summary of Sorting Algorithms (so far)

Vectors

Algorithm        Time                      Notes
Selection Sort   O(n^2)                    Slow, in-place; for small data sets
Insertion Sort   O(n^2) WC, AC; O(n) BC    Slow, in-place; for small data sets
Heap Sort        O(n log n)                Fast, in-place; for large data sets
Merge Sort       O(n log n)                Fast, sequential data access; for huge data sets

Page 19: Sorting Algorithms > Data Structures & Algorithms

Divide-and-Conquer

[Figure: quick-sort tree for the sequence 7 4 9 6 2 → 2 4 6 7 9.]

Page 20: Sorting Algorithms > Data Structures & Algorithms

Divide-and-Conquer

Quick-Sort

Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm:

Divide: pick a random element x (called the pivot) and partition S into
   L, elements less than x
   E, elements equal to x
   G, elements greater than x

Recur: sort L and G

Conquer: join L, E and G

[Figure: the sequence S is partitioned around the pivot x into L, E and G, which are then joined back in that order.]
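A minimal C sketch of the random pivot choice (our own illustration; it assumes a partition routine like the one on pages 27-28, which always pivots on a[low], so we first swap a randomly chosen element into position low):

#include <stdlib.h>   /* rand */

/* Move a randomly chosen element of a[low..high] into a[low], so that a
   partition routine that pivots on a[low] behaves like randomized quick-sort. */
static void choose_random_pivot(int *a, int low, int high) {
    int r = low + rand() % (high - low + 1);   /* random index in [low, high] */
    int t = a[low];
    a[low] = a[r];
    a[r] = t;
}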

Page 21: Sorting Algorithms > Data Structures & Algorithms

Quicksort

An efficient sorting algorithm, discovered by C.A.R. Hoare

An example of a divide-and-conquer algorithm, with two phases:

Partition phase: divides the work into half

Sort phase: conquers the halves!

Page 22: Sorting Algorithms > Data Structures & Algorithms

Quicksort: Partition

Choose a pivot. Find the position for the pivot so that

all elements to the left are less

all elements to the right are greater

[Figure: the array split into "< pivot" | pivot | "> pivot".]

Page 23: Sorting Algorithms > Data Structures & Algorithms

Quicksort: Conquer

Apply the same algorithm to each half

[Figure: each half is partitioned around its own pivot (p' on the left, p'' on the right), giving "< p'" | p' | "> p'" and "< p''" | p'' | "> p''".]

Page 24: Sorting Algorithms > Data Structures & Algorithms

[Figure: partition walkthrough on the array 78 84 93 100 61 81 70 65 98 78 55 68.]

Page 25: Sorting Algorithms > Data Structures & Algorithms

[Figure: partition walkthrough on 78 84 93 100 61 81 70 65 98 78 55 68, continued.]

Page 26: Sorting Algorithms > Data Structures & Algorithms

[Figure: partition walkthrough concluded.]

Page 27: Sorting Algorithms > Data Structures & Algorithms

Quicksort: Implementation

void quicksort( int *a, int low, int high ) {
    int pivot;
    /* Termination condition! */
    if ( high > low ) {
        pivot = partition( a, low, high );   /* Divide */
        quicksort( a, low, pivot-1 );        /* Conquer */
        quicksort( a, pivot+1, high );       /* Conquer */
    }
}

Page 28: Sorting Algorithms > Data Structures & Algorithms

Quicksort - Partition

int partition( int *a, int low, int high ) {
    int left, right;
    int pivot_item;
    pivot_item = a[low];                 /* the leftmost item is the pivot */
    left = low;
    right = high;
    while ( left < right ) {
        /* Move left while item <= pivot (stopping at the right end) */
        while( left < high && a[left] <= pivot_item ) left++;
        /* Move right while item > pivot */
        while( a[right] > pivot_item ) right--;
        /* SWAP(a,i,j) exchanges a[i] and a[j] */
        if ( left < right ) SWAP(a,left,right);
    }
    /* right is final position for the pivot */
    a[low] = a[right];
    a[right] = pivot_item;
    return right;
}
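The SWAP macro is used but never defined on these slides; the sketch below gives one possible definition together with a small driver (both are our own additions, and they assume partition() from this page and quicksort() from page 27 are defined in the same file, with the macro placed above them):

#include <stdio.h>

/* One possible SWAP: exchange a[i] and a[j]. Place this above partition(). */
#define SWAP(a,i,j) do { int _t = (a)[i]; (a)[i] = (a)[j]; (a)[j] = _t; } while (0)

int partition( int *a, int low, int high );    /* from this page */
void quicksort( int *a, int low, int high );   /* from page 27 */

int main(void) {
    int a[] = {23, 12, 15, 38, 42, 18, 36, 29, 27};   /* the array traced on the next pages */
    int n = (int)(sizeof a / sizeof a[0]);
    quicksort(a, 0, n - 1);
    for (int i = 0; i < n; i++) printf("%d ", a[i]);  /* 12 15 18 23 27 29 36 38 42 */
    printf("\n");
    return 0;
}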

Page 29: Sorting Algorithms > Data Structures & Algorithms

Quicksort - Partition

This example uses int’s to keep things simple!

23 12 15 38 42 18 36 29 27

low                      high

Any item will do as the pivot; choose the leftmost one!

Page 30: Sorting Algorithms > Data Structures & Algorithms

Quicksort - Partition

Set left and right markers

23 12 15 38 42 18 36 29 27

low                      high     pivot: 23

left                     right

Page 31: Sorting Algorithms > Data Structures & Algorithms

Quicksort - Partition

Move the markers until they cross over

23 12 15 38 42 18 36 29 27

low                      high     pivot: 23

left                     right

Page 32: Sorting Algorithms > Data Structures & Algorithms

Quicksort - Partition

Move the left pointer while it points to items <= pivot

23 12 15 38 42 18 36 29 27

low                      high     pivot: 23

left                     right    Move right similarly

Page 33: Sorting Algorithms > Data Structures & Algorithms

Quicksort - Partition

Swap the two items on the wrong side of the pivot

23 12 15 38 42 18 36 29 27

low                      high

pivot: 23

left                     right

Page 34: Sorting Algorithms > Data Structures & Algorithms

Quicksort - Partition

left and right have swapped over, so stop

23 12 15 18 42 38 36 29 27

low                      high     pivot: 23

left   right

Page 35: Sorting Algorithms > Data Structures & Algorithms

Quicksort - Partition

Finally, swap the pivot and right

23 12 15 18 42 38 36 29 27

low                      high     pivot: 23

left   right

Page 36: Sorting Algorithms > Data Structures & Algorithms

Quicksort - Partition

Return the position of the pivot

18 12 15 23 42 38 36 29 27

low                      high

pivot: 23

right

Page 37: Sorting Algorithms > Data Structures & Algorithms

Quicksort - Conquer

18 12 15 23 42 38 36 29 27

pivot: 23

Recursively sort the left half; recursively sort the right half

Page 38: Sorting Algorithms > Data Structures & Algorithms
Page 39: Sorting Algorithms > Data Structures & Algorithms

Why study Heapsort?

It is a well-known, traditional sorting algorithm you will be expected to know

Heapsort is always O(n log n)

Quicksort is usually O(n log n) but in the worst case slows to O(n^2)

Quicksort is generally faster, but Heapsort is better in time-critical applications

Heapsort is a really cool algorithm!

Page 40: Sorting Algorithms > Data Structures & Algorithms

What is a “heap”? Definitions of heap:

1. A large area of memory from which the programmer can allocate blocks as needed, and deallocate them (or allow them to be garbage collected) when no longer needed

2. A balanced, left-justified binary tree in which no node has a value greater than the value in its parent

These two definitions have little in common

Heapsort uses the second definition

Page 41: Sorting Algorithms > Data Structures & Algorithms

Balanced binary trees

Recall:

The depth of a node is its distance from the root

The depth of a tree is the depth of its deepest node

A binary tree of depth n is balanced if all the nodes at depths 0 through n-2 have two children

[Figure: two balanced trees and one tree that is not balanced, with depths n-2, n-1 and n marked.]

Page 42: Sorting Algorithms > Data Structures & Algorithms

Left-justified binary trees

A balanced binary tree is left-justified if:

all the leaves are at the same depth, or

all the leaves at depth n+1 are to the left of all the nodes at depth n

[Figure: a left-justified tree and a tree that is not left-justified.]

Page 43: Sorting Algorithms > Data Structures & Algorithms

Plan of attack

First, we will learn how to turn a binary tree into a heap

Next, we will learn how to turn a binary tree back into a heap after it has been changed in a certain way

Finally (this is the cool part) we will see how to use these ideas to sort an array

Page 44: Sorting Algorithms > Data Structures & Algorithms

The heap property

A node has the heap property if the value in the node is as large as or larger than the values in its children

All leaf nodes automatically have the heap property

A binary tree is a heap if all nodes in it have the heap property

[Figure: a node 12 with children 8 and 3 has the heap property; a node 12 with children 8 and 12 has the heap property; a node 12 with children 8 and 14 does not have the heap property.]
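A small C sketch of this check (the node type and function names are ours; the slides draw the trees abstractly):

#include <stdbool.h>
#include <stddef.h>

struct node {
    int value;
    struct node *left, *right;
};

/* A node has the heap property if it is at least as large as its children. */
static bool has_heap_property(const struct node *n) {
    if (n == NULL) return true;
    if (n->left  != NULL && n->left->value  > n->value) return false;
    if (n->right != NULL && n->right->value > n->value) return false;
    return true;
}

/* A binary tree is a heap if all nodes in it have the heap property. */
static bool is_heap(const struct node *n) {
    if (n == NULL) return true;
    return has_heap_property(n) && is_heap(n->left) && is_heap(n->right);
}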

Page 45: Sorting Algorithms > Data Structures & Algorithms

siftUp

Given a node that does not have the heap property, you can give it the heap property by exchanging its value with the value of the larger child

This is sometimes called sifting up

Notice that the child may have lost the heap property

[Figure: a node 12 with children 8 and 14 does not have the heap property; after exchanging 12 with its larger child, the node 14 with children 8 and 12 has the heap property.]

Page 46: Sorting Algorithms > Data Structures & Algorithms

Constructing a heap I

A tree consisting of a single node is automatically a heap

We construct a heap by adding nodes one at a time:

Add the node just to the right of the rightmost node in the deepest level

If the deepest level is full, start a new level

[Figure: two examples, each marking where the next node is added.]

Page 47: Sorting Algorithms > Data Structures & Algorithms

Constructing a heap II

Each time we add a node, we may destroy the heap property of its parent node

To fix this, we sift up

But each time we sift up, the value of the topmost node in the sift may increase, and this may destroy the heap property of its parent node

We repeat the sifting up process, moving up in the tree, until either

we reach nodes whose values don't need to be swapped (because the parent is still larger than both children), or

we reach the root
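A minimal C sketch of this insert-and-sift-up process, using the array layout introduced on page 57 (children of index i at 2*i+1 and 2*i+2, so the parent of index i is at (i-1)/2); the helper names are ours:

/* Sift the value at index i up until its parent is at least as large
   (or i is the root). */
static void sift_up(int heap[], int i) {
    while (i > 0) {
        int parent = (i - 1) / 2;
        if (heap[parent] >= heap[i]) break;   /* no swap needed: stop */
        int t = heap[parent];                 /* exchange with the parent and keep going */
        heap[parent] = heap[i];
        heap[i] = t;
        i = parent;
    }
}

/* Build a heap by adding the values one at a time and sifting each new value up. */
static void build_heap(int heap[], const int values[], int n) {
    for (int count = 0; count < n; count++) {
        heap[count] = values[count];          /* add just past the current last node */
        sift_up(heap, count);
    }
}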

Page 48: Sorting Algorithms > Data Structures & Algorithms

[Figure: four steps of building a heap by insertion: (1) add 8; (2) add 10, which sifts up to become the root; (3) add 5; (4) add 12, which sifts up past 8 and 10 to become the root, leaving 10 and 5 as its children and 8 below 10.]

Page 49: Sorting Algorithms > Data Structures & Algorithms

Other children are not affected

The node containing 8 is not affected because its parent gets larger, not smaller

The node containing 5 is not affected because its parent gets larger, not smaller

The node containing 8 is still not affected because, although its parent got smaller, its parent is still greater than it was originally

[Figure: 14 is added as a child of 10 in the heap 12 / 10 5 / 8, then sifts up past 10 and then past 12: 12 / 10 5 / 8 14 → 12 / 14 5 / 8 10 → 14 / 12 5 / 8 10 (levels separated by "/").]

Page 50: Sorting Algorithms > Data Structures & Algorithms

A sample heap

Here's a sample binary tree after it has been heapified

Notice that heapified does not mean sorted

Heapifying does not change the shape of the binary tree; this binary tree is balanced and left-justified because it started out that way

[Figure: the sample heap; in array form (see page 57) it is 25 22 17 19 22 14 15 18 14 21 3 9 11.]

Page 51: Sorting Algorithms > Data Structures & Algorithms

Removing the root

Notice that the largest number is now in the root

Suppose we discard the root:

How can we fix the binary tree so it is once again balanced and left-justified?

Solution: remove the rightmost leaf at the deepest level and use it for the new root

[Figure: the root 25 is discarded and the rightmost leaf at the deepest level, 11, is moved up to become the new root.]

Page 52: Sorting Algorithms > Data Structures & Algorithms

The reHeap method I

Our tree is balanced and left-justified, but no longer a heap

However, only the root lacks the heap property

We can siftUp() the root

After doing this, one and only one of its children may have lost the heap property

[Figure: the tree with 11 at the root, before sifting.]

Page 53: Sorting Algorithms > Data Structures & Algorithms

The reHeap method II

Now the left child of the root (still the number 11) lacks the heap property

We can siftUp() this node

After doing this, one and only one of its children may have lost the heap property

[Figure: 22 has moved up to the root and 11 is now its left child.]

Page 54: Sorting Algorithms > Data Structures & Algorithms

The reHeap method III

Now the right child of the left child of the root (still the number 11) lacks the heap property:

We can siftUp() this node

After doing this, one and only one of its children may have lost the heap property, but it doesn't, because it's a leaf

[Figure: the tree with 11 now the right child of the left child of the root.]

Page 55: Sorting Algorithms > Data Structures & Algorithms

The reHeap method IV

Our tree is once again a heap, because every node in it has the heap property

Once again, the largest (or a largest) value is in the root

We can repeat this process until the tree becomes empty

This produces a sequence of values in order largest to smallest

[Figure: the reheaped tree, with 22 at the root.]

Page 56: Sorting Algorithms > Data Structures & Algorithms

Sorting

What do heaps have to do with sorting an array? Here's the neat part:

Because the binary tree is balanced and left-justified, it can be represented as an array

All our operations on binary trees can be represented as operations on arrays

To sort:
   heapify the array;
   while the array isn't empty {
      remove and replace the root;
      reheap the new root node;
   }
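A minimal C sketch of this whole procedure on an int array (our own illustration), using the index mapping from page 57; the heap is built by repeated sift-up as on pages 46-48, and the reheap step repeatedly exchanges a node with its larger child, which is what the successive siftUp() calls on pages 52-54 amount to:

#include <stdio.h>

static void swap_items(int a[], int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

/* Sift the value at index i up until its parent is at least as large. */
static void sift_up(int a[], int i) {
    while (i > 0 && a[(i - 1) / 2] < a[i]) {
        swap_items(a, (i - 1) / 2, i);
        i = (i - 1) / 2;
    }
}

/* Reheap: only the root may lack the heap property; exchange it with its
   larger child until both children are smaller or it reaches a leaf. */
static void reheap(int a[], int n) {
    int i = 0;
    for (;;) {
        int left = 2 * i + 1, right = 2 * i + 2, largest = i;
        if (left  < n && a[left]  > a[largest]) largest = left;
        if (right < n && a[right] > a[largest]) largest = right;
        if (largest == i) break;              /* heap property restored */
        swap_items(a, i, largest);
        i = largest;
    }
}

static void heap_sort(int a[], int n) {
    /* Heapify the array: add the elements one at a time, sifting each up. */
    for (int i = 1; i < n; i++) sift_up(a, i);
    /* While the heap isn't empty: remove and replace the root, then reheap. */
    for (int last = n - 1; last > 0; last--) {
        swap_items(a, 0, last);               /* the largest value moves to the end */
        reheap(a, last);                      /* the heap now occupies a[0..last-1] */
    }
}

int main(void) {
    int a[] = {25, 22, 17, 19, 22, 14, 15, 18, 14, 21, 3, 9, 11};   /* page 57's heap */
    int n = (int)(sizeof a / sizeof a[0]);
    heap_sort(a, n);
    for (int i = 0; i < n; i++) printf("%d ", a[i]);   /* prints the values in ascending order */
    printf("\n");
    return 0;
}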

Page 57: Sorting Algorithms > Data Structures & Algorithms

Mapping into an array

Notice:

The left child of index i is at index 2*i+1

The right child of index i is at index 2*i+2

Example: the children of node 3 (19) are 7 (18) and 8 (14)

[Figure: the sample heap drawn as a tree, with each node labeled by its array index.]

25 22 17 19 22 14 15 18 14 21  3  9 11
 0  1  2  3  4  5  6  7  8  9 10 11 12
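A quick check of this example in C, using the array shown above (the code is our own):

#include <stdio.h>

int main(void) {
    int heap[] = {25, 22, 17, 19, 22, 14, 15, 18, 14, 21, 3, 9, 11};
    int i = 3;                                    /* node 3 holds 19 */
    printf("children of node %d (%d): node %d (%d) and node %d (%d)\n",
           i, heap[i], 2*i+1, heap[2*i+1], 2*i+2, heap[2*i+2]);
    /* prints: children of node 3 (19): node 7 (18) and node 8 (14) */
    return 0;
}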

Page 58: Sorting Algorithms > Data Structures & Algorithms

Removing and replacing the root

The "root" is the first element in the array

The "rightmost node at the deepest level" is the last element

Swap them...

...And pretend that the last element in the array no longer exists; that is, the "last index" is now 11 (the position holding 9)

Before the swap:
25 22 17 19 22 14 15 18 14 21  3  9 11
 0  1  2  3  4  5  6  7  8  9 10 11 12

After the swap:
11 22 17 19 22 14 15 18 14 21  3  9 25
 0  1  2  3  4  5  6  7  8  9 10 11 12

Page 59: Sorting Algorithms > Data Structures & Algorithms

Reheap and repeat

Reheap the root node (index 0, containing 11)...

...And again, remove and replace the root node

Remember, though, that the "last" array index is changed

Repeat until the last becomes first, and the array is sorted!

22 22 17 19 21 14 15 18 14 11  3  9 25
 0  1  2  3  4  5  6  7  8  9 10 11 12

 9 22 17 19 22 14 15 18 14 21  3 22 25
 0  1  2  3  4  5  6  7  8  9 10 11 12

11 22 17 19 22 14 15 18 14 21  3  9 25
 0  1  2  3  4  5  6  7  8  9 10 11 12

Page 60: Sorting Algorithms > Data Structures & Algorithms

Analysis I

Here's how the algorithm starts:

heapify the array;

Heapifying the array: we add each of n nodes

Each node has to be sifted up, possibly as far as the root

Since the binary tree is perfectly balanced, sifting up a single node takes O(log n) time

Since we do this n times, heapifying takes n*O(log n) time, that is, O(n log n) time

Page 61: Sorting Algorithms > Data Structures & Algorithms

Analysis II

Here's the rest of the algorithm:

while the array isn't empty {
   remove and replace the root;
   reheap the new root node;
}

We do the while loop n times (actually, n-1 times), because we remove one of the n nodes each time

Removing and replacing the root takes O(1) time

Therefore, the total time is n times however long the reheap method takes

Page 62: Sorting Algorithms > Data Structures & Algorithms

Analysis III

To reheap the root node, we have to follow one path from the root to a leaf node (and we might stop before we reach a leaf)

The binary tree is perfectly balanced, therefore this path is O(log n) long

And we only do O(1) operations at each node, therefore reheaping takes O(log n) time

Since we reheap inside a while loop that we do n times, the total time for the while loop is n*O(log n), or O(n log n)

Page 63: Sorting Algorithms > Data Structures & Algorithms

Analysis IV

Here's the algorithm again:

heapify the array;
while the array isn't empty {
   remove and replace the root;
   reheap the new root node;
}

We have seen that heapifying takes O(n log n) time

The while loop takes O(n log n) time

The total time is therefore O(n log n) + O(n log n)

This is the same as O(n log n) time

Page 64: Sorting Algorithms > Data Structures & Algorithms