i206: Lecture 8: Sorting and Analysis of Algorithms


Page 1: i206: Lecture 8: Sorting and Analysis of Algorithms


i206: Lecture 8: Sorting and Analysis of Algorithms

Marti Hearst, Spring 2012

Page 2: i206: Lecture 8: Sorting and Analysis of Algorithms


Definition of Big-Oh

A running time is O(g(n)) if there exist constants n0 > 0 and c > 0 such that for all problem sizes n > n0, the running time for a problem of size n is at most c·g(n).

In other words, c·g(n) is an upper bound on the running time for sufficiently large n.

http://www.cs.dartmouth.edu/~farid/teaching/cs15/cs5/lectures/0519/0519.html

[Figure: plot showing c·g(n) as an upper bound on the running time for n > n0]

Example: the function f(n) = 8n − 2 is O(n). The running time of algorithm arrayMax is O(n).
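To make the arrayMax claim concrete, here is a minimal linear-scan sketch (the Python rendering and the exact signature are my own assumptions; the slide only names the algorithm). It does a constant amount of work per element, so its running time is O(n).

```python
def array_max(data):
    """Return the largest element of a non-empty list.

    One comparison per remaining element, so the running time
    grows linearly with len(data), i.e. it is O(n).
    """
    current_max = data[0]
    for value in data[1:]:       # n - 1 iterations
        if value > current_max:  # constant work per iteration
            current_max = value
    return current_max

print(array_max([3, 8, 1, 7, 2]))  # -> 8
```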

Page 3: i206: Lecture 8: Sorting and Analysis of Algorithms


More formally
• Let f(n) and g(n) be functions mapping nonnegative integers to real numbers.
• f(n) is O(g(n)) if there exist positive constants n0 and c such that for all n >= n0, f(n) <= c*g(n).
• Other ways to say this:
  – f(n) is order g(n)
  – f(n) is big-Oh of g(n)
  – f(n) is Oh of g(n)
  – f(n) ∈ O(g(n)) (set notation)
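As a worked instance of this definition (my own illustration, not on the slide), the earlier example f(n) = 8n − 2 satisfies it with, for example, c = 8 and n0 = 1:

```latex
% Check of the Big-Oh definition for f(n) = 8n - 2 against g(n) = n.
% Constants chosen for illustration: c = 8, n_0 = 1 (other choices also work).
\[
f(n) \;=\; 8n - 2 \;\le\; 8n \;=\; c \cdot g(n)
\qquad \text{for all } n \ge n_0 = 1 .
\]
```

So f(n) is O(n) by the definition above.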

Page 4: i206: Lecture 8: Sorting and Analysis of Algorithms


Analysis Example: Phonebook
• Given:
  – A physical phone book, organized in alphabetical order
  – A name you want to look up
  – An algorithm in which you search through the book sequentially, from first page to last (a sketch of this search follows below)
• What is the order of:
  – The best case running time?
  – The worst case running time?
  – The average case running time?
• What is:
  – A better algorithm?
  – The worst case running time for this algorithm?
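A minimal sketch of the sequential search described above, assuming the phonebook is a Python list of (name, number) pairs; the data layout and names are my own, not from the slide. In the best case the name is the first entry (O(1)); in the worst case it is last or absent (O(n)); on average about half the book is scanned, which is still O(n).

```python
def sequential_lookup(phonebook, target_name):
    """Scan the phonebook front to back until the name is found.

    Best case: target is the first entry  -> O(1) comparisons.
    Worst case: target is last or missing -> O(n) comparisons.
    """
    for name, number in phonebook:
        if name == target_name:
            return number
    return None  # name not in the book

book = [("Ada", "555-0100"), ("Grace", "555-0101"), ("Marti", "555-0102")]
print(sequential_lookup(book, "Marti"))  # -> 555-0102
```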

Page 5: i206: Lecture 8: Sorting and Analysis of Algorithms


Analysis Example (Phonebook)
• This better algorithm is called Binary Search
• What is its running time?
  – First you look in the middle of n elements
  – Then you look in the middle of n/2 = (1/2)·n elements
  – Then you look in the middle of (1/2)·(1/2)·n elements
  – …
  – Continue until there is only 1 element left
  – Say you did this m times: (1/2)·(1/2)·(1/2)·…·n
  – Then the number of repetitions is the smallest integer m such that (1/2)^m · n ≤ 1

Page 6: i206: Lecture 8: Sorting and Analysis of Algorithms


Analyzing Binary Search

– In the worst case, the number of repetitions is the smallest integer m such that (1/2)^m · n ≤ 1

– We can rewrite this as follows:

  (1/2)^m · n ≤ 1
  n ≤ 2^m          (multiply both sides by 2^m)
  log2 n ≤ m       (take the log of both sides)

– Since m is the worst-case number of repetitions, the algorithm is O(log n)
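A minimal sketch of binary search over a sorted Python list (my own rendering, not from the slides): each iteration halves the remaining range, so at most about log2 n iterations are needed, matching the O(log n) bound derived above.

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent.

    Each iteration halves the search range, so the loop runs
    at most about log2(n) times: O(log n) in the worst case.
    """
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1   # discard the left half
        else:
            high = mid - 1  # discard the right half
    return -1

print(binary_search([2, 5, 8, 12, 16, 23, 38], 16))  # -> 4
```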

Page 7: i206: Lecture 8: Sorting and Analysis of Algorithms


Sorting

www.naturalbrands.com

Page 8: i206: Lecture 8: Sorting and Analysis of Algorithms


Sorting

Putting collections of things in order
– Numerical order
– Alphabetical order
– An order defined by the domain of use
  • E.g., color, yumminess …

http://www.hugkiss.com/platinum/candy.shtml

Page 9: i206: Lecture 8: Sorting and Analysis of Algorithms


Bubble Sort O(n²)

http://www.youtube.com/watch?v=P00xJgWzz2c

The dark gray represents a comparison and the white represents a swap.
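A minimal bubble sort sketch in Python (my own illustration, independent of the linked animation): the nested loops give roughly n² comparisons in the worst case, hence O(n²).

```python
def bubble_sort(items):
    """Sort a list in place by repeatedly swapping adjacent
    out-of-order pairs.  The nested loops perform on the order
    of n^2 comparisons, so the running time is O(n^2)."""
    n = len(items)
    for i in range(n - 1):
        for j in range(n - 1 - i):       # last i items are already in place
            if items[j] > items[j + 1]:  # comparison
                items[j], items[j + 1] = items[j + 1], items[j]  # swap
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # -> [1, 2, 4, 5, 8]
```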

Page 10: i206: Lecture 8: Sorting and Analysis of Algorithms


Insertion Sort O(n²)

Brookshear Figure 5.19

Brookshear Figure 5.11

http://xoax.net/comp/sci/algorithms/Lesson2.php
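A minimal insertion sort sketch (my own Python rendering, not reproduced from Brookshear's figures or the linked lesson): each element is inserted into the already-sorted prefix, which can require up to n shifts per element, giving O(n²) in the worst case.

```python
def insertion_sort(items):
    """Sort a list in place by growing a sorted prefix.

    For each element, shift larger elements of the sorted prefix
    right and insert it in place.  Worst case (reverse-sorted
    input) needs about n^2 / 2 shifts: O(n^2)."""
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        while j >= 0 and items[j] > key:  # shift larger elements right
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = key
    return items

print(insertion_sort([5, 1, 4, 2, 8]))  # -> [1, 2, 4, 5, 8]
```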

Page 11: i206: Lecture 8: Sorting and Analysis of Algorithms


Merge Sort O(n log n)

https://en.wikipedia.org/wiki/Mergesort

Page 12: i206: Lecture 8: Sorting and Analysis of Algorithms


Merge Sort O(n log n)

http://xoax.net/comp/sci/algorithms/Lesson3.php

Page 13: i206: Lecture 8: Sorting and Analysis of Algorithms


Merge Sort O(n log n)

http://xoax.net/comp/sci/algorithms/Lesson2.php

Page 14: i206: Lecture 8: Sorting and Analysis of Algorithms


Merge Sort O(n log n)

http://xoax.net/comp/sci/algorithms/Lesson2.php

Page 15: i206: Lecture 8: Sorting and Analysis of Algorithms


Merge Sort O(n log n)

Merge sort has average- and worst-case performance of O(n log n).

Based on how the algorithm works, the recurrence T(n) = 2T(n/2) + n describes the running time, and it solves to T(n) = O(n log n).

(Apply the algorithm to two lists of half the size of the original list, and add the n steps taken to merge the resulting two lists.)

In the worst case, the number of comparisons merge sort makes is equal to or slightly smaller than n⌈log n⌉ − 2^⌈log n⌉ + 1, which is between (n log n − n + 1) and (n log n + n + O(log n)).
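A minimal top-down merge sort sketch in Python (my own, not taken from the linked lessons): the list is split in half, each half is sorted recursively, and the two sorted halves are merged in linear time, which is exactly the T(n) = 2T(n/2) + n pattern above.

```python
def merge_sort(items):
    """Return a new sorted list.  Splitting gives two subproblems of
    size n/2 and merging costs O(n), so T(n) = 2T(n/2) + n = O(n log n)."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # T(n/2)
    right = merge_sort(items[mid:])   # T(n/2)

    # Merge the two sorted halves in O(n) time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # -> [3, 9, 10, 27, 38, 43, 82]
```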

Page 16: i206: Lecture 8: Sorting and Analysis of Algorithms


Substrings of a String

Say you have a long string, such as "supercalifragilisticexpialidocious".

Say you want to print out all the substrings of this string, starting from the left, and always retaining the substring you have printed out so far, e.g. "s", "su", "sup", "supe", and so on.

Assuming this string is of length n, how many substrings will you print out in terms of big-O notation?

If you printed out all possible substrings, starting at any location, and of any length, how many are there?
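A minimal sketch (my own) of the two printing tasks posed above, using the string from the slide: first the left-anchored prefixes, then all substrings of any starting position and length.

```python
word = "supercalifragilisticexpialidocious"

# Prefixes: "s", "su", "sup", ... always retaining what was printed so far.
for end in range(1, len(word) + 1):
    print(word[:end])

# All possible substrings: any starting position, any length.
for start in range(len(word)):
    for end in range(start + 1, len(word) + 1):
        print(word[start:end])
```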

Page 17: i206: Lecture 8: Sorting and Analysis of Algorithms


Traveling Salesman Problem

http://xkcd.com/399/

Page 18: i206: Lecture 8: Sorting and Analysis of Algorithms


Summary: Analysis of Algorithms
• A method for determining, in an abstract way, the asymptotic running time of an algorithm
  – Here asymptotic means as n gets very large
• Useful for comparing algorithms
• Useful also for determining tractability
  – Meaning, a way to determine whether the problem is intractable (practically impossible to solve for large inputs) or not
  – Exponential time algorithms are usually intractable.

• We’ll revisit these ideas throughout the rest of the course.