The Efficiency of Algorithms Chapter 7


Page 1: The Efficiency of Algorithms

The Efficiency of Algorithms

Chapter 7

Page 2: The Efficiency of Algorithms

Chapter Contents

• Motivation
• Measuring an Algorithm's Efficiency
• Big Oh Notation
• Formalities
• Picturing Efficiency
• The Efficiency of Implementations of the ADT List
  • The Array-Based Implementation
  • The Linked Implementation
  • Comparing Implementations

Page 3: The Efficiency of Algorithms

Motivation

Even a simple multiplication program can be noticeably inefficient. Consider 7562 x 423: multiplication is equivalent to repeated addition.

long firstOperand = 7562;
long secondOperand = 423;
long product = 0;
for (; secondOperand > 0; secondOperand--)
    product = product + firstOperand;
System.out.println(product);

Page 4: The Efficiency of Algorithms

Motivation

• When the 423 is changed to 100,000,000, there is a significant delay in seeing the result.
• When it is changed to 1,000,000,000, an even longer delay is noticed.
• You may instead want to add one billion to zero 7562 times, looping over the smaller operand.
• This may not always be faster, since one operand is not always much smaller than the other.
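The refinement above can be sketched as follows (an illustrative implementation, not code from the slides, that loops over whichever operand is smaller):

```java
public class RepeatedAddition {
    // Multiply by repeated addition, looping over the smaller operand
    // so the loop runs min(a, b) times instead of always b times.
    static long multiply(long a, long b) {
        long smaller = Math.min(a, b);
        long larger = Math.max(a, b);
        long product = 0;
        for (long i = 0; i < smaller; i++) {
            product += larger;   // one addition per iteration
        }
        return product;
    }

    public static void main(String[] args) {
        System.out.println(multiply(7562, 423)); // 3198726
    }
}
```

As the slide notes, this helps only when one operand is much smaller than the other.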

Page 5: The Efficiency of Algorithms

5

Motivation

7562 * 423 = 7562 * (400 + 20 + 3)
           = 7562 * 400 + 7562 * 20 + 7562 * 3
           = 756,200 * 4 + 75,620 * 2 + 7,562 * 3

Each term is the sum of d integers, where d is a digit in the second operand. This code executes faster than the previous code. Even a simple program can be very inefficient depending on the implementation; this should convince you that a program's efficiency does matter.
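The digit-by-digit decomposition can be sketched in Java (an illustrative version of the slide's idea; the method name and structure are mine, not the textbook's):

```java
public class DigitMultiply {
    // Multiply using the decomposition 7562 * 423
    //   = 7562*3 + 75620*2 + 756200*4.
    // Each partial product needs at most 9 additions, so the total work
    // is proportional to the number of digits of b, not to b itself.
    static long multiply(long a, long b) {
        long product = 0;
        long shifted = a;              // a * 10^position
        while (b > 0) {
            long digit = b % 10;       // current decimal digit of b
            for (long i = 0; i < digit; i++) {
                product += shifted;    // add 'shifted' digit times
            }
            shifted *= 10;             // move to the next place value
            b /= 10;
        }
        return product;
    }

    public static void main(String[] args) {
        System.out.println(multiply(7562, 423)); // 3198726
    }
}
```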

Page 6: The Efficiency of Algorithms

Measuring Algorithm Efficiency

• An algorithm has both time and space requirements, collectively called its complexity.
• Types of complexity: space complexity and time complexity.
• Analysis of algorithms: the measuring of either the time or the space complexity of an algorithm.
• We measure the time complexity, since it is usually the more important.
• We cannot compute the actual time for an algorithm. Instead we give a function of the problem size that is directly proportional to the time requirement: the growth-rate function.
• This function measures how the time requirement grows as the problem size grows. We usually measure worst-case time.

Page 7: The Efficiency of Algorithms

Measuring Algorithm Efficiency

Three algorithms for computing 1 + 2 + … + n for an integer n > 0.
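The transcript omits the figure showing the three algorithms. Representative versions, reconstructed under the usual textbook pattern (a single loop, a nested loop, and the closed-form formula) rather than copied from the slide, might look like:

```java
public class SumAlgorithms {
    // Algorithm A: a single loop, n additions -> O(n).
    static long sumA(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++) sum += i;
        return sum;
    }

    // Algorithm B: builds each i by counting up to it,
    // 1 + 2 + ... + n additions in total -> O(n^2).
    static long sumB(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= i; j++)
                sum += 1;
        return sum;
    }

    // Algorithm C: the closed form n(n+1)/2, constant work -> O(1).
    static long sumC(int n) {
        return (long) n * (n + 1) / 2;
    }

    public static void main(String[] args) {
        // All three agree on the answer; they differ only in cost.
        System.out.println(sumA(100) + " " + sumB(100) + " " + sumC(100)); // 5050 5050 5050
    }
}
```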

Page 8: The Efficiency of Algorithms


Measuring Algorithm Efficiency

The number of operations required by the algorithms

Page 9: The Efficiency of Algorithms


Measuring Algorithm Efficiency

The number of operations required by the algorithms as a function of n

Page 10: The Efficiency of Algorithms

Big Oh Notation

• Computer scientists use a notation to represent an algorithm's complexity.
• To say "Algorithm A has a worst-case time requirement proportional to n", we say A is O(n), read "Big Oh of n" or "order of at most n".
• For the other two algorithms: Algorithm B is O(n^2), and Algorithm C is O(1).

Page 11: The Efficiency of Algorithms

Big Oh Notation

Typical growth-rate functions, evaluated at increasing values of n, grow in magnitude from left to right:

O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n)

When analyzing the time efficiency of an algorithm, consider larger problems; for small problems, the difference between execution times is usually insignificant.

Page 12: The Efficiency of Algorithms

Big Oh Notation

The number of digits in an integer n is compared with the integer portion of log10 n.

O(log_a n) = O(log_b n) for any bases a, b > 1, since logarithms of different bases differ only by a constant factor.

Page 13: The Efficiency of Algorithms

Formalities

Formal mathematical definition of Big Oh: an algorithm's time requirement f(n) is of order at most g(n), written

f(n) = O(g(n)),

if a positive real number c and a positive integer N exist such that

f(n) ≤ c·g(n) for all n ≥ N.

c·g(n) is then an upper bound on f(n) when n is sufficiently large; Big Oh provides an upper bound on a function's growth rate.

Page 14: The Efficiency of Algorithms


Formalities

An illustration of the definition of Big Oh

Page 15: The Efficiency of Algorithms

Example

Show that f(n) = 5n + 3 = O(n).

Take g(n) = n, c = 6, and N = 3; then f(n) ≤ 6·g(n) for all n ≥ 3.

Why don't we let g(n) = n^2? With g(n) = n^2, c = 8, and N = 1, the definition is also satisfied, so the conclusion f(n) = O(n^2) is correct, but the bound is not as tight as possible. You want the upper bound to be as small as possible, and you want it to involve simple functions.
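The chosen constants can be verified in one line:

```latex
f(n) = 5n + 3 \le 5n + n = 6n = 6\,g(n) \qquad \text{for all } n \ge 3,
```

so c = 6 and N = 3 witness f(n) = O(n).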

Page 16: The Efficiency of Algorithms

Formalities

The following identities hold for Big Oh notation:
• O(k · f(n)) = O(f(n)) for any constant k
• O(f(n)) + O(g(n)) = O(f(n) + g(n))
• O(f(n)) · O(g(n)) = O(f(n) · g(n))

By using these identities and ignoring smaller terms in a growth-rate function, you can determine the order of complexity with little effort. For example:

O(4n^2 + 50n - 10) = O(4n^2) = O(n^2)

Page 17: The Efficiency of Algorithms

Picturing Efficiency

The body of the loop requires a constant amount of time, O(1); repeated n times, this gives an O(n) algorithm.
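The loop pictured on this slide is not reproduced in the transcript; a representative O(n) loop (a sketch, not the original figure's code) is:

```java
public class LinearLoop {
    // Sum 0..n-1 with a single loop: the body is O(1) and runs
    // n times, so the whole loop is O(n).
    static long run(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) {
            sum += i;            // constant-time body
        }
        return sum;
    }
}
```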

Page 18: The Efficiency of Algorithms

Picturing Efficiency

An O(n^2) algorithm.
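A representative O(n^2) pattern (a sketch, since the slide's figure is not in the transcript):

```java
public class QuadraticLoop {
    // Two nested loops, each running n times: the O(1) body executes
    // n * n times, so the algorithm is O(n^2).
    static long run(int n) {
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                count++;         // constant-time body
        return count;
    }
}
```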

Page 19: The Efficiency of Algorithms

Picturing Efficiency

Another O(n^2) algorithm.
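The "another" variant is typically a dependent inner loop; a sketch (again, the original figure is not reproduced in the transcript):

```java
public class TriangularLoop {
    // The inner loop depends on the outer index: the body runs
    // 1 + 2 + ... + n = n(n+1)/2 times, which is still O(n^2).
    static long run(int n) {
        long count = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= i; j++)
                count++;
        return count;
    }
}
```

Halving the constant factor does not change the order: O(n(n+1)/2) = O(n^2).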

Page 20: The Efficiency of Algorithms

Question?

for i = 1 to n
{
    for j = 1 to 5
        sum = sum + 1;
}

Using Big Oh notation, what is the order of the computation time?

Page 21: The Efficiency of Algorithms


Get a Feel for Growth-rate Functions

The effect of doubling the problem size on an algorithm's time requirement.

Page 22: The Efficiency of Algorithms

Get a Feel for Growth-rate Functions

The time to process a problem of size one million, by algorithms of various orders, at a rate of one million operations per second.

Page 23: The Efficiency of Algorithms

Comments on Efficiency

A programmer can use an O(n^2), O(n^3), or O(2^n) algorithm as long as the problem size is small. At one million operations per second it would take 1 second:
• for a problem size of 1000 with O(n^2)
• for a problem size of 100 with O(n^3)
• for a problem size of 20 with O(2^n)

Page 24: The Efficiency of Algorithms

Efficiency of Implementations of ADT List

For the array-based implementation:
• Add to end of list: O(1)
• Add to list at a given position: O(n)

For the linked implementation:
• Add to end of list: O(n) without a tail reference, O(1) with one
• Add to list at a given position: O(n)
• Retrieving an entry: O(n)
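Why adding at a given position is O(n) for the array-based implementation can be sketched as follows (a simplified fragment; the names `list`, `numberOfEntries`, and `add` are illustrative, not the textbook's exact code):

```java
public class ArrayInsert {
    // Fixed-capacity backing array; positions are 1-based as in the ADT list.
    static int[] list = new int[10];
    static int numberOfEntries = 0;

    // Insert newEntry at newPosition by shifting later entries one
    // slot toward the end: up to n moves, so O(n) in the worst case.
    static void add(int newPosition, int newEntry) {
        for (int index = numberOfEntries; index >= newPosition; index--) {
            list[index] = list[index - 1];   // shift toward the end
        }
        list[newPosition - 1] = newEntry;
        numberOfEntries++;
    }
}
```

Adding at the end skips the shifting loop entirely, which is why that case is O(1).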

Page 25: The Efficiency of Algorithms


Comparing Implementations

The time efficiencies of the ADT list operations for two implementations, expressed in Big Oh notation

Page 26: The Efficiency of Algorithms

Choose Implementation for ADT

Consider the operations that your application requires. If you use a particular operation frequently, its implementation has to be efficient. Conversely, if you rarely use an operation, you can afford to use one that has an inefficient implementation.

Page 27: The Efficiency of Algorithms

Exercises

Using Big Oh notation, indicate the time requirement of each of the following tasks in the worst case. Describe any assumptions that you make.

a. After arriving at a party, you shake hands with each person there.
b. Each person in a room shakes hands with everyone else in the room.
c. You climb a flight of stairs.
d. You slide down the banister.
e. After entering an elevator, you press a button to choose a floor.
f. You ride the elevator from the ground floor up to the nth floor.
g. You read a book twice.

Page 28: The Efficiency of Algorithms

Exercises

Suppose that your implementation of a particular algorithm appears in Java as follows:

for (int pass = 1; pass <= n; pass++)
{
    for (int index = 0; index < n; index++)
    {
        for (int count = 1; count < 10; count++)
        {
            . . .
        } // end for
    } // end for
} // end for

The algorithm involves an array of n items. The previous code shows the only repetition in the algorithm, but it does not show the computations that occur within the loops. These computations, however, are independent of n. What is the order of the algorithm?