
2-1

2 Algorithms

• Principles.

• Efficiency.

• Complexity.

• O-notation.

• Recursion.

© 2001, D.A. Watt and D.F. Brown


2-2

Principles (1)

• An algorithm is a step-by-step procedure for solving a stated problem.

• The algorithm will be performed by a processor (which may be human, mechanical, or electronic).

• The algorithm must be expressed in steps that the processor is capable of performing.

• The algorithm must eventually terminate.


2-3

Principles (2)

• The algorithm must be expressed in some language that the processor “understands”. (But the underlying procedure is independent of the particular language chosen.)

• The stated problem must be solvable, i.e., capable of solution by a step-by-step procedure.


2-4

Efficiency

• Given several algorithms to solve the same problem, which algorithm is “best”?

• Given an algorithm, is it feasible to use it at all? In other words, is it efficient enough to be usable in practice?

• How much time does the algorithm require?

• How much space (memory) does the algorithm require?

• In general, both time and space requirements depend on the algorithm’s input (typically the “size” of the input).


2-5

Example 1: efficiency

• Hypothetical profile of two sorting algorithms:

• Algorithm B’s time grows more slowly than A’s.

(Graph: time (ms) against the number of items to be sorted, n, for Algorithm A and Algorithm B.)

2-6

Efficiency: measuring time

• Measure time in seconds?

+ is useful in practice

– depends on language, compiler, and processor.

• Count algorithm steps?

+ does not depend on compiler or processor

– depends on granularity of steps.

• Count characteristic operations? (e.g., arithmetic ops in math algorithms, comparisons in searching algorithms)

+ depends only on the algorithm itself

+ measures the algorithm’s intrinsic efficiency.


2-7

Remedial Mathematics

Powers

• Consider a number b and a non-negative integer n. Then b to the power of n (written b^n) is the product of n copies of b:

b^n = b × b × … × b (n copies)

E.g.: b^3 = b × b × b
b^2 = b × b
b^1 = b
b^0 = 1

2-8

Remedial Mathematics

Logarithms

• Consider a positive number y. Then the logarithm of y to the base 2 (written log2 y) is the number of copies of 2 that must be multiplied together to equal y.

• If y is a power of 2, log2 y is an integer:

E.g.: log2 1 = 0 since 2^0 = 1
log2 2 = 1 since 2^1 = 2
log2 4 = 2 since 2^2 = 4
log2 8 = 3 since 2^3 = 8

• If y is not a power of 2, log2 y is fractional:

E.g.: log2 5 ≈ 2.32
log2 7 ≈ 2.81

2-9

Remedial Mathematics

Logarithm laws

• log2 (2^n) = n

• log2 (xy) = log2 x + log2 y

• log2 (x/y) = log2 x – log2 y

since the no. of 2s multiplied to make xy is the sum of the no. of 2s multiplied to make x and the no. of 2s multiplied to make y
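These laws can be checked numerically (an illustrative sketch, not from the slides; Java has no built-in base-2 logarithm, so log2 is derived via the change-of-base rule, and the class name LogLaws is made up):

```java
public class LogLaws {
    // Change-of-base rule: log2 x = ln x / ln 2.
    static double log2(double x) {
        return Math.log(x) / Math.log(2);
    }

    public static void main(String[] args) {
        double x = 8, y = 4;
        System.out.println(log2(Math.pow(2, 10)));  // log2(2^10), approximately 10
        System.out.println(log2(x * y));            // approximately log2 x + log2 y = 3 + 2 = 5
        System.out.println(log2(x / y));            // approximately log2 x - log2 y = 3 - 2 = 1
    }
}
```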


2-10

Remedial Mathematics

Logarithms example (1)

• How many times must we halve the value of n (discarding any remainders) to reach 1?

• Suppose that n is a power of 2:

E.g.: 8 → 4 → 2 → 1 (8 must be halved 3 times)
16 → 8 → 4 → 2 → 1 (16 must be halved 4 times)

If n = 2^m, n must be halved m times.

• Suppose that n is not a power of 2:

E.g.: 9 → 4 → 2 → 1 (9 must be halved 3 times)
15 → 7 → 3 → 1 (15 must be halved 3 times)

If 2^m < n < 2^(m+1), n must be halved m times.

2-11

Remedial Mathematics

Logarithms example (2)

• In general, n must be halved m times if:

2^m ≤ n < 2^(m+1)

i.e., log2 (2^m) ≤ log2 n < log2 (2^(m+1))
i.e., m ≤ log2 n < m + 1
i.e., m = floor(log2 n).

• The floor of x (written floor(x) or ⌊x⌋) is the largest integer not greater than x.

• Conclusion: n must be halved floor(log2 n) times to reach 1.

• Also: n must be halved floor(log2 n) + 1 times to reach 0.
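The conclusion can be verified by a direct count (an illustrative sketch, not from the slides; the class and method names are made up):

```java
public class Halving {
    // Count how many times n must be halved (discarding remainders) to reach 1.
    static int halvingsTo1(int n) {
        int count = 0;
        while (n > 1) {
            n /= 2;
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(halvingsTo1(8));   // 3
        System.out.println(halvingsTo1(16));  // 4
        System.out.println(halvingsTo1(9));   // 3
        System.out.println(halvingsTo1(15));  // 3
    }
}
```

For n = 8, 16, 9, and 15 the counts are 3, 4, 3, and 3, matching floor(log2 n) in each case.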


2-12

Example 2: power algorithms (1)

• Simple power algorithm:

To compute b^n:

1. Set p to 1.
2. For i = 1, …, n, repeat:
   2.1. Multiply p by b.
3. Terminate with answer p.

2-13

Example 2 (2)

• Analysis (counting multiplications):

Step 2.1 performs a multiplication. This step is repeated n times.

No. of multiplications = n

2-14

Example 2 (3)

• Implementation in Java:

static int power (int b, int n) {
// Return b^n (where n is non-negative).
  int p = 1;
  for (int i = 1; i <= n; i++)
    p *= b;
  return p;
}

2-15

Example 2 (4)

• Idea: b^1000 = b^500 × b^500. If we know b^500, we can compute b^1000 with only 1 more multiplication!

• Smart power algorithm:

To compute b^n:

1. Set p to 1, set q to b, and set m to n.
2. While m > 0, repeat:
   2.1. If m is odd, multiply p by q.
   2.2. Halve m (discarding any remainder).
   2.3. Multiply q by itself.
3. Terminate with answer p.

2-16

Example 2 (5)

• Analysis (counting multiplications):

Steps 2.1–2.3 together perform at most 2 multiplications. They are repeated as often as we must halve the value of n (discarding any remainder) until it reaches 0, i.e., floor(log2 n) + 1 times.

Max. no. of multiplications = 2 (floor(log2 n) + 1) = 2 floor(log2 n) + 2
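This bound can be checked by instrumenting the algorithm with a multiplication counter (an illustrative sketch; the counter field mults and the class name are additions of mine, while the loop body follows the steps above):

```java
public class SmartPowerCount {
    static int mults;  // counts multiplications performed by the last call

    // Smart power algorithm with a multiplication counter.
    static int power(int b, int n) {
        int p = 1, q = b, m = n;
        mults = 0;
        while (m > 0) {
            if (m % 2 != 0) {
                p *= q;   // step 2.1
                mults++;
            }
            m /= 2;       // step 2.2
            q *= q;       // step 2.3
            mults++;
        }
        return p;
    }

    public static void main(String[] args) {
        System.out.println(power(2, 13));  // 8192
        System.out.println(mults);         // 7, within the bound 2 floor(log2 13) + 2 = 8
    }
}
```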


2-17

Example 2 (6)

• Implementation in Java:

static int power (int b, int n) {
// Return b^n (where n is non-negative).
  int p = 1, q = b, m = n;
  while (m > 0) {
    if (m%2 != 0) p *= q;
    m /= 2;
    q *= q;
  }
  return p;
}

2-18

Example 2 (7)

• Comparison: (graph of multiplications against n) the simple power algorithm's multiplication count grows linearly with n, while the smart power algorithm's grows only logarithmically.

2-19

Complexity

• For many interesting algorithms, the exact number of operations is too difficult to analyse mathematically.

• To simplify the analysis:
– identify the fastest-growing term
– neglect slower-growing terms
– neglect the constant factor in the fastest-growing term.

• The resulting formula is the algorithm’s time complexity. It focuses on the growth rate of the algorithm’s time requirement.

• Similarly for space complexity.
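For example, suppose an algorithm performs 3n^2 + 5n + 7 operations (a hypothetical count, chosen for illustration). Dropping the slower-growing terms 5n and 7 and the constant factor 3 leaves n^2, so the time complexity is O(n^2). A quick numerical check of why the dropped terms do not matter:

```java
public class Dominance {
    // Hypothetical operation count: 3n^2 + 5n + 7.
    static long ops(long n) {
        return 3 * n * n + 5 * n + 7;
    }

    public static void main(String[] args) {
        // As n grows, ops(n) / n^2 approaches the constant factor 3:
        // the n^2 term dominates, so the growth rate is that of n^2.
        for (long n = 10; n <= 100000; n *= 10)
            System.out.println(ops(n) / (double) (n * n));
    }
}
```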


2-20

Example 3: analysis of power algorithms (1)

• Analysis of simple power algorithm (counting multiplications):

No. of multiplications = n

Time taken is approximately proportional to n.

Time complexity is of order n. This is written O(n).


2-21

Example 3 (2)

• Analysis of smart power algorithm (counting multiplications):

Max. no. of multiplications = 2 floor(log2 n) + 2

Neglect the slower-growing term, +2: simplify to 2 floor(log2 n).
Neglect the constant factor, 2: simplify to floor(log2 n).
Neglect floor(), which on average subtracts only 0.5, a constant term: simplify to log2 n.

Time complexity is of order log n. This is written O(log n).

2-22

Example 3 (3)

• Comparison: (graph against n) n grows far faster than log n.

2-23

O-notation (1)

• We have seen that an O(log n) algorithm is inherently better than an O(n) algorithm for large values of n. O(log n) signifies a slower growth rate than O(n).

• Complexity O(X) means “of order X”, i.e., growing proportionally to X.

Here X signifies the growth rate, neglecting slower-growing terms and constant factors.


2-24

O-notation (2)

• Common time complexities:

O(1) constant time (feasible)
O(log n) logarithmic time (feasible)
O(n) linear time (feasible)
O(n log n) log-linear time (feasible)
O(n^2) quadratic time (sometimes feasible)
O(n^3) cubic time (sometimes feasible)
O(2^n) exponential time (rarely feasible)

2-25

Growth rates (1)

• Comparison of growth rates:

          n = 10     20           30           40
1             1      1            1            1
log n         3.3    4.3          4.9          5.3
n             10     20           30           40
n log n       33     86           147          213
n^2           100    400          900          1,600
n^3           1,000  8,000        27,000       64,000
2^n           1,024  1.0 million  1.1 billion  1.1 trillion

2-26

Growth rates (2)

• Graphically: (curves against n, from slowest to fastest growth) log n, n, n log n, n^2, 2^n.

2-27

Example 4: growth rates (1)

• Consider a problem that requires n data items to be processed.

• Consider several competing algorithms to solve this problem. Suppose that their time requirements on a particular processor are as follows:

Algorithm Log: 0.3 log2 n seconds
Algorithm Lin: 0.1 n seconds
Algorithm LogLin: 0.03 n log2 n seconds
Algorithm Quad: 0.01 n^2 seconds
Algorithm Cub: 0.001 n^3 seconds
Algorithm Exp: 0.0001 × 2^n seconds

2-28

Example 4 (2)

• Compare how many data items (n) each algorithm can process in 1, 2, …, 10 seconds: (bar chart for algorithms Log, Lin, LogLin, Quad, Cub, and Exp, with times 0:01–0:10)

2-29

Recursion

• A recursive algorithm is one expressed in terms of itself. In other words, at least one step of a recursive algorithm is a “call” to itself.

• In Java, a recursive method is one that calls itself.


2-30

When should recursion be used?

• Sometimes an algorithm can be expressed using either iteration or recursion. The recursive version tends to be:

+ more elegant and easier to understand

– less efficient (extra calls consume time and space).

• Sometimes an algorithm can be expressed only using recursion.


2-31

When does recursion work?

• Given a recursive algorithm, how can we be sure that it terminates?

• The algorithm must have:
– one or more “easy” cases
– one or more “hard” cases.

• In an “easy” case, the algorithm must give a direct answer without calling itself.

• In a “hard” case, the algorithm may call itself, but only to deal with an “easier” case.


2-32

Example 5: recursive power algorithms (1)

• Recursive definition of b^n:

b^n = 1 if n = 0
b^n = b × b^(n–1) if n > 0

• Simple recursive power algorithm:

To compute b^n:

1. If n = 0:
   1.1. Terminate with answer 1.
2. If n > 0:
   2.1. Terminate with answer b × b^(n–1).

Easy case (step 1): solved directly.

Hard case (step 2): solved by computing b^(n–1), which is easier since n–1 is closer than n to 0.

2-33

Example 5 (2)

• Implementation in Java:

static int power (int b, int n) {
// Return b^n (where n is non-negative).
  if (n == 0)
    return 1;
  else
    return b * power(b, n-1);
}

2-34

Example 5 (3)

• Idea: b^1000 = b^500 × b^500, and b^1001 = b × b^500 × b^500.

• Alternative recursive definition of b^n:

b^n = 1 if n = 0
b^n = b^(n/2) × b^(n/2) if n > 0 and n is even
b^n = b × b^(n/2) × b^(n/2) if n > 0 and n is odd

(Recall: n/2 discards the remainder if n is odd.)

2-35

Example 5 (4)

• Smart recursive power algorithm:

To compute b^n:

1. If n = 0:
   1.1. Terminate with answer 1.
2. If n > 0:
   2.1. Let p be b^(n/2).
   2.2. If n is even:
      2.2.1. Terminate with answer p × p.
   2.3. If n is odd:
      2.3.1. Terminate with answer b × p × p.

Easy case (step 1): solved directly.

Hard case (step 2): solved by computing b^(n/2), which is easier since n/2 is closer than n to 0.

2-36

Example 5 (5)

• Implementation in Java:

static int power (int b, int n) {
// Return b^n (where n is non-negative).
  if (n == 0)
    return 1;
  else {
    int p = power(b, n/2);
    if (n % 2 == 0) return p * p;
    else return b * p * p;
  }
}

2-37

Example 5 (6)

• Analysis (counting multiplications):

Each recursive power algorithm performs the same number of multiplications as the corresponding non-recursive algorithm. So their time complexities are the same:

                        non-recursive   recursive
Simple power algorithm  O(n)            O(n)
Smart power algorithm   O(log n)        O(log n)

2-38

Example 5 (7)

• Analysis (space):

The non-recursive power algorithms use constant space, i.e., O(1).

A recursive algorithm uses extra space for each recursive call. The simple recursive power algorithm calls itself n times before returning, whereas the smart recursive power algorithm calls itself floor(log2 n) times.

                        non-recursive   recursive
Simple power algorithm  O(1)            O(n)
Smart power algorithm   O(1)            O(log n)

2-39

Example 6: rendering integers (1)

• Rendering a value means computing a string representation of it (suitable for printing or display).

• Problem: Render a given integer i to the base r (where r is in the range 2…10). E.g.:

i     r    rendering
+29   2    “11101”
+29   8    “35”
–29   8    “–35”
+29   10   “29”

2-40

Example 6 (2)

• Recursive integer rendering algorithm:

To render the integer i to the base r:

1. If i < 0:
   1.1. Render the character ‘–’.
   1.2. Render (–i) to the base r.
2. If 0 ≤ i < r:
   2.1. Let d be the digit corresponding to i.
   2.2. Render the character d.
3. If i ≥ r:
   3.1. Let d be the digit corresponding to (i modulo r).
   3.2. Render (i/r) to the base r.
   3.3. Render the character d.
4. Terminate.

Easy case (step 2): solved directly.

Hard case (step 1): solved by rendering (–i), which is unsigned.

Hard case (step 3): solved by rendering (i/r), which has one fewer digit than i.

2-41

Example 6 (3)

• Possible implementation in Java:

static String render (int i, int r) {
// Render i to the base r, where r is in the range 2…10.
  String s = "";
  if (i < 0) {
    s += '-';
    s += render(-i, r);
  } else if (i < r) {
    char d = (char)('0' + i);
    s += d;
  } else {
    char d = (char)('0' + i%r);
    s += render(i/r, r);
    s += d;
  }
  return s;
}
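The sample renderings from slide 2-39 can be used as a check (a self-contained sketch; the method is reproduced inside a made-up class Render so the example compiles on its own):

```java
public class Render {
    // Render i to the base r, where r is in the range 2...10.
    static String render(int i, int r) {
        String s = "";
        if (i < 0) {
            s += '-';
            s += render(-i, r);
        } else if (i < r) {
            s += (char) ('0' + i);
        } else {
            char d = (char) ('0' + i % r);
            s += render(i / r, r);  // leading digits first
            s += d;                 // then the last digit
        }
        return s;
    }

    public static void main(String[] args) {
        System.out.println(render(29, 2));   // "11101"
        System.out.println(render(29, 8));   // "35"
        System.out.println(render(-29, 8));  // "-35"
        System.out.println(render(29, 10));  // "29"
    }
}
```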


2-42

Example 7: Towers of Hanoi (1)

• Three vertical poles (1, 2, 3) are mounted on a platform.

• A number of differently-sized disks are threaded on to pole 1, forming a tower with the largest disk at the bottom and the smallest disk at the top.

• We may move one disk at a time, from any pole to any other pole, but we must never place a larger disk on top of a smaller disk.

• Problem: Move the tower of disks from pole 1 to pole 2.


2-43

Example 7 (2)

• Animation (with 2 disks): the small disk moves from pole 1 to pole 3, then the large disk from pole 1 to pole 2, then the small disk from pole 3 to pole 2.

2-44

Example 7 (3)

• Towers of Hanoi algorithm:

To move a tower of n disks from pole source to pole dest:

1. If n = 1:
   1.1. Move a single disk from source to dest.
2. If n > 1:
   2.1. Let spare be the remaining pole, other than source and dest.
   2.2. Move a tower of (n–1) disks from source to spare.
   2.3. Move a single disk from source to dest.
   2.4. Move a tower of (n–1) disks from spare to dest.
3. Terminate.
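The slides give no Java for this algorithm, but it translates directly (an illustrative sketch in the style of the earlier examples; the class name, the moves counter, and the 6 - source - dest trick for finding the spare pole are mine):

```java
public class Hanoi {
    static int moves;  // counts single-disk moves

    // Move a tower of n disks from pole source to pole dest (poles are 1, 2, 3).
    static void moveTower(int n, int source, int dest) {
        if (n == 1) {
            moves++;  // easy case: move a single disk from source to dest
        } else {
            int spare = 6 - source - dest;   // the remaining pole, since 1 + 2 + 3 = 6
            moveTower(n - 1, source, spare); // step 2.2
            moves++;                         // step 2.3: move a single disk
            moveTower(n - 1, spare, dest);   // step 2.4
        }
    }

    public static void main(String[] args) {
        moves = 0;
        moveTower(3, 1, 2);
        System.out.println(moves);  // 7
    }
}
```

Counting moves this way also confirms the analysis on slide 2-46: a tower of 3 disks needs 7 = 2^3 – 1 moves.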

2-45

Example 7 (4)

• Animation (with 6 disks): each frame shows the poles source, dest, and spare, highlighting the step of the algorithm currently being performed.

2-46

Example 7 (5)

• Analysis (counting moves):

Let the total no. of moves required to move a tower of n disks be moves(n). Then:

moves(n) = 1 if n = 1
moves(n) = 1 + 2 × moves(n–1) if n > 1

Solution:

moves(n) = 2^n – 1

Time complexity is O(2^n).