Lecture 2: Computational Complexity (Time Complexity, Space Complexity)


  • Lecture 2: Computational Complexity, Time Complexity, Space Complexity

  • In computer science, the time complexity of an algorithm quantifies the amount of time taken by an algorithm to run as a function of the length of the string representing the input [1]:226.

  • The Importance of Analyzing the Running Time of an Algorithm
    An example to illustrate the importance of analyzing the running time of an algorithm.
    Problem: Given an array of n elements A[1..n], sort the entries in A in non-decreasing order.
    Assumption: Each element comparison takes 10^-6 seconds on some computing machine.

    Conclusion: Time is undoubtedly an extremely precious resource to be investigated in the analysis of algorithms.
    Question: How do we analyze running time?

    Table: number of element comparisons performed by Selection Sort and Merge Sort for n = 128 (= 2^7) and n = 1,048,576 (= 2^20).
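    A rough back-of-the-envelope version of this comparison can be computed directly. The sketch below assumes that selection sort performs about n(n-1)/2 element comparisons (as stated later in the lecture), that merge sort performs about n·log2(n) comparisons (a common estimate, not taken from the slides), and that each comparison takes 10^-6 seconds; the outputs are estimates, not measurements.

```python
import math

COMPARISON_TIME = 1e-6  # seconds per element comparison (assumption stated on the slide)

def estimated_seconds(num_comparisons):
    """Convert a comparison count into an estimated running time in seconds."""
    return num_comparisons * COMPARISON_TIME

for n in (2 ** 7, 2 ** 20):
    selection = n * (n - 1) // 2      # selection sort: about n(n-1)/2 comparisons
    merge = int(n * math.log2(n))     # merge sort: about n*log2(n) comparisons (estimate)
    print(f"n = {n}:")
    print(f"  selection sort: ~{estimated_seconds(selection):.3f} s")
    print(f"  merge sort:     ~{estimated_seconds(merge):.6f} s")
```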

  • Running Time
    How do we measure the efficiency of an algorithm from the point of view of time?
    The running time usually increases as the input size grows, so the running time of an algorithm is usually described as a function of the input size.
    What is the measure of time? How do we define the value of this function for input size n?
    What if the running time depends only on the input size? What if it depends not only on the input size but also on the particular input?
    Is the actual (exact) running time a good measure? The answer is no. Why?
    The actual time is determined not only by the algorithm but also by many other factors;
    the measure should be machine and technology independent;
    our estimates of time are relative as opposed to absolute.

  • Time complexity
    In computer science, the time complexity of an algorithm quantifies the amount of time taken by an algorithm to run as a function of the length of the string representing the input [1]:226.

    Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, where an elementary operation takes a fixed amount of time to perform.

  • Running Time (continued)

    Notice! Different models assume different elementary steps!

    Examples of commonly used elementary operations:
    Arithmetic operations: addition, subtraction, multiplication, and division.
    Comparisons and logical operations.
    Assignments, including assignments of pointers.
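    As a minimal illustration (not from the slides), the sketch below instruments a simple summation loop and counts these elementary operations explicitly; the function and the counters are assumptions chosen only to make the counting concrete.

```python
def sum_array(a):
    """Sum the elements of a while counting the elementary operations performed."""
    counts = {"assignments": 0, "comparisons": 0, "additions": 0}

    total = 0
    counts["assignments"] += 1            # total <- 0
    i = 0
    counts["assignments"] += 1            # i <- 0
    while True:
        counts["comparisons"] += 1        # loop test: i < len(a)?
        if not i < len(a):
            break
        total = total + a[i]              # one addition and one assignment
        counts["additions"] += 1
        counts["assignments"] += 1
        i = i + 1                         # one addition and one assignment
        counts["additions"] += 1
        counts["assignments"] += 1
    return total, counts

print(sum_array([3, 1, 4, 1, 5]))         # every count grows linearly with the input size n
```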

  • Example
    For some algorithms, selection sort for example, the running time depends only on the input size.
    For any input of size n, the running time of the selection sort algorithm is always cn(n-1)/2 + bn + a for some constants a, b, c.
    Our main concern is not small input instances but the behavior of the algorithms under investigation on large input instances, especially as the size of the input goes to the limit. In this case, we are interested in the order of growth of the running time function.
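    A minimal sketch (not the lecture's pseudocode) that counts element comparisons in selection sort; for any input of size n the counter equals n(n-1)/2, illustrating why the running time depends only on the input size.

```python
def selection_sort_count(a):
    """Sort a in place and return the number of element comparisons made."""
    comparisons = 0
    n = len(a)
    for i in range(n - 1):
        min_index = i
        for j in range(i + 1, n):
            comparisons += 1            # one element comparison per inner step
            if a[j] < a[min_index]:
                min_index = j
        a[i], a[min_index] = a[min_index], a[i]
    return comparisons

for data in ([1, 2, 3, 4, 5], [5, 4, 3, 2, 1], [2, 5, 1, 4, 3]):
    n = len(data)
    print(selection_sort_count(list(data)), n * (n - 1) // 2)  # the two values are always equal
```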

  • Example
    But for many algorithms, the running time also depends on the structure of the input!

  • Example (cont.)
    Then it is tedious! And tj depends on the particular input!
    E.g., if A is initially in sorted order, then for each j = 2, ..., n, the while-loop test in line 5 is executed only once; thus tj = 1 for j = 2, ..., n. Summing the costs, the running time is a linear function of n, and the order of growth of the function is n.

  • Example (cont.)
    If the array is initially in decreasing order, then for each j = 2, ..., n, the while-loop test is executed j times; thus tj = j for j = 2, ..., n.

    Since the sum of tj over j = 2, ..., n equals n(n+1)/2 - 1, the resulting running time is a quadratic function, the order of growth of which is n^2.
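    A minimal sketch (not the lecture's line-numbered pseudocode) that counts the while-loop tests tj in insertion sort; on an already-sorted array the total is n - 1, while on a reverse-sorted array it is n(n+1)/2 - 1.

```python
def insertion_sort_tests(a):
    """Sort a in place and return the number of while-loop tests performed."""
    tests = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while True:
            tests += 1                  # one while-loop test (this is what t_j counts)
            if i >= 0 and a[i] > key:
                a[i + 1] = a[i]
                i -= 1
            else:
                break
        a[i + 1] = key
    return tests

n = 1000
print(insertion_sort_tests(list(range(n))))          # sorted input: n - 1 tests (linear)
print(insertion_sort_tests(list(range(n, 0, -1))))   # reversed input: n(n+1)/2 - 1 tests (quadratic)
```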

  • Asymptotic notions
    How do we define the running time as a function of the input size when the running time also depends on the particular input?

    For simplicity, we need asymptotic notions!
    Our main concern is not small input instances but the behavior of the algorithms under investigation on large input instances, especially as the size of the input goes to the limit. In this case, we are interested in the order of growth of the running time function.

  • Asymptotic Notations
    E.g., the running time of INSERTION-SORT falls between a1·n + b1 and a2·n^2 + b2·n + c2, and neither of these alone can represent the whole picture. By using asymptotic notations, the running time of INSERTION-SORT can be described more simply. In this course, we use three notations:
    O(.) : Big-Oh (the most used)
    Ω(.) : Big-Omega
    Θ(.) : Big-Theta

  • O-notation
    Informal definition of O(.): If an algorithm's running time t(n) is bounded above by a function g(n), to within a constant multiple c, for n ≥ n0, we say that the running time of the algorithm is O(g(n)), and the algorithm is an O(g(n))-algorithm.
    Obviously, O-notation is used to bound the worst-case running time of an algorithm.

  • O-notation
    Formal definition of O(.): A function t(n) is said to be in O(g(n)) if there exist some c > 0 and n0 > 0 such that 0 ≤ t(n) ≤ c·g(n) for all n ≥ n0. Sometimes this is also denoted as t(n) = O(g(n)).
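    As a small worked example (the particular function is an illustration, not taken from the slides), the definition can be verified directly:

```latex
% Claim: t(n) = 3n^2 + 10n is in O(n^2).
% Take c = 4 and n_0 = 10. For all n \ge 10 we have 10n \le n^2, hence
\[
  0 \le 3n^2 + 10n \le 3n^2 + n^2 = 4n^2 = c\,g(n)
  \quad \text{for all } n \ge n_0 = 10,
\]
% so t(n) = O(n^2) with the constants c = 4 and n_0 = 10.
```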

  • Ω-notation
    Informal definition of Ω(.): If an algorithm's running time t(n) is bounded below by a function g(n), to within a constant multiple c, for n ≥ n0, we say that the running time of the algorithm is Ω(g(n)), and the algorithm is called an Ω(g(n))-algorithm.

    (Figure: plot of t(n) and c·g(n); the region n < n0 is marked "don't care".)

  • Ω-notation
    Formal definition of Ω(.): A function t(n) is said to be in Ω(g(n)) if there exist some c > 0 and n0 > 0 such that t(n) ≥ c·g(n) ≥ 0 for all n ≥ n0. Sometimes this is also denoted as t(n) = Ω(g(n)).
    Obviously, Ω-notation is used to bound the best-case running time of an algorithm.

    (Figure: plot of t(n) and c·g(n); the region n < n0 is marked "don't care".)

  • Θ-notation
    Informal definition of Θ(.): If an algorithm's running time t(n) is bounded above and below by a function g(n), to within constant multiples c1 and c2, for n ≥ n0, we say that the running time of the algorithm is Θ(g(n)), and the algorithm is called a Θ(g(n))-algorithm, or of order Θ(g(n)).

    (Figure: plot of t(n) and c2·g(n); the region n < n0 is marked "don't care".)

  • Θ-notation
    Formal definition of Θ(.): A function t(n) is said to be in Θ(g(n)) if there exist c1 > 0, c2 > 0, and n0 > 0 such that 0 ≤ c2·g(n) ≤ t(n) ≤ c1·g(n) for all n ≥ n0. Sometimes this is also denoted as t(n) = Θ(g(n)).

    g(n) is an asymptotically tight bound for t(n)

    (Figure: plot of t(n) and c2·g(n); the region n < n0 is marked "don't care".)
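    Continuing the illustrative function from the O-notation example above (again an assumption, not taken from the slides), the same function is also in Θ(n^2):

```latex
% Claim: t(n) = 3n^2 + 10n is in \Theta(n^2).
% Take c_2 = 3, c_1 = 4, and n_0 = 10. For all n \ge 10,
\[
  0 \le 3n^2 \le 3n^2 + 10n \le 4n^2,
  \qquad\text{i.e.}\qquad
  c_2\,g(n) \le t(n) \le c_1\,g(n) \ \text{ with } g(n) = n^2,
\]
% so both the upper bound (O) and the lower bound (\Omega) hold with the same n_0,
% and therefore t(n) = \Theta(n^2).
```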

  • Example
    The running time of INSERTION-SORT is O(n^2) and Ω(n), which means that the running time on every input of size n, for n ≥ n0, is upper-bounded by a constant times n^2 and lower-bounded by a constant times n when n is sufficiently large.

    Notice that for some algorithms, INSERTION-SORT for example, there does not exist a function g(n) such that the running time of the algorithm is Θ(g(n)).

  • Illustration of some typical asymptotic running time functions

    We can see: the linear algorithm is obviously faster than the quadratic one and slower than the logarithmic one, etc.
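    Since the original plot is not reproduced here, the sketch below simply tabulates a few typical growth functions at a handful of illustrative input sizes:

```python
import math

# Typical running-time functions, listed from slowest-growing to fastest-growing.
functions = [
    ("log2 n",   lambda n: math.log2(n)),
    ("n",        lambda n: n),
    ("n log2 n", lambda n: n * math.log2(n)),
    ("n^2",      lambda n: n ** 2),
    ("2^n",      lambda n: 2 ** n),
]

for n in (10, 20, 40, 80):
    row = "  ".join(f"{name}={value(n):,.0f}" for name, value in functions)
    print(f"n={n}: {row}")
```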

  • Three Types of Analysis
    For a given input size, the running time may vary across different input instances; e.g., the running time of INSERTION-SORT falls between a linear and a quadratic function. Can we give an exact order?
    To give an overall grading of the performance of an algorithm, we consider:

    Best-Case Analysis: Too optimistic

    Average-Case Analysis: Too difficult, e.g., the difficulty of defining the average case and the mathematical difficulties involved. And most of the time the average-case running time is of the same order as the worst-case running time.

    Worst-Case Analysis: Very useful and practical. We will adopt this approach.

  • Example of Three Types of Analysis

  • Example of analysis (average-case)
    You should first figure out the distribution over the instances of size n, and then compute an expected running time.
    Usually, we assume all possibilities are equally likely. Thus, in this example, in the while loop, on average half of the elements in A[1..j-1] are less than A[j] and half are greater than A[j]. Then, on average, tj = j/2, and we can see that the average-case time complexity is also quadratic.
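    A short derivation of the quadratic order from tj = j/2 (a sketch of the reasoning the slide relies on):

```latex
% With t_j = j/2 on average, the dominant term of the running time is proportional to
\[
  \sum_{j=2}^{n} \frac{j}{2}
  = \frac{1}{2}\left(\frac{n(n+1)}{2} - 1\right)
  = \frac{n^2 + n - 2}{4}
  = \Theta(n^2),
\]
% so the average-case running time grows quadratically, like the worst case;
% only the constant factor differs.
```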

  • Time complexity
    The worst-case time complexity of an algorithm is the longest running time (or maximum number of elementary operations) taken on any input of size n. E.g., the worst-case time complexity of INSERTION-SORT (or the running time of INSERTION-SORT in the worst case) is Θ(n^2), which is an asymptotically tight bound. The running time (time complexity) of INSERTION-SORT is O(n^2) and Ω(n). But the running time (time complexity) of INSERTION-SORT is NOT Ω(n^2), NOT Θ(n^2), and NOT O(n). We cannot give a tight bound on the running time over all inputs. We usually consider one algorithm to be more efficient than another if its worst-case running time has a lower order of growth.

    Since an algorithm's performance time may vary with different inputs of the same size, one commonly uses the worst-case time complexity of an algorithm, denoted as T(n), which is defined as the maximum amount of time taken on any input of size n.
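    Written as a formula (a sketch consistent with the definition above, where t(x) denotes the running time on a particular input x):

```latex
\[
  T(n) \;=\; \max\{\, t(x) \;:\; x \text{ is an input of size } n \,\}.
\]
```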

  • Worst-case time complexity
    It is NOT true that every algorithm has a tight bound on its worst-case time complexity.
    E.g., suppose A is a sorted array of numbers, x is a number, and n is the length of A. The following algorithm decides whether x is in A:
    1. if n is odd then k ← BINARYSEARCH(A, x)
    2. else k ← LINEARSEARCH(A, x)
    Obviously, the running time of this algorithm is O(n) for each input, and thus also in the worst case. And for each constant n0 there are infinitely many inputs whose size is larger than n0 and whose cost is lower-bounded by a constant times n. But the running time in the worst case is NOT Ω(n), and hence NOT Θ(n), since for each n0 there exists some n ≥ n0 (any sufficiently large odd n, where binary search is used) such that no input of that size costs as much as a constant times n.
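    A minimal Python sketch of this algorithm (the helper names mirror the pseudocode; the implementations and the comments on comparison counts are illustrative assumptions):

```python
from bisect import bisect_left

def binary_search(a, x):
    """Return True if x is in the sorted list a, using binary search."""
    i = bisect_left(a, x)
    return i < len(a) and a[i] == x

def linear_search(a, x):
    """Return True if x is in the list a, scanning left to right."""
    return any(item == x for item in a)

def search(a, x):
    """Decide whether x is in the sorted array a: binary search if n is odd, linear search otherwise."""
    if len(a) % 2 == 1:
        return binary_search(a, x)   # odd n: worst case about log2(n) comparisons
    else:
        return linear_search(a, x)   # even n: worst case about n comparisons

# The worst case over inputs of size n is therefore roughly log2(n) for odd n and
# n for even n, so T(n) is O(n) but neither Omega(n) nor Theta(n).
print(search([1, 3, 5, 7, 9], 7), search([1, 3, 5, 7], 8))
```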

  • Input size
    The input size is the length of the string representing the input.
    Under the Turing machine model, this is easy to define: it is the number of non-blank cells the input occupies on the input tape.
    In the real world, this is impossible. The input size is not a precise measure of the input, and its interpretation is
