Algorithm Analysis

ECE 250 Algorithms and Data Structures
Douglas Wilhelm Harder, M.Math. LEL
Department of Electrical and Computer Engineering
University of Waterloo, Waterloo, Ontario, Canada
ece.uwaterloo.ca
dwharder@alumni.uwaterloo.ca
© 2006-2013 by Douglas Wilhelm Harder.  Some rights reserved.

Outline

In this topic, we will examine code to determine the run time of various
operations.  We will introduce machine instructions, and we will calculate
the run times of:
 – operators                +  -  =  +=  ++  etc.
 – control statements       if, for, while, do-while, switch
 – functions
 – recursive functions

Motivation

The goal of algorithm analysis is to take a block of code and determine the
asymptotic run time or asymptotic memory requirements based on various
parameters.
 – Given an array of size n:
   • selection sort requires Θ(n^2) time
   • merge sort, quick sort, and heap sort all require Θ(n ln(n)) time
 – However:
   • merge sort requires Θ(n) additional memory
   • quick sort requires Θ(ln(n)) additional memory
   • heap sort requires Θ(1) additional memory

The asymptotic behaviour of algorithms indicates the ability to scale.
Suppose we are sorting an array of size n; selection sort has a run time
of Θ(n^2):
 – 2n entries require (2n)^2 = 4n^2
   • four times as long to sort
 – 10n entries require (10n)^2 = 100n^2
   • one hundred times as long to sort

The other sorting algorithms have Θ(n ln(n)) run times:
 – 2n entries require (2n) ln(2n) = 2(n ln(n)) + 2n ln(2)
 – 10n entries require (10n) ln(10n) = 10(n ln(n)) + 10n ln(10)
In each case, the additional term is only Θ(n).  However:
 – merge sort will require two and ten times as much additional memory
 – quick sort will require one or four additional memory locations
 – heap sort will not require any additional memory

We will see algorithms which run in Θ(n^lg(3)) time, where lg(3) ≈ 1.585:
 – 2n entries require (2n)^lg(3) = 2^lg(3) n^lg(3) = 3 n^lg(3)
   • three times as long to sort
 – 10n entries require (10n)^lg(3) = 10^lg(3) n^lg(3) ≈ 38.5 n^lg(3)
   • about 38 times as long to sort

Binary search runs in Θ(ln(n)) time:
 – doubling the size of the list requires only one additional comparison

If we are storing objects which are not related, the hash table has, in
many cases, optimal characteristics:
 – many operations are Θ(1)
 – that is, the run times are independent of the number of objects being
   stored
If we are required to store both objects and relations, both memory and
time will increase.  Our goal will be to minimize this increase.

To properly investigate the determination of run times asymptotically, we
will begin with machine instructions, then discuss operators, control
statements (conditional statements and loops), functions, and recursive
functions.

Machine Instructions

Given any processor, it is capable of performing only a limited number of
operations.  These operations are called instructions, and the collection
of instructions is called the instruction set.
 – The exact set of instructions differs between processors:
   MIPS, ARM, x86, 6800, 68k
 – You will see another, the ColdFire, in ECE 222
   • it is derived from the 68000, which was in turn derived from the 6800

Any instruction runs in a fixed amount of time (an integral number of CPU
cycles).  An example on the ColdFire is

    0x0687 0000 000F

which adds 15 to the 7th data register.  As humans are not good at hex,
this can be programmed in assembly language as

    ADDI.L  #$F, D7

 – More in ECE 222

Assembly language has an almost one-to-one translation to machine
instructions.
 – Assembly language is a low-level programming language

Other programming languages are higher-level: Fortran, Pascal, Matlab,
Java, C++, and C.  The adjective "high" refers to the level of
abstraction:
 – Java, C++, and C have abstractions such as OO
 – Matlab and Fortran have operations which do not map to a relatively
   small number of machine instructions; e.g., evaluating 1.27**2.9 in
   Fortran yields 2.0000036616123606774

The C programming language (C++ without objects and other abstractions)
can be referred to as a mid-level programming language.
 – There is abstraction, but the language is closely tied to the standard
   capabilities of the processor
 – There is a closer relationship between operators and machine
   instructions

Consider the operation

    a += b;

Assume that the compiler already has the value of the variable a in
register D1, and that b is a variable stored at the location held in
address register A1.  This is then converted into the single instruction

    ADD  (A1), D1

Operators

Because each machine instruction can be executed in a fixed number of
cycles, we may assume each operation requires a fixed number of cycles.
The time required for any operator is Θ(1), including:
 • retrieving/storing variables from memory
 • variable assignment                   =
 • integer operations                    +  -  *  /  %  ++  --
 • logical operations                    &&  ||  !
 • bitwise operations                    &  |  ^  ~
 • relational operations                 ==  !=  <  <=  >  >=
 • memory allocation and deallocation    new  delete

Of these, memory allocation and deallocation are the slowest by a
significant factor:
 – a quick test on eceunix shows a factor of over 100
 – they require communication with the operating system
 – this does not account for the time required to call the constructor
   and destructor
Note that after memory is allocated, the constructor is run, and the
constructor may not run in Θ(1) time.
Blocks of Operations

Each operation runs in Θ(1) time, and therefore any fixed number of
operations also runs in Θ(1) time.  For example:

    // Swap variables a and b
    int tmp = a;
    a = b;
    b = tmp;

    // Update a sequence of values
    // ece.uwaterloo.ca/ece250/Algorithms/Skip_lists/src/Skip_list.h
    ++index;
    prev_modulus = modulus;
    modulus = next_modulus;
    next_modulus = modulus_table[index];

Seldom will you find large blocks of operations without any additional
control statements.  This example rearranges an AVL tree structure:

    Tree_node *lrl = left->right->left;
    Tree_node *lrr = left->right->right;

    parent = left->right;
    parent->left = left;
    parent->right = this;
    left->right = lrl;
    left = lrr;

Run time:  Θ(1)

Blocks in Sequence

Suppose you have now analyzed a number of blocks of code run in sequence:

    template <typename T>
    void update_capacity( int delta ) {
        T *array_old = array;                       // Θ(1)
        int capacity_old = array_capacity;
        array_capacity += delta;
        array = new T[array_capacity];

        for ( int i = 0; i < capacity_old; ++i ) {  // Θ(n)
            array[i] = array_old[i];
        }

        delete[] array_old;                         // Θ(1) or Ω(n)
    }

To calculate the total run time, add the entries:  Θ(1 + n + 1) = Θ(n).

This is taken from code found at
http://ece.uwaterloo.ca/ece250/Algorithms/Sparse_systems/

    template <int M, int N>
    Matrix<M, N> &Matrix<M, N>::operator=( Matrix<M, N> const &A ) {
        if ( &A == this ) {                        // Θ(1)
            return *this;
        }

        if ( capacity != A.capacity ) {            // Θ(1)
            delete[] column_index;
            delete[] off_diagonal;
            capacity = A.capacity;
            column_index = new int[capacity];
            off_diagonal = new double[capacity];
        }

        for ( int i = 0; i < min_MN; ++i ) {       // Θ(min(M, N))
            diagonal[i] = A.diagonal[i];
        }

        for ( int i = 0; i <= M; ++i ) {           // Θ(M)
            row_index[i] = A.row_index[i];
        }

        for ( int i = 0; i < A.size(); ++i ) {     // Θ(n)
            column_index[i] = A.column_index[i];
            off_diagonal[i] = A.off_diagonal[i];
        }

        return *this;                              // Θ(1)
    }

The total is Θ(1 + 1 + min(M, N) + M + n + 1) = Θ(M + n).  Note that
min(M, N) ≤ M, but we cannot say anything about the relative sizes of
M and n.

Other examples include:
 – run three blocks of code which are Θ(1), Θ(n^2), and Θ(n):
   total run time Θ(1 + n^2 + n) = Θ(n^2)
 – run two blocks of code which are Θ(n ln(n)) and Θ(n^1.5):
   total run time Θ(n ln(n) + n^1.5) = Θ(n^1.5)
Recall the linear ordering of growth rates from the previous topic:
when considering a sum, take the dominant term.

Control Statements

Next we will look at control statements: statements which potentially
alter the execution of instructions.
 – Conditional statements         if, switch
 – Condition-controlled loops     for, while, do-while
 – Count-controlled loops         for i from 1 to 10 do ... end do;   # Maple
 – Collection-controlled loops    foreach ( int i in array ) { ... }  // C#

Given any collection of nested control statements, it is always necessary
to work inside out: determine the run times of the innermost statements
and work your way out.

Given

    if ( condition ) {
        // true body
    } else {
        // false body
    }

the run time of a conditional statement is:
 – the run time of the condition (the test), plus
 – the run time of the body which is run
In most cases, the run time of the condition is Θ(1).

In some cases, it is easy to determine which statement must be run:

    int factorial( int n ) {
        if ( n == 0 ) {
            return 1;
        } else {
            return n * factorial( n - 1 );
        }
    }

In others, it is less obvious.  Find the maximum entry in an array:

    int find_max( int *array, int n ) {
        int max = array[0];

        for ( int i = 1; i < n; ++i ) {
            if ( array[i] > max ) {
                max = array[i];
            }
        }

        return max;
    }

In this case, we don't know how often the assignment to max will be
executed.  If we had information about the distribution of the entries of
the array, we may be able to determine it:
 – if the list is sorted (ascending), it will always be run
 – if the list is sorted (descending), it will be run once (the initial
   assignment)
 – if the list is uniformly randomly distributed, then we must work out
   the probability
Condition-controlled Loops

The C++ for loop is a condition-controlled statement:

    for ( int i = 0; i < N; ++i ) {
        // ...
    }

is identical to

    int i = 0;          // initialization

    while ( i < N ) {   // condition
        // ...
        ++i;            // increment
    }

The initialization, condition, and increment are usually single statements
running in Θ(1).  Assuming there are no break or return statements in the
loop, the run time of

    for ( int i = 0; i < n; ++i ) {
        // ...
    }

is Ω(n).

If the body does not depend on the variable (in this example, i), then
the run time of

    for ( int i = 0; i < n; ++i ) {
        // code which is Theta(f(n))
    }

is Θ(n f(n)).  If the body is O(f(n)), then the run time of the loop is
O(n f(n)).

For example,

    int sum = 0;

    for ( int i = 0; i < n; ++i ) {
        sum += 1;       // Theta(1)
    }

has run time Θ(n · 1) = Θ(n).

Another example:

    int sum = 0;

    for ( int i = 0; i < n; ++i ) {
        for ( int j = 0; j < n; ++j ) {
            sum += 1;   // Theta(1)
        }
    }

The previous example showed that the inner loop is Θ(n), thus the outer
loop is Θ(n · n) = Θ(n^2).

Suppose that with each pass through a loop we run a linear search on an
array of size m:

    for ( int i = 0; i < n; ++i ) {
        // search through an array of size m:  O(m)
    }

The inner search is O(m) and thus the loop is O(n m).

Conditional Statements

Consider this example:

    void Disjoint_sets::clear() {
        if ( sets == n ) {              // Θ(1)
            return;
        }

        max_height = 0;                 // Θ(1)
        num_disjoint_sets = n;

        for ( int i = 0; i < n; ++i ) { // Θ(n)
            parent[i] = i;
            tree_height[i] = 0;
        }
    }

    T_clear(n) = Θ(1)   if sets == n
                 Θ(n)   otherwise
   i0 O(f(i, n)), the result is n1   O 1 1 f(i, n)    i0Asymptotic Analysis 36 Analysis of Repetition Statements For example, int sum = 0; for ( int i = 0; i n; ++i ) for ( int j = 0; j i; ++j ) sum += i + j; The inner loop is O(1 + i(1 + 1) ) = Q(i) hence the outer is n1 n1  n(n1)  2  Θ 1 1 iΘ 1 n iΘ 1 nΘn    2   i0 i0Asymptotic Analysis 37 Analysis of Repetition Statements As another example: int sum = 0; for ( int i = 0; i n; ++i ) for ( int j = 0; j i; ++j ) for ( int k = 0; k j; ++k ) sum += i + j + k; From inside to out: Q(1) Q(j) 2 Q(i ) 3 Q(n )Asymptotic Analysis 38 Control Statements Switch statements appear to be nested if statements: switch( i ) case 1: / do stuff / break; case 2: / do other stuff / break; case 3: / do even more stuff / break; case 4: / well, do stuff / break; case 5: / tired yet / break; default: / do default stuff / Asymptotic Analysis 39 Control Statements Thus, a switch statement would appear to run in O(n) time where n is the number of cases, the same as nested if statements – Why then not use: if ( i == 1 ) / do stuff / else if ( i == 2 ) / do other stuff / else if ( i == 3 ) / do even more stuff / else if ( i == 4 ) / well, do stuff / else if ( i == 5 ) / tired yet / else / do default stuff / Asymptotic Analysis 40 Control Statements Question: Why would you introduce something into programming language which is redundant There are reasons for this: – your name is Larry Wall and you are creating the Perl (not PERL) programming language – you are introducing software engineering constructs, for example, classesAsymptotic Analysis 41 Control Statements However, switch statements were included in the original C language... 
First, you may recall that the cases must be actual constant values,
either:
 – integers
 – characters
For example, you cannot have a case with a variable:

    case n:  /* do something */  break;   // bad!

The compiler looks at the different cases and calculates an appropriate
jump.  For example, assume:
 – the cases are 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10
 – each case requires a maximum of 24 bytes (for example, six
   instructions)
Then the compiler simply computes a jump based on the variable, jumping
ahead either 0, 24, 48, 72, ..., or 240 bytes.

Serial Statements

Suppose we run one block of code followed by another block of code; such
code is said to be run serially.  If the first block of code is O(f(n))
and the second is O(g(n)), then the run time of the two blocks of code is

    O( f(n) + g(n) )

which usually (for algorithms not including function calls) simplifies to
one or the other.

Consider the following two problems:
 – search through a random list of size n to find the maximum entry
 – search through a random list of size n to find if it contains a
   particular entry
What is the proper means of describing the run time of these two
algorithms?

Searching for the maximum entry requires that each element in the array
be examined, thus, it must run in Θ(n) time.  Searching for a particular
entry may end earlier: for example, the first entry we examine may be the
one we are looking for, thus, it runs in O(n) time.

Therefore:
 – if the dominant term is big-Θ, then the result must be big-Θ
 – otherwise, if the dominant term is big-O, we can only say the result
   is big-O
For example:

    O(n) + O(n^2) + O(n^4) = O(n + n^2 + n^4) = O(n^4)
    O(n) + Θ(n^2) = Θ(n^2)
    O(n^2) + Θ(n) = O(n^2)
    O(n^2) + Θ(n^2) = Θ(n^2)
Functions

A function (or subroutine) is code which has been separated out, either
to:
 – avoid repeated operations
   • e.g., mathematical functions
 – group related tasks
   • e.g., initialization

Because a subroutine (function) can be called from anywhere, we must:
 – prepare the appropriate environment
 – deal with arguments (parameters)
 – jump to the subroutine
 – execute the subroutine
 – deal with the return value
 – clean up

Fortunately, this is such a common task that all modern processors have
instructions that perform most of these steps in one instruction.  Thus,
we will assume that the overhead required to make a function call and to
return is Θ(1).
 – We will discuss this later (stacks/ECE 222)

Because any function requires the overhead of a function call and return,
we will always assume that

    T_f = Ω(1)

That is, it is impossible for any function call to have a zero run time.

Thus, given a function f(n) (the run time of which depends on n), we will
denote the run time of f by T_f(n), which we may shorten to T(n).  The
time required to both call and return from the function is included in
this run time.

Consider this function:

    void Disjoint_sets::set_union( int m, int n ) {
        m = find( m );                              // 2 T_find
        n = find( n );

        if ( m == n ) {
            return;
        }

        --num_disjoint_sets;                        // Θ(1)

        if ( tree_height[m] >= tree_height[n] ) {
            parent[n] = m;

            if ( tree_height[m] == tree_height[n] ) {
                ++( tree_height[m] );
                max_height = std::max( max_height, tree_height[m] );
            }
        } else {
            parent[m] = n;
        }
    }

    T_set_union = 2 T_find + Θ(1)

Recursive Functions

A function is relatively simple (and boring) if it simply performs
operations and calls other functions.  Most interesting functions
designed to solve problems usually end up calling themselves; such a
function is said to be recursive.
As an example, we could implement the factorial function recursively:

    int factorial( int n ) {
        if ( n <= 1 ) {
            return 1;                       // Θ(1)
        } else {
            return n * factorial( n - 1 );  // T(n - 1) + Θ(1)
        }
    }

Thus, we may analyze the run time of this function as follows:

    T(n) = Θ(1)              n ≤ 1
    T(n) = T(n - 1) + Θ(1)   n > 1

We don't have to worry about the time of the conditional (Θ(1)), nor is
there a probability involved with the conditional statement.

The analysis of the run time of this function yields a recurrence
relation:

    T(n) = T(n - 1) + Θ(1)      T(1) = Θ(1)

This recurrence relation contains Landau symbols; replace each Landau
symbol with a representative function:

    T(n) = T(n - 1) + 1         T(1) = 1

Thus, to find the run time of the factorial function, we need to solve
this recurrence relation.  The fastest way to solve it is with Maple:

    > rsolve( {T(n) = T(n - 1) + 1, T(1) = 1}, T(n) );
                                   n

Thus, T(n) = Θ(n).

Unfortunately, you don't have Maple on the examination, thus, we can
examine the first few steps:

    T(n) = T(n - 1) + 1
         = T(n - 2) + 1 + 1 = T(n - 2) + 2
         = T(n - 3) + 3

From this, we see a pattern:

    T(n) = T(n - k) + k

If k = n - 1, then

    T(n) = T(n - (n - 1)) + n - 1
         = T(1) + n - 1
         = 1 + n - 1 = n

Thus, T(n) = Θ(n).

Incidentally, we may write the factorial function using the ternary ?:
operator:
 – ternary operators take three arguments; C++ has only one

    int factorial( int n ) {
        return (n <= 1) ? 1 : n * factorial( n - 1 );
    }
Suppose we want to sort an array of n items.  We could:
 – go through the list and find the largest item
 – swap the last entry in the list with that largest item
 – then go on and sort the rest of the array
This is called selection sort:

    void sort( int *array, int n ) {
        if ( n <= 1 ) {
            return;              // special case: 0 or 1 items are always sorted
        }

        int posn = 0;            // assume the first entry is the largest
        int max = array[posn];

        for ( int i = 1; i < n; ++i ) {  // search through the remaining entries
            if ( array[i] > max ) {      // if a larger one is found,
                posn = i;                // update both the position and value
                max = array[posn];
            }
        }

        int tmp = array[n - 1];  // swap the largest entry with the last
        array[n - 1] = array[posn];
        array[posn] = tmp;

        sort( array, n - 1 );    // sort everything else
    }

We could call this function as follows:

    int array[7] = {5, 8, 3, 6, 2, 4, 7};
    sort( array, 7 );   // sort an array of seven items

The first call finds the largest element, the next call finds the 2nd
largest, the third finds the 3rd largest, and so on through the 4th, 5th,
and 6th, after which the array is sorted.

Analyzing the function: each call performs Θ(n) work and then recurses on
the remaining n - 1 entries.  Thus, replacing each Landau symbol with a
representative, we are required to solve the recurrence relation

    T(n) = T(n - 1) + n      T(1) = 1

The easy way to solve this is with Maple:

    > rsolve( {T(n) = T(n - 1) + n, T(1) = 1}, T(n) );
                                n (n + 1)
                                ---------
                                    2
    > expand( % );
                               1     1  2
                               - n + - n
                               2     2

Consequently, the sorting routine has run time T(n) = Θ(n^2).  To see
this by hand, consider the following:

    T(n) = T(n - 1) + n
         = T(n - 2) + (n - 1) + n
         = T(n - 3) + (n - 2) + (n - 1) + n
         ...
         = T(1) + 2 + 3 + ... + n
         = Σ_{i=1}^{n} i = n(n + 1)/2
Consider, instead, a binary search of a sorted list:
 – check the middle entry
 – if we do not find it, check either the left- or right-hand side, as
   appropriate

Thus, T(n) = T( (n - 1)/2 ) + Θ(1), and if n = 1, then T(1) = Θ(1).
Replacing the Landau symbols, we have to solve:

    T(n) = 1                     n = 1
    T(n) = T( (n - 1)/2 ) + 1    n > 1

Solving this can be difficult in general, so we will consider only
special values of n.  Assume n = 2^k - 1 where k is an integer; then

    (n - 1)/2 = (2^k - 1 - 1)/2 = 2^(k-1) - 1

For example, searching a list of size 31 requires us to check the center
entry.  If the item is not found there, we must check one of the two
halves, each of which is of size 15:

    31 = 2^5 - 1
    15 = 2^4 - 1

Thus, we can write:

    T(n) = T(2^k - 1)
         = T(2^(k-1) - 1) + 1
         = T(2^(k-2) - 1) + 2
         = T(2^(k-3) - 1) + 3

Continuing the pattern, after k - 1 steps:

    T(n) = T(2^(k-(k-1)) - 1) + (k - 1)
         = T(1) + k - 1
         = k

because T(1) = 1.

Thus, T(n) = k, but n = 2^k - 1, and therefore k = lg(n + 1).  Recall
that f(n) = Θ(g(n)) if

    lim_{n→∞} f(n)/g(n) = c    where 0 < c < ∞

Here,

    lim_{n→∞} lg(n + 1)/ln(n) = lim_{n→∞} ln(n + 1)/(ln(2) ln(n))
                              = 1/ln(2)

Thus, T(n) = Θ(lg(n + 1)) = Θ(ln(n)).

Cases

As well as determining the run time of an algorithm, because the data may
not be deterministic, we may be interested in:
 – best-case run time
 – average-case run time
 – worst-case run time
In many cases, these will be significantly different.

Searching a list linearly is simple enough.  We will count the number of
comparisons:
 – best case:
   • the first element is the one we're looking for:  O(1)
 – worst case:
   • the last element is the one we're looking for, or it is not in the
     list:  O(n)
 – average case:
   • we need some information about the list...
Assume the entry we are looking for is in the list, with each of the n
locations equally likely.  If the entry is in the i-th location, we
perform i comparisons, so we sum

    Σ_{i=1}^{n} i · (1/n) = (1/n) · n(n + 1)/2 = (n + 1)/2

which is O(n).

Suppose we have a different distribution:
 – there is a 50% chance that the element is the first
 – for each subsequent element, the probability is reduced by half
We could write:

    Σ_{i=1}^{n} i/2^i < Σ_{i=1}^{∞} i/2^i

You're not expected to know this for the final; however, for interest:

    > sum( i/2^i, i = 1..infinity );
                                   2

Thus, the average case is O(1).

Previously, we had an example where we were looking for the number of
times a particular assignment statement was executed:

    int find_max( int *array, int n ) {
        int max = array[0];

        for ( int i = 1; i < n; ++i ) {
            if ( array[i] > max ) {
                max = array[i];
            }
        }

        return max;
    }

This example is taken from Preiss:
 – the best case was one assignment (the first element is the largest)
 – the worst case was n assignments
For the average case, we must consider: what is the probability that the
i-th object is the largest of the first i objects?

To consider this question, we must assume that the elements in the array
are evenly distributed.  Given a sublist of size k, the probability that
its last element is the largest is 1/k.  The entry in position i is the
last of i + 1 objects, so summing over all positions, the expected number
of assignments is

    Σ_{i=0}^{n-1} 1/(i + 1) = Σ_{i=1}^{n} 1/i

We can approximate this sum by an integral: it is close to the area under
1/(x + 1) from 0 to n.  From calculus:

    ∫_0^n 1/(x + 1) dx = ∫_1^{n+1} 1/x dx = ln(n + 1) - ln(1) = ln(n + 1)

How about the error?  Our approximation would be useless if the error
were O(n).  If we highlight the errors between the sum and the integral,
the error terms can all be fit into the box [0, 1] × [0, 1].
Consequently, the error must be less than 1.  In fact, it converges to
the Euler-Mascheroni constant γ ≈ 0.57721566490, and therefore the error
is Θ(1).

Thus, the number of times that the assignment statement will be executed,
assuming an even distribution, is O(ln(n)).

Consequently, the total run time of

    int find_max( int *array, int n ) {
        int max = array[0];

        for ( int i = 1; i < n; ++i ) {
            if ( array[i] > max ) {
                max = array[i];
            }
        }

        return max;
    }

is

    Θ( 1 + Σ_{i=1}^{n-1} (1 + 1/(i + 1)) ) = Θ(1 + n + ln(n)) = Θ(n)

Summary

In these slides we have looked at the run times of:
 – operators
 – control statements
 – functions
 – recursive functions
We have also defined best-, worst-, and average-case scenarios.  We will
be considering all of these each time we inspect any algorithm used in
this class.