Asymptotic analysis examples solutions

ECE 250 Algorithms and Data Structures
Asymptotic Analysis
Douglas Wilhelm Harder, M.Math. LEL
Department of Electrical and Computer Engineering
University of Waterloo
Waterloo, Ontario, Canada
© 2006-2013 by Douglas Wilhelm Harder. Some rights reserved.

2.3.1 Asymptotic Analysis

In general, we will always analyze algorithms with respect to one or more variables.

We will begin with one variable:
– The number of items n currently stored in an array or other data structure
– The number of items expected to be stored in an array or other data structure
– The dimensions of an n × n matrix

Examples with multiple variables:
– Dealing with n objects stored in m memory locations
– Multiplying a k × m matrix and an m × n matrix
– Dealing with sparse matrices of size n × n with m non-zero entries

2.3.1 Maximum Value

For example, finding the largest object in an array of n random integers will take n operations:

```cpp
int find_max( int array[], int n ) {
    int max = array[0];

    for ( int i = 1; i < n; ++i ) {
        if ( array[i] > max ) {
            max = array[i];
        }
    }

    return max;
}
```

One comment:
– In this class, we will look at both simple C++ arrays and the standard template library (STL) structures
– Instead of using the built-in array, we could use the STL vector class
– The vector class is closer to the C/Java array

```cpp
#include <vector>

int find_max( std::vector<int> array ) {
    if ( array.size() == 0 ) {
        throw underflow();  // the exception class used throughout this course
    }

    int max = array[0];

    for ( std::size_t i = 1; i < array.size(); ++i ) {
        if ( array[i] > max ) {
            max = array[i];
        }
    }

    return max;
}
```

2.3.2 Linear and Binary Search

There are other algorithms which are significantly faster as the problem size increases.

[Plot: the maximum and average number of comparisons needed to find an entry in a sorted array of size n, for linear search and binary search.]

2.3.3 Asymptotic Analysis

Given an algorithm:
– We need to be able to describe these values
mathematically
– We need a systematic means of using the description of the algorithm together with the properties of an associated data structure
– We need to do this in a machine-independent way

For this, we need Landau symbols and the associated asymptotic analysis.

2.3.3 Quadratic Growth

Consider the two functions

    f(n) = n^2    and    g(n) = n^2 - 3n + 2

Around n = 0, they look very different; yet on the range n = 0 to 1000, they are (relatively) indistinguishable.

The absolute difference is large, for example,

    f(1000) = 1 000 000
    g(1000) =   997 002

but the relative difference is very small,

    (f(1000) - g(1000)) / f(1000) = 0.002998 ≈ 0.3%

and this difference goes to zero as n → ∞.

2.3.3 Polynomial Growth

To demonstrate with another example, consider

    f(n) = n^6    and    g(n) = n^6 - 23n^5 + 193n^4 - 729n^3 + 1206n^2 - 648n

Around n = 0, they are very different; still, around n = 1000, the relative difference is less than 3%.

The justification for both pairs of polynomials being similar is that, in both cases, they each had the same leading term: n^2 in the first case, n^6 in the second.

Suppose, however, that the coefficients of the leading terms were different:
– In this case, both functions would exhibit the same rate of growth; however, one would always be proportionally larger

2.3.4 Examples

We will now look at two examples:
– A comparison of selection sort and bubble sort
– A comparison of insertion sort and quicksort

Counting Instructions

Suppose we had two algorithms which sorted a list of size n, where the number of instructions executed is given by

    b_worst(n) = 4.7n^2 - 0.5n + 5     (bubble sort, worst case)
    b_best(n)  = 3.8n^2 + 0.5n + 5     (bubble sort, best case)
    s(n)       = 4n^2 + 14n + 12       (selection sort)

The smaller the value, the fewer instructions are run:
– For n ≤ 21, b_worst(n) < s(n)
– For
n ≥ 22, b_worst(n) > s(n)

With small values of n, the algorithm described by s(n) requires more instructions than even the worst case for bubble sort.

Near n = 1000,

    b_worst(n) ≈ 1.175 s(n)    and    b_best(n) ≈ 0.95 s(n)

Is this a serious difference between these two algorithms?

Because we can count the number of instructions, we can also estimate how much time is required to run one of these algorithms on a computer.

Suppose we have a 1 GHz computer:
– The time required to sort a list of up to n = 10 000 objects is under half a second

To sort a list with one million elements, it will take about 1 h. Bubble sort could, under some conditions, be 200 s faster.