



Greedy algorithms

ECE 250 Algorithms and Data Structures
Douglas Wilhelm Harder, M.Math. LEL
Department of Electrical and Computer Engineering
University of Waterloo
Waterloo, Ontario, Canada

ece.uwaterloo.ca
dwharder@alumni.uwaterloo.ca

© 2006-2013 by Douglas Wilhelm Harder. Some rights reserved.

Algorithm Design

To now, we have examined a number of data structures and algorithms to manipulate them. We have seen examples of efficient strategies:
– Divide and conquer
  • Binary search
  • Depth-first tree traversals
  • Merge sort
  • Quicksort
– Greedy algorithms
  • Prim's algorithm
  • Kruskal's algorithm
  • Dijkstra's algorithm

This topic will cover greedy algorithms:
– Definitions
– Examples
  • Making change
  • Prim's and Dijkstra's algorithms
– Other examples

Suppose it is possible to build a solution through a sequence of partial solutions:
– At each step, we focus on one particular partial solution and attempt to extend that solution
– Ultimately, the partial solutions should lead to a feasible solution which is also optimal

Making change

Consider this commonplace example:
– Making exact change with the minimum number of coins
– Consider the euro coin denominations of 1, 2, 5, 10, 20, and 50 cents
– Starting with an empty set of coins, add the largest coin possible to the set which does not go over the required amount

To make change for €0.72:
– Start with a €0.50 coin (total: €0.50)
– Add a €0.20 coin (total: €0.70)
– Skip the €0.10 and €0.05 coins, but add a €0.02 coin (total: €0.72)

Notice that each digit can be worked with separately:
– The maximum number of coins for any digit is three
– Thus, making change for any amount less than €1 requires at most six coins
– The solution is optimal

To make change for $0.72 requires six Canadian coins (25 + 25 + 10 + 10 + 1 + 1)
– However, we have only four coin denominations less than one dollar
– This is still, however, optimal

Does this strategy always work? What if our coin denominations grow quadratically? Consider denominations of 1, 4, 9, 16, 25, 36, and 49 dumbledores.

    Reference: J.K. Rowling, Harry Potter, Raincoast Books, 1997.

Using our algorithm, making change for 72 dumbledores requires six coins:

    72 = 49 + 16 + 4 + 1 + 1 + 1

The optimal solution, however, is two 36-dumbledore coins: 72 = 36 + 36. A short demonstration of this failure appears below.
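To see the failure concretely, here is a minimal C++ sketch (not from the original slides; the function name and layout are illustrative) which applies the greedy rule to both coin systems:

    #include <iostream>
    #include <vector>

    // Greedy change-making: repeatedly take the largest coin that still fits.
    // 'coins' is assumed to be sorted in increasing order.
    // Returns the number of coins used.
    int greedy_change( int value, std::vector<int> const &coins ) {
        int count = 0;

        for ( int i = (int) coins.size() - 1; i >= 0; --i ) {
            while ( coins[i] <= value ) {
                value -= coins[i];
                ++count;
            }
        }

        return count;
    }

    int main() {
        std::vector<int> euro{1, 2, 5, 10, 20, 50};           // greedy is optimal here
        std::vector<int> dumbledore{1, 4, 9, 16, 25, 36, 49}; // greedy can fail here

        std::cout << greedy_change( 72, euro )       << std::endl; // 3:  50 + 20 + 2
        std::cout << greedy_change( 72, dumbledore ) << std::endl; // 6:  49 + 16 + 4 + 1 + 1 + 1,
                                                                   //     although 36 + 36 needs only 2
        return 0;
    }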
Definition

A greedy algorithm is an algorithm which has:
– A set of partial solutions from which a solution is built
– An objective function which assigns a value to any partial solution

Then, given a partial solution, we:
– Consider possible extensions of the partial solution
– Discard any extensions which are not feasible
– Choose that extension which minimizes the objective function

This continues until some criterion has been reached.

Optimal examples

Prim's algorithm is a greedy algorithm:
– Any connected subgraph of k vertices and k - 1 edges is a partial solution
– The value of any partial solution is the sum of the weights of its edges

Then, given a partial solution, we:
– Add that edge which does not create a cycle in the partial solution and which minimizes the increase in the total weight
– We continue building the partial solution until it has n vertices

An optimal solution is found.

Dijkstra's algorithm is also a greedy algorithm:
– A subset of k vertices, together with the known minimum distance to each of those k vertices, is a partial solution

Then, given a partial solution, we:
– Among all edges connecting a vertex with known minimum distance to a vertex without, add the edge which minimizes the distance to the known vertex plus the weight of the connecting edge
– We define the distance to the new vertex to be that sum
– We continue building the partial solution until either:
  • The minimum distance to a specific vertex is known, or
  • The minimum distance to all vertices is known

An optimal solution is found. The greedy structure is visible in the sketch below.
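The following is a minimal sketch of Dijkstra's algorithm (mine, not the slides'), using an adjacency-matrix representation where a weight of 0 means "no edge"; all names are illustrative:

    #include <iostream>
    #include <limits>
    #include <vector>

    // Returns the minimum distance from 'source' to every vertex.
    std::vector<double> dijkstra( std::vector<std::vector<double>> const &graph, int source ) {
        int n = graph.size();
        double const INF = std::numeric_limits<double>::infinity();

        std::vector<double> dist( n, INF );
        std::vector<bool>   known( n, false );
        dist[source] = 0.0;

        for ( int step = 0; step < n; ++step ) {
            // Greedy choice: among vertices with unknown minimum distance,
            // pick the one with the smallest tentative distance.
            int u = -1;
            for ( int v = 0; v < n; ++v ) {
                if ( !known[v] && (u == -1 || dist[v] < dist[u]) ) {
                    u = v;
                }
            }

            if ( dist[u] == INF ) break;   // remaining vertices are unreachable
            known[u] = true;

            // Extend the partial solution through u.
            for ( int v = 0; v < n; ++v ) {
                if ( graph[u][v] > 0.0 && dist[u] + graph[u][v] < dist[v] ) {
                    dist[v] = dist[u] + graph[u][v];
                }
            }
        }

        return dist;
    }

    int main() {
        std::vector<std::vector<double>> graph{
            {0, 1, 4, 0},
            {1, 0, 2, 0},
            {4, 2, 0, 1},
            {0, 0, 1, 0}
        };

        std::cout << dijkstra( graph, 0 )[3] << std::endl;   // 4: the path 0 - 1 - 2 - 3
        return 0;
    }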
Optimal and suboptimal examples

Our coin-change example is greedy:
– Any subset of k coins is a partial solution
– The value of any partial solution is the sum of the values of its coins

Then, given a partial solution, we:
– Add that coin which maximizes the increase in value without going over the target value

We continue building the set of coins until we have reached the target value.

An optimal solution is found with euros and cents, but not with our quadratic dumbledore coins
– The greedy algorithm fails for 29 of the 99 values from 1 to 99:
  12 18 19 22 23 32 41 42 43 48 52 56 61 64 67 68 70 71 72 73 76 77 80 81 88 90 91 92 97

An implementation of the greedy algorithm is straightforward:

    void greedy( int value, int coins[], int rep[], int n ) {
        for ( int i = n - 1; i >= 0; --i ) {
            rep[i] = 0;

            while ( coins[i] <= value ) {
                value -= coins[i];
                ++( rep[i] );    // ++rep[i] also works
            }
        }
    }

Determining whether n denominations of coins will allow a greedy algorithm to minimize change is difficult: there are no easy rules
– The Pearson test is an O(n^3) algorithm which returns either 0 or a value for which the greedy algorithm fails:

    int pearson( int coins[], int n ) {
        int m = 0, rep1[n], rep2[n];

        for ( int j = 0; j <= n - 2; ++j ) {
            for ( int i = j; i <= n - 2; ++i ) {
                greedy( coins[i + 1] - 1, coins, rep1, n );
                ++( rep1[j] );

                for ( int k = 0; k <= j - 2; ++k ) {
                    rep1[k] = 0;
                }

                int r = 0;

                for ( int k = 0; k < n; ++k ) {
                    r += rep1[k]*coins[k];
                }

                if ( m == 0 || r < m ) {
                    greedy( r, coins, rep2, n );

                    int sum1 = 0;
                    for ( int k = 0; k < n; ++k ) {
                        sum1 += rep1[k];
                    }

                    int sum2 = 0;
                    for ( int k = 0; k < n; ++k ) {
                        sum2 += rep2[k];
                    }

                    if ( sum2 > sum1 ) {
                        m = r;
                    }
                }
            }
        }

        return m;
    }

    Reference: Jeffrey Shallit, "What This Country Needs Is an 18¢ Piece",
    The Mathematical Intelligencer 25(2), 2003.
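As a quick check, a driver along these lines (illustrative, not from the slides) exercises the two routines on the dumbledore denominations; assuming the reconstruction of pearson above is faithful, it should report a failing value such as 12, where greedy produces 9 + 1 + 1 + 1 even though 4 + 4 + 4 suffices:

    #include <iostream>

    // Assumes greedy() and pearson() from above are in scope.

    int main() {
        int coins[7] = {1, 4, 9, 16, 25, 36, 49};   // quadratic 'dumbledore' denominations
        int rep[7];

        greedy( 72, coins, rep, 7 );                // fill rep[] with the greedy representation

        int total = 0;
        for ( int k = 0; k < 7; ++k ) {
            total += rep[k];
        }

        std::cout << "greedy coins for 72:    " << total << std::endl;               // 6
        std::cout << "a value where greedy fails: " << pearson( coins, 7 )
                  << std::endl;                     // expected: 12, assuming the reconstruction above
        return 0;
    }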
Unfeasible example

In some cases, it may be possible that not even a feasible solution is found. Consider the following greedy algorithm for solving Sudoku:
– For each empty square, starting at the top-left corner and going across:
  • Fill that square with the smallest number which does not violate any of our conditions
– All feasible solutions have equal weight

Let us try this on the previously seen Sudoku square:
– Neither 1 nor 2 fits into the first empty square, so we fill it with 3
– The second empty square may be filled with 1
– The 3rd empty square may be filled with 4
– At this point, we try to fill in the 4th empty square
– Unfortunately, all nine numbers 1-9 already appear in such a way as to block each of them from appearing in that square

There is no known greedy algorithm which finds the one feasible solution.

Traveling salesman problem

Suppose you want to cycle through n cities without visiting the same city twice (the traveling salesman problem), where it is possible to go from any one city to any other.

The nearest-neighbour algorithm is greedy:
– Go to the closest city which has not yet been visited

This will find a solution, but it is unlikely to be optimal (a sketch follows below)
– It is reasonable with Euclidean distances
  • With random distributions of cities, on average the solution is approximately 125 % of the length of the optimal solution
– With constructed examples and non-Euclidean distances, it can be made to find the worst possible path
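To make the heuristic concrete, here is a minimal nearest-neighbour sketch (not from the slides; the names and the distance-matrix representation are assumptions):

    #include <iostream>
    #include <vector>

    // Nearest-neighbour heuristic for the traveling salesman problem:
    // starting at city 0, repeatedly move to the closest unvisited city.
    // 'dist' is a symmetric n-by-n matrix of inter-city distances.
    std::vector<int> nearest_neighbour( std::vector<std::vector<double>> const &dist ) {
        int n = dist.size();
        std::vector<bool> visited( n, false );
        std::vector<int>  tour;

        int current = 0;
        visited[0] = true;
        tour.push_back( 0 );

        for ( int step = 1; step < n; ++step ) {
            // Greedy choice: the closest city not yet visited.
            int next = -1;
            for ( int city = 0; city < n; ++city ) {
                if ( !visited[city] && (next == -1 || dist[current][city] < dist[current][next]) ) {
                    next = city;
                }
            }
            visited[next] = true;
            tour.push_back( next );
            current = next;
        }

        return tour;   // the cycle implicitly returns from tour.back() to tour.front()
    }

    int main() {
        std::vector<std::vector<double>> dist{
            { 0,  2,  9, 10},
            { 2,  0,  6,  4},
            { 9,  6,  0,  8},
            {10,  4,  8,  0}
        };

        for ( int city : nearest_neighbour( dist ) ) {
            std::cout << city << " ";    // prints 0 1 3 2
        }
        std::cout << std::endl;
        return 0;
    }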
Linear programming

A linear program attempts to optimize a linear objective function in n variables subject to linear constraints on those variables.

For example, maximize 3.5x + 4.7y + 6.2z subject to the constraints

    4.7x + 2.1y + 3.6z <= 6.3
    1.9x + 1.4y + 3.1z <= 5.1
    3.2x + 1.5y        <= 5.6
    9.2x        + 4.2z <= 8.1
           8.2y + 4.5z <= 4.7
    x >= 0,  y >= 0,  z >= 0

In matrix form, this is: maximize c^T v subject to Av <= b and v >= 0, where

    c^T = (3.5, 4.7, 6.2),   v^T = (x, y, z)

        [ 4.7  2.1  3.6 ]        [ 6.3 ]
        [ 1.9  1.4  3.1 ]        [ 5.1 ]
    A = [ 3.2  1.5  0   ],   b = [ 5.6 ]
        [ 9.2  0    4.2 ]        [ 8.1 ]
        [ 0    8.2  4.5 ]        [ 4.7 ]

Such linear constraints define polytopes in n-dimensional space
– All points within the polytope are feasible; that is, they satisfy all constraints
– The maximum of the objective function is achieved at a vertex of the polytope

    [Figures: four constraints in two variables; twenty-five constraints in three variables]

The simplex method starts at a vertex and moves to the adjacent vertex which maximizes the objective function
– For most real-world problems, the run time is O(n)
– The worst-case scenario is Θ(2^n)

Optimal substructure

Can we ever prove that a greedy algorithm will efficiently find an optimal solution? A problem has an optimal substructure if an optimal solution can be constructed efficiently from optimal solutions of its subproblems.

Near-optimal algorithms

We have seen:
– A greedy change-making algorithm which works under certain conditions
– Prim's and Dijkstra's algorithms, which are greedy and find the optimal solution
– A naïve greedy algorithm which attempts (and fails) to solve Sudoku
– The nearest-neighbour algorithm, which is unlikely to find the optimal solution

Next, we will see a greedy algorithm which finds a feasible, but not necessarily optimal, solution.

Project management: the 0/1 knapsack problem

Situation:
– The next cycle for a given product is 26 weeks
– We have ten possible projects which could be completed in that time, each with an expected number of weeks to complete it and an expected increase in revenue

This is also called the 0/1 knapsack problem:
– You can place n items in a knapsack, where each item has a value in rupees and a weight in kilograms
– The knapsack can hold a maximum of m kilograms

Objective: as project manager, choose those projects which can be completed in the required amount of time and which maximize revenue.

The projects:

    Product ID    Completion Time (wks)    Expected Revenue ($1000)
    A                  15                        210
    B                  12                        220
    C                  10                        180
    D                   9                        120
    E                   8                        160
    F                   7                        170
    G                   5                         90
    H                   4                         40
    J                   3                         60
    K                   1                         10

Let us first try to find an optimal schedule by being as productive as possible during the 26 weeks:
– We consider the projects in order from most time to least time and, at each step, select the longest-running project which does not put us over 26 weeks
– We then fill in the gaps with the smaller projects

Greedy-by-time (make use of all 26 wks):
– Project A: 15 wks
– Project C: 10 wks
– Project K: 1 wk

Total time: 26 wks; expected revenue: $400 000.

Next, let us attempt to find an optimal schedule by starting with the best-paying projects:
– We consider the projects in order from highest expected revenue to lowest and, at each step, select the best-paying project which does not put us over 26 weeks

Greedy-by-revenue (best-paying projects):
– Project B: $220K
– Project C: $180K
– Project J: $60K
– Project K: $10K

Total time: 26 wks; expected revenue: $470 000.

Unfortunately, each of these techniques focuses only on projects which have high expected revenues or long run times. What we really want is to complete those jobs which pay the most per unit of development time. Thus, rather than using development time or revenue alone, let us calculate the expected revenue per week of development time:

    Product ID    Completion Time (wks)    Expected Revenue ($1000)    Revenue Density ($/wk)
    A                  15                        210                        14 000
    B                  12                        220                        18 333
    C                  10                        180                        18 000
    D                   9                        120                        13 333
    E                   8                        160                        20 000
    F                   7                        170                        24 286
    G                   5                         90                        18 000
    H                   4                         40                        10 000
    J                   3                         60                        20 000
    K                   1                         10                        10 000

Greedy-by-revenue-density (a sketch of this heuristic follows below):
– Project F: $24 286/wk
– Project E: $20 000/wk
– Project J: $20 000/wk
– Project G: $18 000/wk
– Project K: $10 000/wk

Total time: 24 wks; expected revenue: $490 000.
Bonus: 2 weeks for bug fixing.

Using brute force, we find that the optimal solution is:
– Project C: $180 000
– Project E: $160 000
– Project F: $170 000
– Project K: $10 000

Total time: 26 wks; expected revenue: $520 000.

In this case, greedy-by-revenue-density came closest to the optimal solution:

    Algorithm                      Expected Revenue
    Greedy-by-time                 $400 000
    Greedy-by-expected-revenue     $470 000
    Greedy-by-revenue-density      $490 000
    Brute force                    $520 000

– The run time is Θ(n ln(n)), the time required to sort the list
– Later, we will see a dynamic-programming algorithm for finding an optimal solution with one additional constraint

Of course, in reality, there are numerous other factors affecting projects, including:
– Flexible deadlines (if a delay of a week would result in a significant increase in expected revenue, this would be acceptable)
– The probability of success for particular projects
– The requirement for banner projects
  • Note that greedy-by-revenue-density selected none of the larger projects

To demonstrate that this works in general, an implementation exists at:
    http://ece.uwaterloo.ca/ece250/Algorithms/Projectscheduling/
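The revenue-density heuristic is straightforward to express in code. This C++ sketch (illustrative, not the implementation linked above) sorts the projects by revenue per week and greedily takes whatever still fits; on the table above it selects F, E, J, G, and K for $490 000:

    #include <algorithm>
    #include <iostream>
    #include <string>
    #include <vector>

    // Illustrative project record: id, completion time in weeks, revenue in $1000s.
    struct project {
        std::string id;
        int weeks;
        int revenue;
    };

    int main() {
        std::vector<project> projects{
            {"A", 15, 210}, {"B", 12, 220}, {"C", 10, 180}, {"D", 9, 120},
            {"E",  8, 160}, {"F",  7, 170}, {"G",  5,  90}, {"H", 4,  40},
            {"J",  3,  60}, {"K",  1,  10}
        };

        // Sort by revenue density (revenue per week), highest first.
        // Cross-multiplying avoids floating-point division.
        std::sort( projects.begin(), projects.end(),
                   []( project const &p, project const &q ) {
                       return p.revenue * q.weeks > q.revenue * p.weeks;
                   } );

        int const capacity = 26;   // weeks available in the cycle
        int used = 0, total = 0;

        for ( auto const &p : projects ) {
            if ( used + p.weeks <= capacity ) {   // greedy: take it if it still fits
                used  += p.weeks;
                total += p.revenue;
                std::cout << "select " << p.id << std::endl;
            }
        }

        std::cout << used << " weeks, $" << total << " 000" << std::endl;  // 24 weeks, $490 000
        return 0;
    }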
Process scheduling

The primary resource in any computer is the processing unit: only one process (a running program) can run on a processor at any one time. The possible configurations are:
– single process, single processor
– multiple processes, single processor
– multiple processes, multiple processors
– multiple threads on single, multiple, or multi-core processors

Multiprogramming:
– running processes in batches

Cooperative multitasking/time-sharing:
– processes voluntarily (through a system command) give up the processor
– this requires careful programming... good luck
– some of you may remember Windows 3.1

Preemptive multitasking/time-sharing:
– the operating system controls access to the processor
– processes may be preempted (time slices, interrupts)

Real time:
– addition of priorities, guarantees, etc.

Suppose we have N processes with known run times which are scheduled to run on a single processor. This may occur either:
– in an embedded system where the workload is known beforehand, or
– where, from past runs, the average processor usage has been tracked

Suppose we want to minimize the total wait time for the processes to complete. Consider the following processes:

    Process (i)    Time (t_i)
    1              15 ms
    2               8 ms
    3               3 ms
    4              10 ms

    Ref: Weiss, Data Structures and Algorithm Analysis in C++, 2nd ed., p. 410.

If we schedule them according to process number, the total wait time is

    15 + 23 + 26 + 36 = 100 ms

This is not the optimal schedule. If instead we choose a greedy algorithm which always runs the process with the shortest run time next, the total wait time is

    3 + 11 + 21 + 36 = 71 ms

Intuitively, you know the answer:
– You have 1 L of milk (30 s to ring it in and pay)
– Another person has a full cart (10 min to ring it in and pay)

If they go first, they wait 10:00 and you wait 10:30
– Total wait time: 20:30
If you go first, you wait 0:30 and they wait 10:30
– Total wait time: 11:00

In this case, the greedy solution provides the optimal solution. We can show this mathematically.

Let i_1, i_2, i_3, i_4 be a permutation of the process numbers 1, 2, 3, 4. For example, if we order the processes 3, 1, 4, 2, then

    i_1 = 3, i_2 = 1, i_3 = 4, i_4 = 2

and

    t_{i_1} = t_3, t_{i_2} = t_1, t_{i_3} = t_4, t_{i_4} = t_2

The completion time for each of the processes is summarized in this table:

    Process    Time
    i_1        t_{i_1}
    i_2        t_{i_1} + t_{i_2}
    i_3        t_{i_1} + t_{i_2} + t_{i_3}
    i_4        t_{i_1} + t_{i_2} + t_{i_3} + t_{i_4}
    Sum        4 t_{i_1} + 3 t_{i_2} + 2 t_{i_3} + t_{i_4}

We can write this sum for an arbitrary number of processes N as

    \sum_{k=1}^{N} (N - k + 1) t_{i_k}

which can be expanded into

    (N + 1) \sum_{k=1}^{N} t_{i_k} - \sum_{k=1}^{N} k t_{i_k}

The first sum is constant; the second may change with the ordering (compare 3 × 7 + 4 × 5 with 3 × 5 + 4 × 7).

To minimize the total wait time, we must therefore maximize

    \sum_{k=1}^{N} k t_{i_k}

Choose any two indices 1 <= j < k <= N and consider

    j t_{i_j} + k t_{i_k} = j t_{i_j} + j t_{i_k} - j t_{i_k} + k t_{i_k}
                          = j ( t_{i_j} + t_{i_k} ) + (k - j) t_{i_k}

– The first term does not depend on the order of t_{i_j} and t_{i_k}, but the second term does
– Since k - j > 0, to maximize the second term we require t_{i_k} >= t_{i_j}

This must hold for every pair, so t_{i_1} <= t_{i_2} <= ... <= t_{i_N}.

To quickly demonstrate this, suppose we take two processes with times 8.4 ms and 10.7 ms:

    2·10.7 ms + 3·8.4 ms = 46.6 ms
    2·8.4 ms + 3·10.7 ms = 48.9 ms

Thus, the optimal ordering must be shortest-process-first.

This same result holds if we have multiple processors: to schedule N processes on M processors while minimizing the total wait time, order the processes from shortest completion time to longest and schedule them cyclically. A sketch for the single-processor case follows below.
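As a check of the arithmetic above, a short sketch (mine; names are illustrative) computes the total wait time of a schedule as the sum of its prefix sums and compares the two orderings of the four-process example:

    #include <algorithm>
    #include <iostream>
    #include <vector>

    // Total wait time of a schedule on one processor: each process waits
    // until its own completion, so the total is the sum of prefix sums.
    int total_wait( std::vector<int> const &times ) {
        int elapsed = 0, total = 0;

        for ( int t : times ) {
            elapsed += t;    // completion time of this process
            total   += elapsed;
        }

        return total;
    }

    int main() {
        std::vector<int> times{15, 8, 3, 10};           // the four processes above (ms)

        std::cout << total_wait( times ) << std::endl;  // in process-number order: 100 ms

        std::sort( times.begin(), times.end() );        // greedy: shortest process first
        std::cout << total_wait( times ) << std::endl;  // 71 ms
        return 0;
    }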
For example, given 12 processes and three processors, we could schedule the processes cyclically, from shortest to longest, across the three processors.

    [Figure: twelve processes scheduled cyclically on three processors]

One problem which cannot be solved using a greedy algorithm is minimizing the final completion time:
– given N processes, minimize the overall time required to complete all processes

This is in a class of problems termed NP-complete, which we will look at later.

For example, consider the processes and completion times listed in this table, and suppose we can run these processes on three different processors (assuming that the processes are not interdependent):

    Process    Time (ms)
    1            3
    2            5
    3            6
    4           10
    5           11
    6           14
    7           15
    8           18
    9           20

Minimizing the average wait time, we assign the processes cyclically: the total wait time is 156 ms, and therefore the average wait time is 156 ms/9 = 17.333 ms.

Suppose, however, we want to minimize the final completion time. The schedule which achieves this has a longer total wait time:
– 168 ms versus 156 ms

This is a difficult (NP-complete) problem (the second-last topic of this course).

Scheduling processes is covered in greater detail in ECE 254 Operating Systems and Systems Programming. That course includes numerous other (often greedy) schemes for scheduling, as well as preemptive multitasking and real-time constraints.

Interval scheduling

Suppose we have a list of processes, each of which must run in a given time interval:
– e.g., process A must run during 2:00-5:00,
  process B must run during 4:00-9:00, and
  process C must run during 6:00-8:00

Clearly, not all three processes can be run
– Applications appear in ECE 254 Operating Systems and Systems Programming

Suppose we want to maximize the number of processes that are run. In order to create a greedy algorithm, we must have a fast selection rule which quickly determines which process should be run next.

The first thought may be to always run the process that is next ready to start:
– A little thought, however, quickly demonstrates that this fails
– The worst case would be to run only 1 out of n possible processes when n - 1 processes could have been run

To maximize the number of processes that are run, we should try to free up the processor as quickly as possible:
– Instead of looking at the start times, look at the end times
– Whenever the processor is available, select the process with the earliest end time: the earliest-deadline-first algorithm

In the three-process example above, Process A (with the earliest end time, 5:00) is run first, and then Process C follows; Process B overlaps both and is never run.

Consider the following list of 12 processes, together with the time interval during which each must be run:

    Process    Interval
    A           5 – 8
    B          10 – 13
    C           6 – 9
    D          12 – 15
    E           3 – 7
    F           8 – 11
    G           1 – 6
    H           8 – 12
    J           3 – 5
    K           2 – 4
    L          11 – 16
    M          10 – 15

– Find the optimal schedule with the earliest-deadline-first greedy algorithm

In order to simplify this, sort the processes on their end times:

    Process    Interval
    K           2 – 4
    J           3 – 5
    G           1 – 6
    E           3 – 7
    A           5 – 8
    C           6 – 9
    F           8 – 11
    H           8 – 12
    B          10 – 13
    D          12 – 15
    M          10 – 15
    L          11 – 16

Stepping through the sorted list:
– To begin, choose Process K
– At this point, Processes J, G, and E can no longer be run
– Next, run Process A
– We can no longer run Process C
– Next, we can run Process F
– This restricts us from running Processes H, B, and M
– The next available process is D
– This prevents us from running Process L, and we are therefore finished

We have scheduled four processes: K, A, F, and D
– The selection may not be unique
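A minimal earliest-deadline-first sketch (not from the slides; names are illustrative) reproduces this walkthrough: sort on end times, then keep every process which starts no earlier than the finish time of the last process selected. On the table above it selects K, A, F, and D:

    #include <algorithm>
    #include <iostream>
    #include <string>
    #include <vector>

    // One process which must run exactly during [start, end].
    struct interval {
        std::string id;
        int start, end;
    };

    int main() {
        std::vector<interval> processes{
            {"A",  5,  8}, {"B", 10, 13}, {"C",  6,  9}, {"D", 12, 15},
            {"E",  3,  7}, {"F",  8, 11}, {"G",  1,  6}, {"H",  8, 12},
            {"J",  3,  5}, {"K",  2,  4}, {"L", 11, 16}, {"M", 10, 15}
        };

        // Earliest-deadline-first: sort on end times.
        std::sort( processes.begin(), processes.end(),
                   []( interval const &p, interval const &q ) { return p.end < q.end; } );

        int finished = 0;   // end time of the last process selected

        for ( auto const &p : processes ) {
            // Touching endpoints (A ends at 8, F starts at 8) are treated as
            // compatible, matching the walkthrough above.
            if ( p.start >= finished ) {
                std::cout << p.id << " ";    // prints K A F D
                finished = p.end;
            }
        }

        std::cout << std::endl;
        return 0;
    }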
Once the processes are sorted, the run time is linear: we simply look ahead to find the next process that can be run
– Thus, the run time is dominated by the time required to sort the processes

For example, we could have chosen Process L in place of Process D. In this case, processor usage would go up, but no significance is given to that criterion.

We could add weights to the individual processes:
– The weights could be the durations of the processes, to maximize processor usage
– The weights could be the revenue gained from the performance, to maximize revenue

We will see an efficient algorithm for this in the topic on dynamic programming.

Summary

We have seen another algorithm-design technique, namely greedy algorithms:
– For some problems, appropriately designed greedy algorithms may find either optimal or near-optimal solutions
– For other problems, greedy algorithms may give a poor result or even no result at all

Their desirable characteristic is speed.

References

Wikipedia, http://en.wikipedia.org/wiki/Algorithm_design

These slides are provided for the ECE 250 Algorithms and Data Structures course. The material in them reflects Douglas W. Harder's best judgment in light of the information available to him at the time of preparation. Any reliance on these course slides by any party for any other purpose is the responsibility of such parties. Douglas W. Harder accepts no responsibility for damages, if any, suffered by any party as a result of decisions made or actions based on these course slides for any other purpose than that for which they were intended.