11. APPROXIMATION ALGORITHMS

‣ load balancing
‣ center selection
‣ pricing method: vertex cover
‣ LP rounding: vertex cover
‣ generalized load balancing
‣ knapsack problem

Lecture slides by Kevin Wayne
Copyright © 2005 Pearson-Addison Wesley
http://www.cs.princeton.edu/wayne/kleinberg-tardos
Last updated on 2/15/17 6:00 PM


Coping with NP-completeness

Q. Suppose I need to solve an NP-hard problem. What should I do?
A. Sacrifice one of three desired features.
  i.   Solve arbitrary instances of the problem.
  ii.  Solve the problem to optimality.
  iii. Solve the problem in polynomial time.

ρ-approximation algorithm.
‣ Guaranteed to run in poly-time.
‣ Guaranteed to solve an arbitrary instance of the problem.
‣ Guaranteed to find a solution within ratio ρ of the true optimum.

Challenge. Need to prove a solution's value is close to the optimum without even knowing what the optimum value is.


Load balancing

Input. m identical machines; n jobs, where job j has processing time t_j.
‣ Job j must run contiguously on one machine.
‣ A machine can process at most one job at a time.

Def. Let J(i) be the subset of jobs assigned to machine i.
Def. The load of machine i is L_i = Σ_{j ∈ J(i)} t_j.
Def. The makespan is the maximum load on any machine, L = max_i L_i.

Load balancing. Assign each job to a machine so as to minimize the makespan.

[Figure: jobs a–g scheduled on machines 1 and 2, with loads L_1 and L_2.]


Load balancing on 2 machines is NP-hard

Claim. Load balancing is hard even with only 2 machines.
Pf. PARTITION ≤_P LOAD-BALANCE. (PARTITION is NP-complete by Exercise 8.26.) ▪

[Figure: a yes-instance of PARTITION drawn as a balanced 2-machine schedule.]


Load balancing: list scheduling

List-scheduling algorithm.
‣ Consider the n jobs in some fixed order.
‣ Assign job j to the machine whose load is smallest so far.

List-Scheduling(m, n, t_1, t_2, …, t_n)
    for i = 1 to m
        L_i ← 0                 (load on machine i)
        J(i) ← ∅                (jobs assigned to machine i)
    for j = 1 to n
        i ← argmin_k L_k        (machine i has smallest load)
        J(i) ← J(i) ∪ {j}       (assign job j to machine i)
        L_i ← L_i + t_j         (update load of machine i)
    return J(1), …, J(m)

Implementation. O(n log m) using a priority queue.


Load balancing: list scheduling analysis

Theorem. [Graham 1966] The greedy algorithm is a 2-approximation.
‣ First worst-case analysis of an approximation algorithm.
‣ Need to compare the resulting solution with the optimal makespan L*.

Lemma 1. The optimal makespan L* ≥ max_j t_j.
Pf. Some machine must process the most time-consuming job. ▪

Lemma 2. The optimal makespan L* ≥ (1/m) Σ_j t_j.
Pf.
‣ The total processing time is Σ_j t_j.
‣ One of the m machines must do at least a 1/m fraction of the total work. ▪

Theorem. The greedy algorithm is a 2-approximation.
Pf. Consider the load L_i of the bottleneck machine i.
‣ Let j be the last job scheduled on machine i.
‣ When job j was assigned to machine i, machine i had the smallest load.
‣ Its load just before the assignment was L_i − t_j, so L_i − t_j ≤ L_k for all 1 ≤ k ≤ m.
‣ Sum these inequalities over all k and divide by m:

      L_i − t_j ≤ (1/m) Σ_k L_k = (1/m) Σ_k t_k ≤ L*      (by Lemma 2)

‣ Now L = L_i = (L_i − t_j) + t_j ≤ L* + L* = 2L*, since t_j ≤ L* by Lemma 1. ▪


Load balancing: list scheduling analysis

Q. Is our analysis tight?
A. Essentially yes.

Ex. m machines, m(m − 1) jobs of length 1, plus one job of length m.
‣ List-scheduling makespan = 2m − 1. (For m = 10: the unit jobs fill every machine to load 9, then the long job pushes one machine to 19 while the other nine sit idle.)
‣ Optimal makespan = m. (For m = 10: makespan 10.)


Load balancing: LPT rule

Longest processing time (LPT). Sort the n jobs in descending order of processing time, then run the list-scheduling algorithm.

LPT-List-Scheduling(m, n, t_1, t_2, …, t_n)
    Sort jobs so that t_1 ≥ t_2 ≥ … ≥ t_n
    for i = 1 to m
        L_i ← 0                 (load on machine i)
        J(i) ← ∅                (jobs assigned to machine i)
    for j = 1 to n
        i ← argmin_k L_k        (machine i has smallest load)
        J(i) ← J(i) ∪ {j}       (assign job j to machine i)
        L_i ← L_i + t_j         (update load of machine i)
    return J(1), …, J(m)


Load balancing: LPT rule

Observation. If there are at most m jobs, then list scheduling is optimal.
Pf. Each job is put on its own machine. ▪

Lemma 3. If there are more than m jobs, L* ≥ 2 t_{m+1}.
Pf.
‣ Consider the first m + 1 jobs t_1, …, t_{m+1}.
‣ Since the t_i's are in descending order, each takes at least t_{m+1} time.
‣ There are m + 1 jobs and m machines, so by the pigeonhole principle,
at least one machine gets two jobs. ▪

Theorem. The LPT rule is a 3/2-approximation algorithm.
Pf. Same basic approach as for list scheduling:

      L = L_i = (L_i − t_j) + t_j ≤ L* + (1/2) L* = (3/2) L*. ▪

Here L_i − t_j ≤ L* as before, and t_j ≤ (1/2) L* by Lemma 3 (by the observation, we can assume the number of jobs is more than m).


Load balancing: LPT rule

Q. Is our 3/2 analysis tight?
A. No.

Theorem. [Graham 1969] The LPT rule is a 4/3-approximation.
Pf. More sophisticated analysis of the same algorithm.

Q. Is Graham's 4/3 analysis tight?
A. Essentially yes.

Ex. m machines, n = 2m + 1 jobs: 2 jobs each of length m, m + 1, …, 2m − 1, plus one more job of length m. Then L / L* = (4m − 1) / (3m).


Center selection problem

Input. Set of n sites s_1, …, s_n and an integer k > 0.
Center selection problem. Select a set of k centers C so that the maximum distance r(C) from a site to its nearest center is minimized.

[Figure: k = 4 centers covering the sites within radius r(C).]

Notation.
‣ dist(x, y) = distance between sites x and y.
‣ dist(s_i, C) = min_{c ∈ C} dist(s_i, c) = distance from s_i to the closest center.
‣ r(C) = max_i dist(s_i, C) = smallest covering radius.

Goal. Find a set of centers C that minimizes r(C), subject to |C| = k.

Distance function properties.
‣ dist(x, x) = 0                              (identity)
‣ dist(x, y) = dist(y, x)                     (symmetry)
‣ dist(x, y) ≤ dist(x, z) + dist(z, y)        (triangle inequality)


Center selection example

Ex. Each site is a point in the plane, a center can be any point in the plane, and dist(x, y) = Euclidean distance.
Remark. The search space can be infinite.

[Figure: k = 4 centers chosen in the plane with covering radius r(C).]


Greedy algorithm: a false start

Greedy algorithm. Put the first center at the best possible location for a single center, then keep adding centers so as to reduce the covering radius each time by as much as possible.
Remark. This can be arbitrarily bad.

[Figure: k = 2 centers; greedy's first center lands between the two clusters of sites, which is arbitrarily bad.]
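As a concrete companion to the load-balancing pseudocode earlier in the deck, here is a runnable Python sketch of both scheduling rules (function and variable names are mine; a min-heap is one way to realize the O(n log m) priority-queue implementation the slides mention):

```python
import heapq

def list_schedule(m, times):
    """Greedy list scheduling: assign each job, in the given order,
    to the machine whose current load is smallest."""
    heap = [(0, i) for i in range(m)]      # (load, machine) pairs
    heapq.heapify(heap)
    loads = [0] * m
    assignment = [[] for _ in range(m)]    # J(i) for each machine i
    for j, tj in enumerate(times):
        load, i = heapq.heappop(heap)      # machine with smallest load
        assignment[i].append(j)            # assign job j to machine i
        loads[i] = load + tj               # update load of machine i
        heapq.heappush(heap, (loads[i], i))
    return assignment, max(loads)          # J(1), …, J(m) and the makespan

def lpt_schedule(m, times):
    """LPT rule: sort jobs in descending processing time, then list-schedule."""
    _, makespan = list_schedule(m, sorted(times, reverse=True))
    return makespan

# Tight example from the slides: m(m - 1) unit jobs followed by one job of
# length m. Plain list scheduling gives 2m - 1; the optimum is m.
m = 10
jobs = [1] * (m * (m - 1)) + [m]
_, makespan = list_schedule(m, jobs)
print(makespan)               # 19 = 2m - 1
print(lpt_schedule(m, jobs))  # 10 = m (LPT is optimal here)
```

On Graham's 4/3-tight example with m = 3 (two jobs each of length 3, 4, 5 plus one more of length 3), `lpt_schedule(3, [5, 5, 4, 4, 3, 3, 3])` gives makespan 11 = 4m − 1 against an optimum of 9 = 3m.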
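The center-selection notation above translates directly into code. A minimal sketch of evaluating the covering radius r(C) for the planar/Euclidean case (names are mine; note that evaluating r(C) for a given C is easy — it is choosing the k centers that is hard):

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

def covering_radius(sites, centers):
    """r(C) = max_i dist(s_i, C), where dist(s_i, C) = min_{c in C} dist(s_i, c)."""
    return max(min(dist(s, c) for c in centers) for s in sites)

sites = [(0, 0), (4, 0), (0, 3)]
print(covering_radius(sites, [(0, 0)]))  # 4.0: site (4, 0) is the farthest
```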