Chapter 11
Approximation Algorithms

Slides by Kevin Wayne. Copyright 2005 Pearson-Addison Wesley. All rights reserved.

Approximation Algorithms
Q. Suppose I need to solve an NP-hard problem. What should I do?
A. Theory says you're unlikely to find a poly-time algorithm. Must sacrifice one of three desired features:
- Solve problem to optimality.
- Solve problem in poly-time.
- Solve arbitrary instances of the problem.

ρ-approximation algorithm:
- Guaranteed to run in poly-time.
- Guaranteed to solve arbitrary instance of the problem.
- Guaranteed to find a solution within ratio ρ of the true optimum.

Challenge. Need to prove a solution's value is close to optimum, without even knowing what the optimum value is.

11.1 Load Balancing

Load Balancing
Input. m identical machines; n jobs, job j has processing time t_j.
- Job j must run contiguously on one machine.
- A machine can process at most one job at a time.
Def. Let J(i) be the subset of jobs assigned to machine i. The load of machine i is L_i = Σ_{j ∈ J(i)} t_j.
Def. The makespan is the maximum load on any machine, L = max_i L_i.
Load balancing. Assign each job to a machine to minimize the makespan.
Load Balancing: List Scheduling
List-scheduling algorithm.
- Consider the n jobs in some fixed order.
- Assign job j to the machine whose load is smallest so far.

List-Scheduling(m, n, t_1, t_2, ..., t_n) {
   for i = 1 to m {
      L_i ← 0          (load on machine i)
      J(i) ← ∅         (jobs assigned to machine i)
   }
   for j = 1 to n {
      i = argmin_k L_k          (machine i has smallest load)
      J(i) ← J(i) ∪ {j}         (assign job j to machine i)
      L_i ← L_i + t_j           (update load of machine i)
   }
}

Implementation. O(n log n) using a priority queue.

Load Balancing: List Scheduling Analysis
Theorem. [Graham, 1966] Greedy algorithm is a 2-approximation.
- First worst-case analysis of an approximation algorithm.
- Need to compare the resulting solution with the optimal makespan L*.

Lemma 1. The optimal makespan L* ≥ max_j t_j.
Pf. Some machine must process the most time-consuming job.

Lemma 2. The optimal makespan L* ≥ (1/m) Σ_j t_j.
Pf. The total processing time is Σ_j t_j, and one of the m machines must do at least a 1/m fraction of the total work.

Theorem. Greedy algorithm is a 2-approximation.
Pf. Consider the load L_i of the bottleneck machine i, and let j be the last job scheduled on machine i.
- When job j was assigned to machine i, machine i had the smallest load; its load before the assignment was L_i - t_j, so L_i - t_j ≤ L_k for all 1 ≤ k ≤ m.
- Sum these inequalities over all k and divide by m:
     L_i - t_j ≤ (1/m) Σ_k L_k = (1/m) Σ_k t_k ≤ L*     (Lemma 2)
- Therefore L_i = (L_i - t_j) + t_j ≤ L* + L* = 2 L*, using Lemma 1 for t_j ≤ L*.
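
The list-scheduling loop and its O(n log n) priority-queue implementation are easy to make concrete. Below is a minimal Python sketch (function and variable names are my own, not from the slides) that keeps the machine loads in a heap:

import heapq

def list_scheduling(m, times):
    """Assign each job, in the given order, to the machine whose current load is smallest."""
    loads = [(0, i) for i in range(m)]            # (load, machine id) pairs in a min-heap
    heapq.heapify(loads)
    assignment = [[] for _ in range(m)]
    for j, t in enumerate(times):
        load, i = heapq.heappop(loads)            # machine i has the smallest load
        assignment[i].append(j)                   # assign job j to machine i
        heapq.heappush(loads, (load + t, i))      # update the load of machine i
    return max(load for load, _ in loads), assignment

# The tight family discussed on the next page: m(m-1) unit jobs followed by one job of length m.
m = 10
makespan, _ = list_scheduling(m, [1] * (m * (m - 1)) + [m])
print(makespan)   # 19, while the optimal makespan is 10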
Load Balancing: List Scheduling Analysis
Q. Is our analysis tight?
A. Essentially yes.
Ex: m machines, m(m-1) jobs of length 1, plus one job of length m.
For m = 10: when the long job arrives, every machine already has load 9, so the list-scheduling makespan is 19, while the optimal makespan is 10.

Load Balancing: LPT Rule
Longest processing time (LPT). Sort the n jobs in descending order of processing time, and then run the list-scheduling algorithm.

LPT-List-Scheduling(m, n, t_1, t_2, ..., t_n) {
   Sort jobs so that t_1 ≥ t_2 ≥ ... ≥ t_n
   for i = 1 to m {
      L_i ← 0          (load on machine i)
      J(i) ← ∅         (jobs assigned to machine i)
   }
   for j = 1 to n {
      i = argmin_k L_k          (machine i has smallest load)
      J(i) ← J(i) ∪ {j}         (assign job j to machine i)
      L_i ← L_i + t_j           (update load of machine i)
   }
}

Observation. If there are at most m jobs, then list scheduling is optimal.
Pf. Each job is put on its own machine.

Lemma 3. If there are more than m jobs, L* ≥ 2 t_{m+1}.
Pf. Consider the first m+1 jobs t_1, ..., t_{m+1}. Since the t_i's are in descending order, each takes at least t_{m+1} time. There are m+1 jobs and m machines, so by the pigeonhole principle, at least one machine gets two of them.

Theorem. LPT rule is a 3/2-approximation algorithm.
Pf. Same basic approach as for list scheduling. By the observation, we can assume the number of jobs exceeds m, so the last job j on the bottleneck machine i satisfies t_j ≤ t_{m+1} ≤ (1/2) L* by Lemma 3, and
     L_i = (L_i - t_j) + t_j ≤ L* + (1/2) L* = (3/2) L*.
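
The LPT rule is just a sort in front of the same loop. Assuming the list_scheduling sketch given above, a minimal version is:

def lpt_scheduling(m, times):
    """LPT rule: process jobs in descending order of processing time, then list-schedule.
    Note: job indices in the returned assignment refer to the sorted order."""
    return list_scheduling(m, sorted(times, reverse=True))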
Load Balancing: LPT Rule
Q. Is our 3/2 analysis tight?
A. No.
Theorem. [Graham, 1969] LPT rule is a 4/3-approximation.
Pf. More sophisticated analysis of the same algorithm.
Q. Is Graham's 4/3 analysis tight?
A. Essentially yes.
Ex: m machines, n = 2m+1 jobs, two jobs each of length m+1, m+2, ..., 2m-1, and one job of length m.

11.2 Center Selection

Center Selection Problem
Input. Set of n sites s_1, ..., s_n.
Center selection problem. Select k centers C so that the maximum distance from a site to its nearest center is minimized.
(Figure: sites and k centers, with covering radius r(C).)

Notation.
- dist(x, y) = distance between x and y.
- dist(s_i, C) = min_{c ∈ C} dist(s_i, c) = distance from s_i to the closest center.
- r(C) = max_i dist(s_i, C) = smallest covering radius.
Goal. Find a set of centers C that minimizes r(C), subject to |C| = k.

Distance function properties.
- dist(x, x) = 0   (identity)
- dist(x, y) = dist(y, x)   (symmetry)
- dist(x, y) ≤ dist(x, z) + dist(z, y)   (triangle inequality)
Center Selection Example
Ex: each site is a point in the plane, a center can be any point in the plane, dist(x, y) = Euclidean distance.
Remark: the search space can be infinite.
(Figure: sites in the plane covered by k centers of radius r(C).)

Greedy Algorithm: A False Start
Greedy algorithm. Put the first center at the best possible location for a single center, and then keep adding centers so as to reduce the covering radius each time by as much as possible.
Remark: arbitrarily bad.
(Figure: an example where this greedy rule is arbitrarily bad.)

Center Selection: Greedy Algorithm
Greedy algorithm. Repeatedly choose the next center to be the site farthest from any existing center.

Greedy-Center-Selection(k, n, s_1, s_2, ..., s_n) {
   C = ∅
   repeat k times {
      Select a site s_i with maximum dist(s_i, C)     (site farthest from any center)
      Add s_i to C
   }
   return C
}

Observation. Upon termination, all centers in C are pairwise at least r(C) apart.
Pf. By construction of the algorithm.

Center Selection: Analysis of Greedy Algorithm
Theorem. Let C* be an optimal set of centers. Then r(C) ≤ 2 r(C*).
Pf. (by contradiction) Assume r(C*) < (1/2) r(C).
- For each site c_i in C, consider the ball of radius (1/2) r(C) around it.
- Exactly one c_i* lies in each ball; let c_i be the site paired with c_i*.
- Consider any site s and its closest center c_i* in C*:
     dist(s, C) ≤ dist(s, c_i) ≤ dist(s, c_i*) + dist(c_i*, c_i) ≤ 2 r(C*),
  by the triangle inequality and because c_i* is the closest center to s.
- Thus r(C) ≤ 2 r(C*).
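
A minimal Python sketch of the farthest-site greedy rule (function and variable names are my own; any metric satisfying the distance properties works, Euclidean distance is used here, and the first center is simply an arbitrary site):

import math

def greedy_center_selection(k, sites):
    """Repeatedly add the site that is farthest from the centers chosen so far."""
    dist = math.dist                     # Euclidean distance between two points
    centers = [sites[0]]                 # the first pick is arbitrary (distance to an empty C is infinite)
    while len(centers) < k:
        farthest = max(sites, key=lambda s: min(dist(s, c) for c in centers))
        centers.append(farthest)
    radius = max(min(dist(s, c) for c in centers) for s in sites)   # covering radius r(C)
    return centers, radius

# Small made-up example: two clusters of points in the plane, k = 2.
sites = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11)]
print(greedy_center_selection(2, sites))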
Center Selection
Theorem. Let C* be an optimal set of centers. Then r(C) ≤ 2 r(C*).
Theorem. Greedy algorithm is a 2-approximation for the center selection problem.
Remark. The greedy algorithm always places centers at sites, but is still within a factor of 2 of the best solution that is allowed to place centers anywhere, e.g., at arbitrary points in the plane.
Question. Is there hope of a 3/2-approximation? 4/3?
Theorem. Unless P = NP, there is no ρ-approximation for the center selection problem for any ρ < 2.

11.4 The Pricing Method: Vertex Cover

Weighted Vertex Cover
Weighted vertex cover. Given a graph G with vertex weights, find a vertex cover of minimum weight.
(Figure: a weighted graph shown with two different vertex covers and their total weights.)

Pricing method.
- Each edge must be covered by some vertex i. Edge e pays a price p_e ≥ 0 to use vertex i.
- Fairness. Edges incident to vertex i should pay ≤ w_i in total: for each vertex i, Σ_{e = (i, j)} p_e ≤ w_i.

Claim. For any vertex cover S and any fair prices p_e: Σ_e p_e ≤ w(S).
Proof. Σ_{e ∈ E} p_e ≤ Σ_{i ∈ S} Σ_{e = (i, j)} p_e ≤ Σ_{i ∈ S} w_i = w(S).
(The first inequality holds because each edge e is covered by at least one node in S; the second sums the fairness inequalities over the nodes in S.)
Pricing Method
Pricing method. Set prices and find a vertex cover simultaneously.

Weighted-Vertex-Cover-Approx(G, w) {
   foreach e in E
      p_e = 0
   while (∃ an edge i-j such that neither i nor j is tight)     (tight: Σ_{e = (i, j)} p_e = w_i)
      select such an edge e
      increase p_e without violating fairness
   S ← set of all tight nodes
   return S
}

(Figure: an example run of the pricing method, showing the price of each edge, e.g. edge a-b, and the vertex weights.)

Pricing Method: Analysis
Theorem. Pricing method is a 2-approximation.
Pf.
- The algorithm terminates, since at least one new node becomes tight after each iteration of the while loop.
- Let S = set of all tight nodes upon termination. S is a vertex cover: if some edge i-j were uncovered, then neither i nor j would be tight, and the while loop would not have terminated.
- Let S* be an optimal vertex cover. We show w(S) ≤ 2 w(S*):
     w(S) = Σ_{i ∈ S} w_i = Σ_{i ∈ S} Σ_{e = (i, j)} p_e ≤ Σ_{i ∈ V} Σ_{e = (i, j)} p_e = 2 Σ_{e ∈ E} p_e ≤ 2 w(S*).
  (all nodes in S are tight; S ⊆ V and prices ≥ 0; each edge is counted twice; fairness lemma)

11.6 LP Rounding: Vertex Cover
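
A minimal Python sketch of the pricing loop (function and variable names are my own, not from the slides). Scanning the edges once and raising each uncovered edge's price until an endpoint becomes tight implements the while loop above, since afterwards every edge has a tight endpoint:

def pricing_vertex_cover(vertices, edges, w):
    """2-approximate weighted vertex cover by the pricing method.
    vertices: iterable of nodes, edges: list of (i, j) pairs, w: dict node -> weight."""
    paid = {v: 0.0 for v in vertices}                # total price paid by edges incident to v

    def is_tight(v):
        return paid[v] >= w[v]

    price = {}
    for (i, j) in edges:
        if is_tight(i) or is_tight(j):               # edge already has a tight endpoint
            price[(i, j)] = 0.0
            continue
        delta = min(w[i] - paid[i], w[j] - paid[j])  # largest increase of p_e that keeps fairness
        price[(i, j)] = delta
        paid[i] += delta
        paid[j] += delta                             # at least one endpoint is now tight
    return {v for v in vertices if is_tight(v)}

# Made-up example: a path a-b-c-d with weights 2, 9, 3, 1.
print(pricing_vertex_cover("abcd", [("a", "b"), ("b", "c"), ("c", "d")],
                           {"a": 2, "b": 9, "c": 3, "d": 1}))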
Weighted Vertex Cover
Weighted vertex cover. Given an undirected graph G = (V, E) with vertex weights w_i ≥ 0, find a minimum-weight subset of nodes S such that every edge is incident to at least one vertex in S.
(Figure: an example graph with weighted vertices A-J; the highlighted vertex cover has total weight 55.)

Weighted Vertex Cover: IP Formulation
Integer programming formulation.
- Model inclusion of each vertex i using a 0/1 variable x_i:
     x_i = 0 if vertex i is not in the vertex cover, x_i = 1 if vertex i is in the vertex cover.
  Vertex covers are in 1-1 correspondence with 0/1 assignments: S = {i ∈ V : x_i = 1}.
- Objective function: minimize Σ_i w_i x_i.
- Must take either i or j (or both): x_i + x_j ≥ 1.

Weighted Vertex Cover: IP Formulation
Weighted vertex cover. Integer programming formulation:
   (ILP)  min  Σ_{i ∈ V} w_i x_i
          s.t. x_i + x_j ≥ 1    for all (i, j) ∈ E
               x_i ∈ {0, 1}     for all i ∈ V
Observation. If x* is an optimal solution to (ILP), then S = {i ∈ V : x*_i = 1} is a min-weight vertex cover.

Integer Programming
INTEGER-PROGRAMMING. Given integers a_ij and b_i, find integers x_j that satisfy:
   max  c^t x
   s.t. Ax ≥ b, x integral
i.e. Σ_{j=1}^{n} a_ij x_j ≥ b_i for 1 ≤ i ≤ m, with x_j ≥ 0 and x_j integral for 1 ≤ j ≤ n.
Observation. The vertex cover formulation proves that integer programming is an NP-hard search problem, even if all coefficients are 0/1 and at most two variables appear per inequality.
Linear Programming
Linear programming. Max/min a linear objective function subject to linear inequalities.
- Input: integers c_j, b_i, a_ij.
- Output: real numbers x_j.
   (P)  max  c^t x              equivalently:   max  Σ_{j=1}^{n} c_j x_j
        s.t. Ax ≤ b                             s.t. Σ_{j=1}^{n} a_ij x_j ≤ b_i   for 1 ≤ i ≤ m
             x ≥ 0                                   x_j ≥ 0                      for 1 ≤ j ≤ n
Linear. No x², xy, arccos(x), x(1-x), etc.
- Simplex algorithm. [Dantzig 1947] Can solve LP in practice.
- Ellipsoid algorithm. [Khachian 1979] Can solve LP in poly-time.

LP Feasible Region
LP geometry in 2D.
(Figure: the feasible region bounded by the lines x_1 = 0, x_2 = 0, x_1 + 2x_2 = 6, and 2x_1 + x_2 = 6.)

Weighted Vertex Cover: LP Relaxation
Weighted vertex cover. Linear programming formulation:
   (LP)  min  Σ_{i ∈ V} w_i x_i
         s.t. x_i + x_j ≥ 1    for all (i, j) ∈ E
              x_i ≥ 0          for all i ∈ V
Observation. Optimal value of (LP) ≤ optimal value of (ILP).
Pf. The LP has fewer constraints (the LP is a relaxation).
Note. The LP is not equivalent to vertex cover: on a triangle with unit weights, for example, the LP optimum is 3/2 (take x_i = 1/2 everywhere) while the minimum vertex cover has weight 2.
Q. How can solving the LP help us find a small vertex cover?
A. Solve the LP and round the fractional values.

Weighted Vertex Cover
Theorem. If x* is an optimal solution to (LP), then S = {i ∈ V : x*_i ≥ 1/2} is a vertex cover whose weight is at most twice the minimum possible weight.
Pf. [S is a vertex cover] Consider an edge (i, j) ∈ E. Since x*_i + x*_j ≥ 1, either x*_i ≥ 1/2 or x*_j ≥ 1/2, so (i, j) is covered.
Pf. [S has the desired cost] Let S* be an optimal vertex cover. Then
     Σ_{i ∈ S*} w_i ≥ Σ_{i ∈ V} w_i x*_i ≥ (1/2) Σ_{i ∈ S} w_i,
using that the LP is a relaxation for the first inequality and x*_i ≥ 1/2 for every i ∈ S for the second.
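
To make the solve-and-round recipe concrete, here is a sketch in Python. It assumes SciPy's linprog is available; the function name, example graph, and weights are illustrative, not from the slides:

import numpy as np
from scipy.optimize import linprog

def lp_rounding_vertex_cover(n, edges, w):
    """Solve the LP relaxation min w.x s.t. x_i + x_j >= 1, x >= 0, then keep nodes with x*_i >= 1/2."""
    # linprog minimizes c.x subject to A_ub x <= b_ub, so x_i + x_j >= 1 becomes -x_i - x_j <= -1.
    A_ub = np.zeros((len(edges), n))
    for row, (i, j) in enumerate(edges):
        A_ub[row, i] = A_ub[row, j] = -1.0
    b_ub = -np.ones(len(edges))
    res = linprog(c=np.array(w, dtype=float), A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * n)
    return {i for i in range(n) if res.x[i] >= 0.5 - 1e-9}   # small tolerance for values equal to 1/2

# Made-up example: a triangle {0, 1, 2} plus a pendant edge (2, 3).
print(lp_rounding_vertex_cover(4, [(0, 1), (1, 2), (0, 2), (2, 3)], w=[3, 2, 1, 4]))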
Weighted Vertex Cover
Theorem. The rounding algorithm is a 2-approximation algorithm for weighted vertex cover.
Theorem. [Dinur-Safra 2001] If P ≠ NP, then there is no ρ-approximation for ρ < 1.3607 (the constant is 10√5 - 21), even with unit weights.
Open research problem. Close the gap.

11.7 Load Balancing Reloaded

Generalized Load Balancing
Input. Set of m machines M; set of n jobs J.
- Job j must run contiguously on an authorized machine in M_j ⊆ M.
- Job j has processing time t_j.
- Each machine can process at most one job at a time.
Def. Let J(i) be the subset of jobs assigned to machine i. The load of machine i is L_i = Σ_{j ∈ J(i)} t_j.
Def. The makespan is the maximum load on any machine, L = max_i L_i.
Generalized load balancing. Assign each job to an authorized machine to minimize the makespan.

Generalized Load Balancing: Integer Linear Program and Relaxation
ILP formulation. x_ij = time machine i spends processing job j.
   (IP)  min  L
         s.t. Σ_i x_ij = t_j       for all j ∈ J
              Σ_j x_ij ≤ L         for all i ∈ M
              x_ij ∈ {0, t_j}      for all j ∈ J and i ∈ M_j
              x_ij = 0             for all j ∈ J and i ∉ M_j
LP relaxation.
   (LP)  min  L
         s.t. Σ_i x_ij = t_j       for all j ∈ J
              Σ_j x_ij ≤ L         for all i ∈ M
              x_ij ≥ 0             for all j ∈ J and i ∈ M_j
              x_ij = 0             for all j ∈ J and i ∉ M_j
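
The (LP) relaxation can be handed directly to an LP solver. As a sketch (assuming SciPy's linprog is available; the names and data layout are my own), with one variable x_ij per authorized machine-job pair plus one variable for L:

import numpy as np
from scipy.optimize import linprog

def glb_lp_relaxation(m, t, authorized):
    """LP relaxation of generalized load balancing.
    m: number of machines, t: list of processing times, authorized[j]: machines allowed for job j.
    Returns the support {(i, j): x_ij} of an optimal LP solution and the LP value L."""
    pairs = [(i, j) for j in range(len(t)) for i in authorized[j]]
    col = {p: k for k, p in enumerate(pairs)}
    nvar = len(pairs) + 1                       # the last variable is L
    A_eq = np.zeros((len(t), nvar))             # sum_i x_ij = t_j for every job j
    for (i, j), k in col.items():
        A_eq[j, k] = 1.0
    A_ub = np.zeros((m, nvar))                  # sum_j x_ij - L <= 0 for every machine i
    for (i, j), k in col.items():
        A_ub[i, k] = 1.0
    A_ub[:, -1] = -1.0
    c = np.zeros(nvar)
    c[-1] = 1.0                                 # minimize L
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(m), A_eq=A_eq, b_eq=np.array(t, dtype=float),
                  bounds=[(0, None)] * nvar)
    support = {p: res.x[k] for p, k in col.items() if res.x[k] > 1e-9}
    return support, res.x[-1]

# Made-up instance: 2 machines, 3 jobs; job 2 is only authorized on machine 1.
print(glb_lp_relaxation(2, [4.0, 3.0, 5.0], [{0, 1}, {0, 1}, {1}]))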
Generalized Load Balancing: Lower Bounds
Lemma 1. Let L be the optimal value of the LP. Then the optimal makespan L* ≥ L.
Pf. The LP has fewer constraints than the IP formulation.
Lemma 2. The optimal makespan L* ≥ max_j t_j.
Pf. Some machine must process the most time-consuming job.

Generalized Load Balancing: Structure of LP Solution
Lemma 3. Let x be a solution to the LP. Let G(x) be the graph with an edge from machine i to job j if x_ij > 0. Then G(x) is acyclic.
Pf. (deferred) We can transform x into another LP solution where G(x) is acyclic, if the LP solver doesn't already return such an x.
(Figure: two support graphs on job and machine nodes, one acyclic and one containing a cycle.)

Generalized Load Balancing: Rounding
Rounded solution. Find an LP solution x where G(x) is a forest. Root the forest G(x) at some arbitrary machine node r.
- If job j is a leaf node, assign j to its parent machine i.
- If job j is not a leaf node, assign j to one of its children.
Lemma 4. The rounded solution only assigns jobs to authorized machines.
Pf. If job j is assigned to machine i, then x_ij > 0, and the LP solution can only assign positive value to authorized machines.

Generalized Load Balancing: Analysis
Lemma 5. If job j is a leaf node and machine i = parent(j), then x_ij = t_j.
Pf. Since j is a leaf, x_kj = 0 for every machine k ≠ parent(j); the LP constraint guarantees Σ_i x_ij = t_j.
Lemma 6. At most one non-leaf job is assigned to a machine.
Pf. The only possible non-leaf job assigned to machine i is parent(i).
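
A sketch of the rounding step just described (my own code, assuming the LP support already forms a forest, e.g. after cycle canceling): root each tree at a machine node, send each leaf job to its parent machine, and each internal job to one of its child machines.

from collections import defaultdict, deque

def round_acyclic_lp(x, machines, jobs):
    """x: dict (machine, job) -> positive LP value whose support graph G(x) is a forest.
    Returns a dict job -> machine following the leaf / non-leaf rounding rule."""
    nbrs = defaultdict(list)                    # bipartite support graph G(x)
    for (i, j), val in x.items():
        if val > 1e-9:
            nbrs[("m", i)].append(("j", j))
            nbrs[("j", j)].append(("m", i))
    parent, visited = {}, set()
    for i in machines:                          # root every tree at a machine node
        root = ("m", i)
        if root in visited:
            continue
        visited.add(root)
        queue = deque([root])
        while queue:                            # BFS to orient the tree away from the root
            u = queue.popleft()
            for v in nbrs[u]:
                if v not in visited:
                    visited.add(v)
                    parent[v] = u
                    queue.append(v)
    assignment = {}
    for j in jobs:
        node = ("j", j)
        children = [v for v in nbrs[node] if parent.get(v) == node]
        if children:
            assignment[j] = children[0][1]      # non-leaf job: one of its child machines
        else:
            assignment[j] = parent[node][1]     # leaf job: its parent machine
    return assignment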
Generalized Load Balancing: Analysis
Theorem. The rounded solution is a 2-approximation.
Pf. Let J(i) be the jobs assigned to machine i. By Lemma 6, the load L_i on machine i has two components:
- Leaf jobs:
     Σ_{j ∈ J(i), j a leaf} t_j = Σ_{j ∈ J(i), j a leaf} x_ij ≤ Σ_{j ∈ J} x_ij ≤ L ≤ L*,
  using Lemma 5 for the equality, the LP constraint for machine i, and Lemma 1 (the LP is a relaxation, so its optimal value L is at most the optimal makespan).
- parent(i): t_{parent(i)} ≤ L* by Lemma 2.
Thus the overall load is L_i ≤ 2 L*.

Generalized Load Balancing: Flow Formulation
Flow formulation of the LP:
     Σ_i x_ij = t_j     for all j ∈ J
     Σ_j x_ij ≤ L       for all i ∈ M
     x_ij ≥ 0           for all j ∈ J and i ∈ M_j
     x_ij = 0           for all j ∈ J and i ∉ M_j
Observation. Solutions to the feasible flow problem with value L are in one-to-one correspondence with LP solutions of value L.

Generalized Load Balancing: Structure of Solution
Lemma 3. Let (x, L) be a solution to the LP. Let G(x) be the graph with an edge from machine i to job j if x_ij > 0. We can find another solution (x', L) such that G(x') is acyclic.
Pf. Let C be a cycle in G(x). Augment the flow along the cycle C (flow conservation is maintained). At least one edge of C is removed (and none are added). Repeat until G(x') is acyclic.
(Figure: a support graph G(x) with a cycle C, the augmentation along C, and the resulting acyclic graph G(x').)

Conclusions
Running time. The bottleneck operation in our 2-approximation is solving one LP with mn + 1 variables.
Remark. Can solve the LP using flow techniques on a graph with m + n + 1 nodes: given L, find a feasible flow if one exists; binary search to find L*.
Extension: unrelated parallel machines. [Lenstra-Shmoys-Tardos 1990]
- Job j takes t_ij time if processed on machine i.
- 2-approximation algorithm via LP rounding.
- No 3/2-approximation algorithm unless P = NP.
11.8 Knapsack Problem

Polynomial Time Approximation Scheme
PTAS. (1 + ε)-approximation algorithm for any constant ε > 0.
- Load balancing. [Hochbaum-Shmoys 1987]
- Euclidean TSP. [Arora 1996]
Consequence. A PTAS produces an arbitrarily high quality solution, but trades off accuracy for time.
This section. PTAS for the knapsack problem via rounding and scaling.

Knapsack Problem
Knapsack problem.
- Given n objects and a "knapsack."
- Item i has value v_i > 0 and weighs w_i > 0.   (we'll assume w_i ≤ W)
- The knapsack can carry weight up to W.
- Goal: fill the knapsack so as to maximize the total value.
Ex: {3, 4} has value 40.

   Item   Value   Weight
   1      1       1
   2      6       2
   3      18      5
   4      22      6
   5      28      7
   W = 11

Knapsack is NP-Complete
KNAPSACK. Given a finite set X, nonnegative weights w_i, nonnegative values v_i, a weight limit W, and a target value V, is there a subset S ⊆ X such that
     Σ_{i ∈ S} w_i ≤ W   and   Σ_{i ∈ S} v_i ≥ V?
SUBSET-SUM. Given a finite set X, nonnegative values u_i, and an integer U, is there a subset S ⊆ X whose elements sum to exactly U?
Claim. SUBSET-SUM ≤_P KNAPSACK.
Pf. Given an instance (u_1, ..., u_n, U) of SUBSET-SUM, create a KNAPSACK instance with
     v_i = w_i = u_i   and   V = W = U,
so that Σ_{i ∈ S} u_i ≤ U and Σ_{i ∈ S} u_i ≥ U together force the chosen subset to sum to exactly U.
Knapsack Problem: Dynamic Programming I
Def. OPT(i, w) = max value subset of items 1, ..., i with weight limit w.
- Case 1: OPT does not select item i. OPT selects the best of 1, ..., i-1 using up to weight limit w.
- Case 2: OPT selects item i. New weight limit = w - w_i. OPT selects the best of 1, ..., i-1 using up to weight limit w - w_i.

   OPT(i, w) = 0                                              if i = 0
             = OPT(i-1, w)                                    if w_i > w
             = max{ OPT(i-1, w), v_i + OPT(i-1, w - w_i) }    otherwise

Running time. O(n W), where W = weight limit. Not polynomial in the input size.

Knapsack Problem: Dynamic Programming II
Def. OPT(i, v) = min weight subset of items 1, ..., i that yields value exactly v.
- Case 1: OPT does not select item i. OPT selects the best of 1, ..., i-1 that achieves exactly value v.
- Case 2: OPT selects item i. This consumes weight w_i; the new value needed = v - v_i. OPT selects the best of 1, ..., i-1 that achieves exactly value v - v_i.

   OPT(i, v) = 0                                              if v = 0
             = ∞                                              if i = 0, v > 0
             = OPT(i-1, v)                                    if v_i > v
             = min{ OPT(i-1, v), w_i + OPT(i-1, v - v_i) }    otherwise

Running time. O(n V*) = O(n² v_max), where V* = optimal value = maximum v such that OPT(n, v) ≤ W, and V* ≤ n v_max. Not polynomial in the input size.

Knapsack: FPTAS
Intuition for the approximation algorithm.
- Round all values up to lie in a smaller range.
- Run the dynamic programming algorithm on the rounded instance.
- Return the optimal items of the rounded instance.

Knapsack: FPTAS
Knapsack FPTAS. Round up all values:
     v̄_i = ⌈v_i / θ⌉ θ,     v̂_i = ⌈v_i / θ⌉,
where v_max = largest value in the original instance, ε = precision parameter, and θ = scaling factor = ε v_max / n.
(Table: the original instance with large values next to the rounded instance with small integral values v̂_i; weights and W are unchanged.)

Observation. Optimal solutions to the problems with values v̄ and v̂ are equivalent.
Intuition. v̄ is close to v, so an optimal solution using v̄ is nearly optimal; v̂ is small and integral, so the dynamic programming algorithm is fast.
Running time. O(n³ / ε). Dynamic program II has running time O(n² v̂_max), where v̂_max = ⌈v_max / θ⌉ = ⌈n / ε⌉.
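
To make the rounding-and-scaling recipe concrete, here is a minimal Python sketch (my own code, not from the slides) of dynamic program II together with the FPTAS wrapper that rounds values by θ = ε·v_max / n:

import math

def knapsack_min_weight_dp(values, weights, W):
    """Dynamic program II: OPT(i, v) = min weight of a subset of items 1..i with value exactly v.
    Returns the best subset (0-based item indices) whose weight is at most W."""
    n, V = len(values), sum(values)                  # V is an upper bound on the achievable value
    INF = float("inf")
    opt = [[INF] * (V + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        opt[i][0] = 0                                # value 0 costs no weight
    for i in range(1, n + 1):
        v_i, w_i = values[i - 1], weights[i - 1]
        for v in range(1, V + 1):
            opt[i][v] = opt[i - 1][v]                # case 1: skip item i
            if v_i <= v and w_i + opt[i - 1][v - v_i] < opt[i][v]:
                opt[i][v] = w_i + opt[i - 1][v - v_i]   # case 2: take item i
    best_v = max(v for v in range(V + 1) if opt[n][v] <= W)
    S, v = set(), best_v                             # trace back the chosen items
    for i in range(n, 0, -1):
        if opt[i][v] != opt[i - 1][v]:
            S.add(i - 1)
            v -= values[i - 1]
    return S

def knapsack_fptas(values, weights, W, eps):
    """Round values up via theta = eps * v_max / n, then solve the rounded instance exactly.
    The returned subset has value within a (1 + eps) factor of optimal."""
    n, v_max = len(values), max(values)
    theta = eps * v_max / n
    v_hat = [math.ceil(v / theta) for v in values]   # small integral values
    return knapsack_min_weight_dp(v_hat, weights, W)

# The example instance above: the optimal subset {3, 4} (0-based {2, 3}) has value 40.
values, weights, W = [1, 6, 18, 22, 28], [1, 2, 5, 6, 7], 11
print(knapsack_fptas(values, weights, W, eps=0.1))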
Knapsack: FPTAS
Knapsack FPTAS. Round up all values: v̄_i = ⌈v_i / θ⌉ θ.
Theorem. If S is the solution found by our algorithm and S* is any other feasible solution, then
     (1 + ε) Σ_{i ∈ S} v_i ≥ Σ_{i ∈ S*} v_i.
Pf. Let S* be any feasible solution satisfying the weight constraint:
     Σ_{i ∈ S*} v_i ≤ Σ_{i ∈ S*} v̄_i          (always round up)
                   ≤ Σ_{i ∈ S} v̄_i            (solve rounded instance optimally)
                   ≤ Σ_{i ∈ S} (v_i + θ)       (never round up by more than θ)
                   ≤ Σ_{i ∈ S} v_i + n θ       (|S| ≤ n)
                   ≤ (1 + ε) Σ_{i ∈ S} v_i     (n θ = ε v_max, and v_max ≤ Σ_{i ∈ S} v_i since the DP algorithm can always take the max-value item)

Extra Slides

Load Balancing on 2 Machines
Claim. Load balancing is hard even if there are only 2 machines.
Pf. NUMBER-PARTITIONING ≤_P LOAD-BALANCE.
(NUMBER-PARTITIONING is NP-complete by Exercise 8.26.)
(Figure: jobs a-g from a partition instance scheduled on Machine 1 and Machine 2; each bar's length is the length of the job, and the makespan is L.)

Center Selection: Hardness of Approximation
Theorem. Unless P = NP, there is no ρ-approximation algorithm for the metric k-center problem for any ρ < 2.
Pf. We show how we could use a (2 - ε)-approximation algorithm for k-center to solve DOMINATING-SET in poly-time.
- Let G = (V, E), k be an instance of DOMINATING-SET.
- Construct an instance G' of k-center with sites V and distances
     d(u, v) = 1 if (u, v) ∈ E,     d(u, v) = 2 if (u, v) ∉ E.
- Note that G' satisfies the triangle inequality.
- Claim: G has a dominating set of size k iff there exist k centers C* with r(C*) = 1. (see Exercise 8.29)
- Thus, if G has a dominating set of size k, a (2 - ε)-approximation algorithm on G' must find a solution C* with r(C*) = 1, since it cannot use any edge of distance 2.
