11. APPROXIMATION ALGORITHMS


Coping with NP-completeness.

Contents.
- load balancing
- center selection
- pricing method: vertex cover
- LP rounding: vertex cover
- generalized load balancing
- knapsack problem

Q. Suppose I need to solve an NP-hard problem. What should I do?
A. Sacrifice one of three desired features.
  i.   Solve arbitrary instances of the problem.
  ii.  Solve problem to optimality.
  iii. Solve problem in polynomial time.

ρ-approximation algorithm.
- Guaranteed to run in poly-time.
- Guaranteed to solve arbitrary instance of the problem.
- Guaranteed to find solution within ratio ρ of true optimum.

Challenge. Need to prove a solution's value is close to optimum, without even knowing what the optimum value is.

Lecture slides by Kevin Wayne. Copyright 2005 Pearson-Addison Wesley.

LOAD BALANCING

Input. m identical machines; n jobs; job j has processing time t_j.
- Job j must run contiguously on one machine.
- A machine can process at most one job at a time.

Def. Let J(i) be the subset of jobs assigned to machine i. The load of machine i is L_i = Σ_{j ∈ J(i)} t_j.

Def. The makespan is the maximum load on any machine: L = max_i L_i.

Load balancing. Assign each job to a machine to minimize the makespan.
LOAD BALANCING ON 2 MACHINES IS NP-HARD

Claim. Load balancing is hard even if there are only 2 machines.
Pf. PARTITION ≤_P LOAD-BALANCE. (PARTITION is NP-complete by Exercise 8.26.)

LOAD BALANCING: LIST SCHEDULING

List-scheduling algorithm.
- Consider n jobs in some fixed order.
- Assign job j to the machine whose load is smallest so far.

  ListScheduling(m, n, t_1, t_2, ..., t_n) {
      for i = 1 to m {
          L_i ← 0        // load on machine i
          J(i) ← ∅       // jobs assigned to machine i
      }
      for j = 1 to n {
          i = argmin_k L_k      // machine i has smallest load
          J(i) ← J(i) ∪ { j }   // assign job j to machine i
          L_i ← L_i + t_j       // update load of machine i
      }
      return J(1), ..., J(m)
  }

Implementation. O(n log m) using a priority queue.

LOAD BALANCING: LIST SCHEDULING ANALYSIS

Theorem. [Graham 1966] Greedy algorithm is a 2-approximation.
- First worst-case analysis of an approximation algorithm.
- Need to compare resulting solution with optimal makespan L*.

Lemma 1. The optimal makespan L* ≥ max_j t_j.
Pf. Some machine must process the most time-consuming job.

Lemma 2. The optimal makespan L* ≥ (1/m) Σ_j t_j.
Pf. The total processing time is Σ_j t_j. One of the m machines must do at least a 1/m fraction of the total work.
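The list-scheduling pseudocode above can be sketched as runnable Python using a heap as the priority queue (function and variable names are mine, not from the slides):

```python
import heapq

def list_scheduling(m, times):
    """Greedy list scheduling: assign each job (in the given order) to the
    machine with the smallest current load. O(n log m) via a heap."""
    # heap of (load, machine index); jobs[i] collects jobs assigned to machine i
    heap = [(0, i) for i in range(m)]
    heapq.heapify(heap)
    jobs = [[] for _ in range(m)]
    for j, t in enumerate(times):
        load, i = heapq.heappop(heap)        # machine with smallest load
        jobs[i].append(j)
        heapq.heappush(heap, (load + t, i))  # update its load
    makespan = max(load for load, _ in heap)
    return jobs, makespan
```

On the tight example below (m = 3: six unit jobs followed by one job of length 3), this returns makespan 2m − 1 = 5 while the optimum is m = 3.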
Theorem. Greedy algorithm is a 2-approximation.
Pf. Consider the load L_i of the bottleneck machine i. Let j be the last job scheduled on machine i.
- When job j was assigned to machine i, machine i had the smallest load. Its load before the assignment was L_i − t_j, so L_i − t_j ≤ L_k for all 1 ≤ k ≤ m.
- Sum these inequalities over all k and divide by m:
    L_i − t_j ≤ (1/m) Σ_k L_k = (1/m) Σ_k t_k ≤ L*.    [Lemma 2]
- Now, L = L_i = (L_i − t_j) + t_j ≤ L* + L* = 2 L*, using t_j ≤ L* by Lemma 1.

Q. Is our analysis tight?
A. Essentially yes.

Ex: m machines, m(m − 1) jobs of length 1, and one job of length m.
- List-scheduling makespan = 2m − 1.
- Optimal makespan = m.
- For m = 10: list-scheduling makespan = 19, optimal makespan = 10; in the list schedule, the other nine machines sit idle while the length-10 job finishes on one machine.
LOAD BALANCING: LPT RULE

Longest processing time (LPT). Sort the n jobs in descending order of processing time, then run the list-scheduling algorithm.

  LPT-ListScheduling(m, n, t_1, t_2, ..., t_n) {
      Sort jobs so that t_1 ≥ t_2 ≥ ... ≥ t_n
      for i = 1 to m {
          L_i ← 0        // load on machine i
          J(i) ← ∅       // jobs assigned to machine i
      }
      for j = 1 to n {
          i = argmin_k L_k      // machine i has smallest load
          J(i) ← J(i) ∪ { j }   // assign job j to machine i
          L_i ← L_i + t_j       // update load of machine i
      }
      return J(1), ..., J(m)
  }

Observation. If there are at most m jobs, then list scheduling is optimal.
Pf. Each job is put on its own machine.

Lemma 3. If there are more than m jobs, L* ≥ 2 t_{m+1}.
Pf. Consider the first m + 1 jobs t_1, ..., t_{m+1}. Since the t_i's are in descending order, each takes at least t_{m+1} time. There are m + 1 jobs and m machines, so by the pigeonhole principle, at least one machine gets two of them.

Theorem. LPT rule is a 3/2-approximation algorithm.
Pf. Same basic approach as for list scheduling. By the observation, we can assume the number of jobs exceeds m; then the last job j on the bottleneck machine satisfies t_j ≤ t_{m+1} ≤ ½ L* by Lemma 3, so
    L_i = (L_i − t_j) + t_j ≤ L* + ½ L* = (3/2) L*.

Q. Is our 3/2 analysis tight?
A. No.

Theorem. [Graham 1969] LPT rule is a 4/3-approximation.
Pf. More sophisticated analysis of the same algorithm.

Q. Is Graham's 4/3 analysis tight?
A. Essentially yes.

Ex: m machines, n = 2m + 1 jobs: 2 jobs each of length m, m + 1, ..., 2m − 1, and one more job of length m. Then L / L* = (4m − 1) / (3m).
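The LPT rule is the same greedy loop applied after one sort; a minimal self-contained Python sketch (names are mine):

```python
import heapq

def lpt_scheduling(m, times):
    """LPT rule: sort jobs in descending processing time, then greedily
    assign each job to the currently least-loaded machine."""
    heap = [(0, i) for i in range(m)]
    heapq.heapify(heap)
    loads = [0] * m
    for t in sorted(times, reverse=True):
        load, i = heapq.heappop(heap)    # least-loaded machine
        loads[i] = load + t
        heapq.heappush(heap, (load + t, i))
    return max(loads)                    # makespan
```

On Graham's tight example with m = 2 (jobs 3, 3, 2, 2, 2), LPT yields makespan 4m − 1 = 7 against the optimum 3m = 6.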
CENTER SELECTION

Center selection problem.
Input. Set of n sites s_1, ..., s_n and an integer k > 0.
Goal. Select a set of k centers C so that the maximum distance r(C) from a site to its nearest center is minimized.

Notation.
- dist(x, y) = distance between sites x and y.
- dist(s_i, C) = min_{c ∈ C} dist(s_i, c) = distance from s_i to the closest center.
- r(C) = max_i dist(s_i, C) = smallest covering radius.

Goal. Find a set of centers C that minimizes r(C), subject to |C| = k.

Distance function properties.
- dist(x, x) = 0                           [identity]
- dist(x, y) = dist(y, x)                  [symmetry]
- dist(x, y) ≤ dist(x, z) + dist(z, y)     [triangle inequality]

CENTER SELECTION EXAMPLE

Ex: each site is a point in the plane, a center can be any point in the plane, and dist(x, y) = Euclidean distance.
Remark: the search space can be infinite!

GREEDY ALGORITHM: A FALSE START

Greedy algorithm. Put the first center at the best possible location for a single center, then keep adding centers so as to reduce the covering radius each time by as much as possible.
Remark: arbitrarily bad!
CENTER SELECTION: GREEDY ALGORITHM

Repeatedly choose the next center to be the site farthest from any existing center.

  GreedyCenterSelection(k, n, s_1, s_2, ..., s_n) {
      C ← ∅
      repeat k times {
          Select a site s_i with maximum distance dist(s_i, C)   // site farthest from any center
          C ← C ∪ { s_i }
      }
      return C
  }

Property. Upon termination, all centers in C are pairwise at least r(C) apart.
Pf. By construction of the algorithm.

CENTER SELECTION: ANALYSIS OF GREEDY ALGORITHM

Lemma. Let C* be an optimal set of centers. Then r(C) ≤ 2 r(C*).
Pf. [by contradiction] Assume r(C*) < ½ r(C).
- For each center c_i ∈ C, consider the ball of radius ½ r(C) around it.
- Exactly one center c_i* ∈ C* lies in each such ball; let c_i be the center of C paired with c_i*.
- Consider any site s and its closest center c_i* ∈ C*. Then
    dist(s, C) ≤ dist(s, c_i) ≤ dist(s, c_i*) + dist(c_i*, c_i) ≤ 2 r(C*),
  using the triangle inequality, dist(s, c_i*) ≤ r(C*) since c_i* is s's closest center, and dist(c_i*, c_i) ≤ r(C*).
- Thus r(C) ≤ 2 r(C*) < r(C), a contradiction.

Theorem. Greedy algorithm is a 2-approximation for the center selection problem.

Remark. Greedy algorithm always places centers at sites, but is still within a factor of 2 of the best solution that is allowed to place centers anywhere (e.g., points in the plane).

Question. Is there hope of a 3/2-approximation? 4/3?

DOMINATING SET REDUCES TO CENTER SELECTION

Theorem. Unless P = NP, there is no ρ-approximation for the center selection problem for any ρ < 2.
Pf. We show how we could use a (2 − ε)-approximation algorithm for CENTER-SELECTION to solve DOMINATING-SET in poly-time.
- Let G = (V, E), k be an instance of DOMINATING-SET.
- Construct an instance G' of CENTER-SELECTION with sites V and distances
  dist(u, v) = 1 if (u, v) ∈ E; dist(u, v) = 2 if (u, v) ∉ E.
- Note that G' satisfies the triangle inequality.
- G has a dominating set of size k iff there exist k centers C* with r(C*) = 1.
- Thus, if G has a dominating set of size k, a (2 − ε)-approximation algorithm for CENTER-SELECTION would find a solution C* with r(C*) = 1, since it cannot use any edge of distance 2.
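The farthest-point greedy above can be sketched in a few lines of Python (names are mine; the first center is chosen arbitrarily, which the 2-approximation analysis permits):

```python
def greedy_center_selection(k, sites, dist):
    """Farthest-point greedy: repeatedly add the site farthest from the
    current centers. Returns (centers, covering radius r(C))."""
    centers = [sites[0]]   # arbitrary first center; any choice works for the bound
    # d[s] = distance from site s to its nearest chosen center
    d = {s: dist(s, centers[0]) for s in sites}
    while len(centers) < k:
        farthest = max(sites, key=lambda s: d[s])
        centers.append(farthest)
        for s in sites:
            d[s] = min(d[s], dist(s, farthest))
    return centers, max(d.values())
```

For sites 0, 1, 8, 9 on a line with k = 2, the greedy picks two well-separated centers and achieves covering radius 1, which here equals the optimum.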
PRICING METHOD: VERTEX COVER

Definition. Given a graph G = (V, E), a vertex cover is a set S ⊆ V such that each edge in E has at least one endpoint in S.

Weighted vertex cover. Given a graph G with vertex weights, find a vertex cover of minimum weight.

PRICING METHOD

Pricing method. Each edge must be covered by some vertex. Edge e = (i, j) pays a price p_e ≥ 0 to use both vertex i and vertex j. Set prices and find a vertex cover simultaneously.

Fairness. Edges incident to vertex i should pay at most w_i in total:
    Σ_{e = (i, j)} p_e ≤ w_i.
Call vertex i tight if this inequality holds with equality.

Fairness lemma. For any vertex cover S and any fair prices p_e:  Σ_e p_e ≤ w(S).
Pf. Each edge e is covered by at least one node in S; sum the fairness inequalities over the nodes in S:
    Σ_{e ∈ E} p_e ≤ Σ_{i ∈ S} Σ_{e = (i, j)} p_e ≤ Σ_{i ∈ S} w_i = w(S).

  WeightedVertexCover(G, w) {
      S ← ∅
      foreach e ∈ E: p_e ← 0
      while (there exists an edge (i, j) such that neither i nor j is tight) {
          Select such an edge e = (i, j)
          Increase p_e as much as possible until i or j becomes tight
      }
      S ← set of all tight nodes
      return S
  }
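A minimal Python sketch of the pricing (primal-dual) method (names are mine). A single pass over the edges suffices: after an edge is processed, at least one of its endpoints is tight, and tight vertices stay tight because slack never increases.

```python
def pricing_vertex_cover(edges, w):
    """Primal-dual pricing method for weighted vertex cover (2-approximation).
    edges: list of (i, j) pairs; w: dict of vertex weights."""
    slack = dict(w)      # remaining "fairness" slack of each vertex
    price = {}
    for (i, j) in edges:
        if slack[i] > 0 and slack[j] > 0:   # neither endpoint tight yet
            p = min(slack[i], slack[j])     # raise p_e until one becomes tight
            price[(i, j)] = p
            slack[i] -= p
            slack[j] -= p
    return {v for v in w if slack[v] == 0}  # S = set of tight vertices
```

On the path a-b-c with weights 2, 3, 2, the method tightens a and b and returns the cover {a, b} of weight 5, within twice the optimum {b} of weight 3.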
PRICING METHOD: ANALYSIS

Theorem. The pricing method is a 2-approximation for WEIGHTED-VERTEX-COVER.
Pf. The algorithm terminates since at least one new node becomes tight after each iteration of the while loop.

Let S = set of all tight nodes upon termination of the algorithm. S is a vertex cover: if some edge (i, j) were uncovered, then neither i nor j would be tight, and the while loop would not have terminated.

Let S* be an optimal vertex cover. We show w(S) ≤ 2 w(S*):
    w(S) = Σ_{i ∈ S} w_i                       [all nodes in S are tight]
         = Σ_{i ∈ S} Σ_{e = (i, j)} p_e
         ≤ Σ_{i ∈ V} Σ_{e = (i, j)} p_e        [S ⊆ V, prices ≥ 0]
         = 2 Σ_{e ∈ E} p_e                     [each edge counted twice]
         ≤ 2 w(S*).                            [fairness lemma]

LP ROUNDING: VERTEX COVER

Weighted vertex cover. Given a graph G = (V, E) with vertex weights w_i ≥ 0, find a minimum-weight subset of vertices S ⊆ V such that every edge is incident to at least one vertex in S.
WEIGHTED VERTEX COVER: ILP FORMULATION

Integer linear programming formulation. Model inclusion of each vertex i using a 0/1 variable x_i:
    x_i = 0 if vertex i is not in the vertex cover; x_i = 1 if it is.
Vertex covers are in 1-1 correspondence with 0/1 assignments: S = { i ∈ V : x_i = 1 }.

- Objective function: minimize Σ_i w_i x_i.
- For every edge (i, j), must take either vertex i or j (or both): x_i + x_j ≥ 1.

  (ILP)  min Σ_{i ∈ V} w_i x_i
         s.t.  x_i + x_j ≥ 1     for all (i, j) ∈ E
               x_i ∈ {0, 1}      for all i ∈ V

Observation. If x* is an optimal solution to (ILP), then S = { i ∈ V : x_i* = 1 } is a minimum-weight vertex cover.

INTEGER LINEAR PROGRAMMING

Given integers a_ij, b_i, and c_j, find integers x_j that satisfy:
    min c^T x  s.t.  Ax ≥ b,  x ≥ 0,  x integral
i.e.  min Σ_{j=1}^{n} c_j x_j  s.t.  Σ_{j=1}^{n} a_ij x_j ≥ b_i (1 ≤ i ≤ m),  x_j ≥ 0 and integral (1 ≤ j ≤ n).

Observation. The vertex cover formulation proves that INTEGER-PROGRAMMING is an NP-hard search problem.

LINEAR PROGRAMMING

Given integers a_ij, b_i, and c_j, find real numbers x_j that satisfy:
    min c^T x  s.t.  Ax ≥ b,  x ≥ 0
i.e.  min Σ_{j=1}^{n} c_j x_j  s.t.  Σ_{j=1}^{n} a_ij x_j ≥ b_i (1 ≤ i ≤ m),  x_j ≥ 0 (1 ≤ j ≤ n).

Linear. No x², xy, arccos(x), x(1 − x), etc.

- Simplex algorithm. [Dantzig 1947] Can solve LP in practice.
- Ellipsoid algorithm. [Khachiyan 1979] Can solve LP in poly-time.
- Interior point algorithms. [Karmarkar 1984, Renegar 1988, ...] Can solve LP in poly-time and in practice.
WEIGHTED VERTEX COVER: LP RELAXATION

Linear programming relaxation.

  (LP)  min Σ_{i ∈ V} w_i x_i
        s.t.  x_i + x_j ≥ 1    for all (i, j) ∈ E
              x_i ≥ 0          for all i ∈ V

Observation. Optimal value of (LP) is ≤ optimal value of (ILP).
Pf. The LP has fewer constraints.

Note. The LP is not equivalent to vertexex cover: on a triangle, setting every x_i = ½ is feasible with value below that of any integral cover.

Q. How can solving the LP help us find a small vertex cover?
A. Solve the LP and round the fractional values.

WEIGHTED VERTEX COVER: LP ROUNDING ALGORITHM

Lemma. If x* is an optimal solution to (LP), then S = { i ∈ V : x_i* ≥ ½ } is a vertex cover whose weight is at most twice the minimum possible weight.
Pf. [S is a vertex cover] Consider an edge (i, j) ∈ E. Since x_i* + x_j* ≥ 1, either x_i* ≥ ½ or x_j* ≥ ½ (or both), so (i, j) is covered.
Pf. [S has desired cost] Let S* be an optimal vertex cover. Then
    Σ_{i ∈ S*} w_i ≥ Σ_{i ∈ V} w_i x_i*    [LP is a relaxation]
                   ≥ Σ_{i ∈ S} w_i x_i*
                   ≥ ½ Σ_{i ∈ S} w_i.      [x_i* ≥ ½ for i ∈ S]

Theorem. The rounding algorithm is a 2-approximation algorithm.
Pf. Lemma + fact that the LP can be solved in poly-time.

WEIGHTED VERTEX COVER INAPPROXIMABILITY

Theorem. [Dinur-Safra 2004] If P ≠ NP, then there is no ρ-approximation for WEIGHTED-VERTEX-COVER for any ρ < 1.3606 (even if all weights are 1).

  On the Hardness of Approximating Minimum Vertex Cover
  Irit Dinur and Samuel Safra
  Abstract. We prove the Minimum Vertex Cover problem to be NP-hard to approximate to within a factor of 1.3606, extending on previous PCP and hardness of approximation techniques. To that end, one needs to develop a new proof framework, and borrow and extend ideas from several fields.

Open research problem. Close the gap.
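A sketch of the rounding step in pure Python. It assumes the optimal fractional LP solution has already been computed by an LP solver; here it is hardcoded for a 5-cycle with unit weights, where x_i* = ½ for every vertex is an optimal LP solution of value 2.5.

```python
def round_lp_solution(vertices, edges, x_star):
    """LP-rounding for weighted vertex cover: keep every vertex whose
    fractional value is at least 1/2."""
    S = {i for i in vertices if x_star[i] >= 0.5}
    assert all(i in S or j in S for (i, j) in edges)   # S is a vertex cover
    return S

# 5-cycle with unit weights; x* = 1/2 everywhere is an optimal LP solution,
# which we take as given rather than solving the LP here.
V = range(5)
E = [(i, (i + 1) % 5) for i in V]
w = {i: 1 for i in V}
x_star = {i: 0.5 for i in V}
S = round_lp_solution(V, E, x_star)
lp_value = sum(w[i] * x_star[i] for i in V)
assert sum(w[i] for i in S) <= 2 * lp_value   # weight <= 2 x LP optimum <= 2 x OPT
```

Note the bound is tight here: rounding keeps all 5 vertices (weight 5 = 2 × 2.5), while the best integral cover of a 5-cycle has only 3 vertices.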
GENERALIZED LOAD BALANCING

Input. Set of m machines M; set of n jobs J.
- Job j ∈ J must run contiguously on an authorized machine in M_j ⊆ M.
- Job j ∈ J has processing time t_j.
- Each machine can process at most one job at a time.

Def. Let J(i) be the subset of jobs assigned to machine i. The load of machine i is L_i = Σ_{j ∈ J(i)} t_j.

Def. The makespan is the maximum load on any machine: L = max_i L_i.

Generalized load balancing. Assign each job to an authorized machine to minimize the makespan.

GENERALIZED LOAD BALANCING: INTEGER LINEAR PROGRAM AND RELAXATION

ILP formulation. x_ij = time machine i spends processing job j.

  (IP)  min L
        s.t.  Σ_i x_ij = t_j      for all j ∈ J
              Σ_j x_ij ≤ L        for all i ∈ M
              x_ij ∈ {0, t_j}     for all j ∈ J and i ∈ M_j
              x_ij = 0            for all j ∈ J and i ∉ M_j

LP relaxation.

  (LP)  min L
        s.t.  Σ_i x_ij = t_j      for all j ∈ J
              Σ_j x_ij ≤ L        for all i ∈ M
              x_ij ≥ 0            for all j ∈ J and i ∈ M_j
              x_ij = 0            for all j ∈ J and i ∉ M_j

GENERALIZED LOAD BALANCING: LOWER BOUNDS

Lemma 1. The optimal makespan L* ≥ max_j t_j.
Pf. Some machine must process the most time-consuming job.

Lemma 2. Let L be the optimal value of the LP. Then the optimal makespan L* ≥ L.
Pf. The LP has fewer constraints than the IP formulation.
GENERALIZED LOAD BALANCING: STRUCTURE OF LP SOLUTION

Lemma 3. Let x be a solution to the LP. Let G(x) be the bipartite graph with an edge between machine i and job j if x_ij > 0. Then we can find an LP solution x where G(x) is acyclic. (Proof deferred: if the LP solver doesn't return such an x, we can transform x into another LP solution whose graph is acyclic.)

GENERALIZED LOAD BALANCING: ROUNDING

Rounded solution. Find an LP solution x where G(x) is a forest. Root the forest G(x) at some arbitrary machine node r.
- If job j is a leaf node, assign j to its parent machine i.
- If job j is not a leaf node, assign j to any one of its children.

Lemma 4. The rounded solution only assigns jobs to authorized machines.
Pf. If job j is assigned to machine i, then x_ij > 0. The LP solution can only assign positive value to authorized machines.

GENERALIZED LOAD BALANCING: ANALYSIS

Lemma 5. If job j is a leaf node and machine i = parent(j), then x_ij = t_j.
Pf. Since j is a leaf, x_i'j = 0 for all machines i' ≠ parent(j). The LP constraint guarantees Σ_i x_ij = t_j.

Lemma 6. At most one non-leaf job is assigned to a machine.
Pf. The only possible non-leaf job assigned to machine i is parent(i).

Theorem. The rounded solution is a 2-approximation.
Pf. Let J(i) be the jobs assigned to machine i. By Lemma 6, the load L_i on machine i has two components:
- leaf jobs:
    Σ_{j ∈ J(i), j leaf} t_j = Σ_{j ∈ J(i), j leaf} x_ij ≤ Σ_{j ∈ J} x_ij ≤ L ≤ L*
  [Lemma 5; then L = optimal value of the LP; Lemma 2 (LP is a relaxation)]
- the parent job: t_parent(i) ≤ L*.    [Lemma 1]
Thus, the overall load L_i ≤ 2 L*.
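The rounding procedure can be sketched as follows (names and the toy instance are mine; the sketch assumes the support graph of x is a single tree containing the chosen root machine, as Lemma 3 permits after cycle-canceling):

```python
from collections import defaultdict

def round_forest(x, times, root):
    """Round an acyclic LP solution for generalized load balancing.
    x: dict mapping (machine, job) -> positive fractional time, whose
    support graph is assumed to be one tree containing machine `root`.
    Leaf jobs go to their parent machine; non-leaf jobs to one child."""
    adj = defaultdict(set)
    for (i, j) in x:                      # bipartite nodes tagged 'M' / 'J'
        adj[('M', i)].add(('J', j))
        adj[('J', j)].add(('M', i))
    # Root the tree at the machine node and record each node's parent.
    parent, seen, stack = {}, {('M', root)}, [('M', root)]
    while stack:
        u = stack.pop()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                parent[v] = u
                stack.append(v)
    assign = {}
    for kind, j in seen:
        if kind != 'J':
            continue
        children = [v for v in adj[(kind, j)] if parent.get(v) == (kind, j)]
        if children:                      # non-leaf job: any child machine
            assign[j] = children[0][1]
        else:                             # leaf job: its parent machine
            assign[j] = parent[('J', j)][1]
    loads = defaultdict(int)
    for j, i in assign.items():
        loads[i] += times[j]
    return assign, dict(loads)
```

For example, with machines A, B, jobs 1-3, t = {1: 4, 2: 3, 3: 5} and fractional solution x = {(A,1): 4, (A,2): 1, (B,2): 2, (B,3): 5}, rooting at A assigns the split job 2 to its child machine B, and the leaf jobs 1 and 3 to their parents A and B.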
GENERALIZED LOAD BALANCING: FLOW FORMULATION

Flow formulation of the LP. Given L, find a feasible flow with
    Σ_i x_ij = t_j      for all j ∈ J
    Σ_j x_ij ≤ L        for all i ∈ M
    x_ij ≥ 0            for all j ∈ J and i ∈ M_j
    x_ij = 0            for all j ∈ J and i ∉ M_j

Observation. Solutions to the feasible flow problem with value L are in 1-to-1 correspondence with LP solutions of value L.

GENERALIZED LOAD BALANCING: STRUCTURE OF SOLUTION

Lemma 3. Let (x, L) be a solution to the LP. Let G(x) be the graph with an edge between machine i and job j if x_ij > 0. We can find another solution (x', L) such that G(x') is acyclic.
Pf. Let C be a cycle in G(x). Augment flow along the cycle C.
- Flow conservation is maintained.
- At least one edge from C is removed (and none are added).
- Repeat until G(x') is acyclic.

CONCLUSIONS

Running time. The bottleneck operation in our 2-approximation is solving one LP with m n + 1 variables.

Remark. Can solve the LP using flow techniques on a graph with m + n + 1 nodes: given L, find a feasible flow if it exists. Binary search to find L*.

Extensions: unrelated parallel machines. [Lenstra-Shmoys-Tardos 1990]
- Job j takes t_ij time if processed on machine i.
- 2-approximation algorithm via LP rounding.
- If P ≠ NP, then no ρ-approximation exists for any ρ < 3/2.

  J. K. Lenstra, D. B. Shmoys, and É. Tardos. Approximation algorithms for scheduling unrelated parallel machines. Mathematical Programming 46 (1990).
KNAPSACK PROBLEM

Polynomial-time approximation scheme (PTAS). A (1 + ε)-approximation algorithm for any constant ε > 0.
- Load balancing. [Hochbaum-Shmoys 1987]
- Euclidean TSP. [Arora 1996, Mitchell 1996]

Consequence. A PTAS produces an arbitrarily high-quality solution, but trades off accuracy for time.

This section. PTAS for the knapsack problem via rounding and scaling.

Knapsack problem.
- Given n objects and a knapsack.
- Item i has value v_i > 0 and weighs w_i > 0. (We assume w_i ≤ W for each i.)
- The knapsack has weight limit W.
- Goal: fill the knapsack so as to maximize the total value.

Ex: in the original instance below (W = 11), { 3, 4 } has value 40.

  item  value  weight
    1      1      1
    2      6      2
    3     18      5
    4     22      6
    5     28      7

KNAPSACK IS NP-COMPLETE

KNAPSACK. Given a set X, weights w_i ≥ 0, values v_i ≥ 0, a weight limit W, and a target value V, is there a subset S ⊆ X such that Σ_{i ∈ S} w_i ≤ W and Σ_{i ∈ S} v_i ≥ V?

SUBSET-SUM. Given a set X, values u_i ≥ 0, and an integer U, is there a subset S ⊆ X whose elements sum to exactly U?

Theorem. SUBSET-SUM ≤_P KNAPSACK.
Pf. Given an instance (u_1, ..., u_n, U) of SUBSET-SUM, create a KNAPSACK instance with
    v_i = w_i = u_i,  V = W = U.
Then Σ_{i ∈ S} u_i ≤ U and Σ_{i ∈ S} u_i ≥ U together force Σ_{i ∈ S} u_i = U.

KNAPSACK PROBLEM: DYNAMIC PROGRAMMING I

Def. OPT(i, w) = max value subset of items 1, ..., i with weight limit w.

Case 1. OPT does not select item i: OPT selects best of 1, ..., i − 1 using up to weight limit w.
Case 2. OPT selects item i: new weight limit = w − w_i; OPT selects best of 1, ..., i − 1 using up to weight limit w − w_i.

    OPT(i, w) = 0                                                 if i = 0
              = OPT(i − 1, w)                                     if w_i > w
              = max { OPT(i − 1, w), v_i + OPT(i − 1, w − w_i) }  otherwise

Theorem. Computes the optimal value in O(n W) time.
- Not polynomial in input size.
- Polynomial in input size if weights are small integers.
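The weight-indexed recurrence above can be sketched compactly in Python using a one-dimensional table (names are mine):

```python
def knapsack_dp(values, weights, W):
    """O(n W) dynamic program over weight limits (algorithm I).
    Returns the maximum total value achievable within weight limit W."""
    # opt[w] = best value using the items considered so far, weight limit w
    opt = [0] * (W + 1)
    for v, wt in zip(values, weights):
        # iterate weights downward so each item is used at most once
        for w in range(W, wt - 1, -1):
            opt[w] = max(opt[w], v + opt[w - wt])
    return opt[W]
```

On the example instance above (W = 11), this returns 40, the value of items {3, 4}.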
KNAPSACK PROBLEM: DYNAMIC PROGRAMMING II

Def. OPT(i, v) = minimum weight of a knapsack for which we can obtain a solution of value ≥ v using a subset of items 1, ..., i.

Note. The optimal value is the largest value v such that OPT(n, v) ≤ W.

Case 1. OPT does not select item i: OPT selects best of 1, ..., i − 1 that achieves value v.
Case 2. OPT selects item i: consumes weight w_i, need to achieve value v − v_i; OPT selects best of 1, ..., i − 1 that achieves value v − v_i.

    OPT(i, v) = 0                                                  if v ≤ 0
              = ∞                                                  if i = 0 and v > 0
              = min { OPT(i − 1, v), w_i + OPT(i − 1, v − v_i) }   otherwise

Theorem. Dynamic programming algorithm II computes the optimal value in O(n² v_max) time, where v_max is the maximum value of any item.
Pf.
- The optimal value V* ≤ n v_max.
- There is one subproblem for each item and for each value v ≤ V*.
- It takes O(1) time per subproblem.

Remark 1. Not polynomial in input size!
Remark 2. Polynomial time if values are small integers.

KNAPSACK PROBLEM: POLYNOMIAL-TIME APPROXIMATION SCHEME

Intuition for approximation algorithm.
- Round all values up to lie in a smaller range.
- Run dynamic programming algorithm II on the rounded/scaled instance.
- Return the optimal items from the rounded instance.

Round up all values:
- 0 < ε ≤ 1 = precision parameter.
- v_max = largest value in the original instance.
- θ = scaling factor = ε v_max / n.
- v̄_i = ⌈v_i / θ⌉ θ,  v̂_i = ⌈v_i / θ⌉.

Observation. Optimal solutions to the problem with values v̄ are equivalent to optimal solutions to the problem with values v̂.

Intuition. v̄ is close to v, so an optimal solution using v̄ is nearly optimal; v̂ is small and integral, so dynamic programming algorithm II is fast.
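A minimal sketch of the value-indexed recurrence (algorithm II) in Python (names are mine):

```python
def knapsack_by_value(values, weights, W):
    """O(n^2 v_max) dynamic program over achievable values (algorithm II).
    min_wt[v] = least weight needed to collect total value at least v;
    the answer is the largest v with min_wt[v] <= W."""
    V = sum(values)                    # upper bound on achievable value
    INF = float('inf')
    min_wt = [0] + [INF] * V
    for vi, wi in zip(values, weights):
        for v in range(V, 0, -1):      # downward so each item is used once
            min_wt[v] = min(min_wt[v], wi + min_wt[max(0, v - vi)])
    return max(v for v in range(V + 1) if min_wt[v] <= W)
```

It agrees with the weight-indexed DP on the example instance (optimal value 40 for W = 11); its advantage is that the table size depends on the values, which the rounding step makes small.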
KNAPSACK PROBLEM: POLYNOMIAL-TIME APPROXIMATION SCHEME

Theorem. If S is the solution found by the rounding algorithm and S* is any other feasible solution, then (1 + ε) Σ_{i ∈ S} v_i ≥ Σ_{i ∈ S*} v_i.
Pf. Let S* be any feasible solution satisfying the weight constraint:
    Σ_{i ∈ S*} v_i ≤ Σ_{i ∈ S*} v̄_i           [always round up]
                   ≤ Σ_{i ∈ S} v̄_i            [solve rounded instance optimally]
                   ≤ Σ_{i ∈ S} (v_i + θ)      [never round up by more than θ]
                   ≤ Σ_{i ∈ S} v_i + n θ      [|S| ≤ n]
                   = Σ_{i ∈ S} v_i + ε v_max  [θ = ε v_max / n]
                   ≤ (1 + ε) Σ_{i ∈ S} v_i.   [v_max ≤ Σ_{i ∈ S} v_i, by applying the chain with S* = the single item of largest value]

Theorem. For any ε > 0, the rounding algorithm computes a feasible solution whose value is within a (1 + ε) factor of the optimum in O(n³ / ε) time.
Pf.
- We have already proved the accuracy bound.
- Dynamic program II runs in O(n² v̂_max) time, where v̂_max = ⌈v_max / θ⌉ = ⌈n / ε⌉.
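Putting the pieces together, the whole FPTAS can be sketched as below (self-contained, names mine; the value-indexed DP is inlined with item recovery so the selected items' original values can be summed):

```python
import math

def knapsack_fptas(values, weights, W, eps):
    """Knapsack PTAS via rounding and scaling: round values up to multiples
    of theta = eps * v_max / n, run the value-indexed DP on the small
    integers v_hat = ceil(v_i / theta), and return the chosen items'
    ORIGINAL total value, guaranteed to be >= OPT / (1 + eps)."""
    n = len(values)
    theta = eps * max(values) / n
    v_hat = [math.ceil(v / theta) for v in values]
    # value-indexed DP (algorithm II) on v_hat, with item recovery
    V = sum(v_hat)
    INF = float('inf')
    min_wt = [0] + [INF] * V
    choice = [set() for _ in range(V + 1)]    # items achieving min_wt[v]
    for idx, (vi, wi) in enumerate(zip(v_hat, weights)):
        for v in range(V, 0, -1):             # downward: 0/1 behavior
            cand = wi + min_wt[max(0, v - vi)]
            if cand < min_wt[v]:
                min_wt[v] = cand
                choice[v] = choice[max(0, v - vi)] | {idx}
    best_v = max(v for v in range(V + 1) if min_wt[v] <= W)
    return sum(values[i] for i in choice[best_v])
```

On the example instance with ε = 0.5, the returned value is feasible and satisfies the (1 + ε) guarantee against the true optimum of 40.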
More informationCSCI 3210: Computational Game Theory Linear Programming and 2Player ZeroSum Games Ref: Wikipedia and [AGT] Ch 1
CSCI 3210: Computational Game Theory Linear Programming and 2Player ZeroSum Games Ref: Wikipedia and [AGT] Ch 1 Mohammad T. Irfan Email: mirfan@bowdoin.edu Web: www.bowdoin.edu/~mirfan Course Website:
More information2 Approximation Results for Group Steiner Tree
Advanced Approximation Algorithms (CMU 15854B, Spring 2008) Lecture 10: Group Steiner Tree problem 14 February, 2008 Lecturer: Anupam Gupta Scribe: Amitabh Basu 1 Problem Definition We will be studying
More informationScheduling Shop Scheduling. Tim Nieberg
Scheduling Shop Scheduling Tim Nieberg Shop models: General Introduction Remark: Consider non preemptive problems with regular objectives Notation Shop Problems: m machines, n jobs 1,..., n operations
More informationLPbased solution methods for the asymmetric TSP
LPbased solution methods for the asymmetric TSP Vardges Melkonian Department of Mathematics, Ohio Universtiy, Athens, Ohio 45701, USA vardges@math.ohiou.edu Abstract We consider an LP relaxation for ATSP.
More informationVertex Cover. Linear Progamming and Approximation Algorithms. Joshua Wetzel
Linear Progamming and Approximation Algorithms Joshua Wetzel Department of Computer Science Rutgers University Camden wetzeljo@camden.rutgers.edu March 24, 29 Joshua Wetzel / 52 What is Linear Programming?
More informationLecture 7: Approximation via Randomized Rounding
Lecture 7: Approximation via Randomized Rounding Often LPs return a fractional solution where the solution x, which is supposed to be in {0, } n, is in [0, ] n instead. There is a generic way of obtaining
More informationIntroduction to Operations Research
00 Introduction to Operations Research Peng Zhang June 7, 2015 School of Computer Science and Technology, Shandong University, Jinan, 250101, China. Email: algzhang@sdu.edu.cn. 1 Introduction 1 2 Overview
More informationCSC Linear Programming and Combinatorial Optimization Lecture 9: Ellipsoid Algorithm (Contd.) and Interior Point Methods
CSC2411  Linear Programming and Combinatorial Optimization Lecture 9: Ellipsoid Algorithm (Contd.) and Interior Point Methods Notes taken by Vincent Cheung March 16, 2005 Summary: The ellipsoid algorithm
More information5 TRAVELING SALESMAN PROBLEM
5 TRAVELING SALESMAN PROBLEM PROBLEM DEFINITION AND EXAMPLES TRAVELING SALESMAN PROBLEM, TSP: Find a Hamiltonian cycle of minimum length in a given complete weighted graph G=(V,E) with weights c ij =distance
More information6. DYNAMIC PROGRAMMING II
6. DYNAMIC PROGRAMMING II sequence alignment Hirschberg's algorithm BellmanFord algorithm distance vector protocols negative cycles in a digraph Lecture slides by Kevin Wayne Copyright 2005 PearsonAddison
More informationStanford University CS261: Optimization Handout 7 Luca Trevisan January 25, 2011
Stanford University CS261: Optimization Handout 7 Luca Trevisan January 25, 2011 Lecture 7 In which we show how to use linear programming to approximate the vertex cover problem. 1 Linear Programming Relaxations
More informationScheduling Lecture 1: Introduction
Scheduling Lecture 1: Introduction Loris Marchal CNRS INRIA GRAAL projectteam Laboratoire de l Informatique du Parallélisme École Normale Supérieure de Lyon, France January 28, 2009 1/ 43 Welcome Who
More informationLecture 18&19: Iterative relaxation
CMPUT 675: Topics in Algorithms and Combinatorial Optimization (Fall 2009) Lecture 18&19: Iterative relaxation Lecturer: Mohammad R. Salavatipour Date: Nov 5, 2009 Scriber: Zhong Li 1 Spanning Tree Polytope
More informationParallel machine scheduling under a grade of service provision
Computers & Operations Research 31 (200) 2055 2061 www.elsevier.com/locate/dsw Parallel machine scheduling under a grade of service provision HarkChin Hwang a;1, Soo Y. Chang b;, Kangbok Lee b a Department
More informationEnergy constraint flow for wireless sensor networks. Hans L. Bodlaender Richard Tan Thomas van Dijk Jan van Leeuwen
1 Energy constraint flow for wireless sensor networks Hans L. Bodlaender Richard Tan Thomas van Dijk Jan van Leeuwen 2 Overview 3 Wireless sensor networks Wireles sensor  smart dust  motes Microchip
More informationDynamic Programming and InclusionExclusion.
2MMD30: Graphs and Algorithms Lecture 6 by Jesper Nederlof, 26/02/206 Dynamic Programming and InclusionExclusion. Dynamic programming is a very powerful techniques that you have probably seen already
More informationLecture 12: Steiner Tree Problems
Advanced Approximation Algorithms (CMU 15854B, Spring 008) Lecture 1: Steiner Tree Problems February 1, 008 Lecturer: Anupam Gupta Scribe: Michael Dinitz 1 Finishing up Group Steiner Tree Hardness Last
More informationCS 2336 Discrete Mathematics
CS 2336 Discrete Mathematics Lecture 16 Trees: Introduction 1 What is a Tree? Rooted Trees Properties of Trees Decision Trees Outline 2 What is a Tree? A path or a circuit is simple if it does not contain
More informationInteger Programming Theory
Integer Programming Theory Laura Galli November 11, 2014 In the following we assume all functions are linear, hence we often drop the term linear. In discrete optimization, we seek to find a solution x
More informationLecture More common geometric view of the simplex method
18.409 The Behavior of Algorithms in Practice 9 Apr. 2002 Lecture 14 Lecturer: Dan Spielman Scribe: Brian Sutton In this lecture, we study the worstcase complexity of the simplex method. We conclude by
More informationLecture 3: Linear Programming Relaxations and Rounding
Lecture 3: Linear Programming Relaxations and Rounding 1 Approximation Algorithms and Linear Relaxations For the time being, suppose we have a minimization problem. Many times, the problem at hand can
More informationChapter 6: Random rounding of semidefinite programs
Chapter 6: Random rounding of semidefinite programs Section 6.1: Semidefinite Programming Section 6.: Finding large cuts Extra (not in book): The max sat problem. Section 6.5: Coloring 3colorabale graphs.
More informationStanford University CS261: Optimization Handout 5 Luca Trevisan January 18, 2011
Stanford University CS261: Optimization Handout 5 Luca Trevisan January 18, 2011 Lecture 5 In which we introduce linear programming. 1 Linear Programming A linear program is an optimization problem in
More information5.1 Bipartite Matching
CS787: Advanced Algorithms Lecture 5: Applications of Network Flow In the last lecture, we looked at the problem of finding the maximum flow in a graph, and how it can be efficiently solved using the FordFulkerson
More informationGraph Balancing: A Special Case of Scheduling Unrelated Parallel Machines
Graph Balancing: A Special Case of Scheduling Unrelated Parallel Machines Tomáš Ebenlendr Marek Krčál Jiří Sgall Abstract We design a 1.75approximation algorithm for a special case of scheduling parallel
More informationLecture 11: Integer Linear Programs
Lecture 11: Integer Linear Programs 1 Integer Linear Programs Consider the following version of Woody s problem: x 1 = chairs per day, x 2 = tables per day max z = 35x 1 + 60x 2 profits 14x 1 + 26x 2 190
More information3.3 Easy ILP problems and totally unimodular matrices
3.3 Easy ILP problems and totally unimodular matrices Consider a generic ILP problem expressed in standard form where A Z m n with n m, and b Z m. min{c t x : Ax = b, x Z n +} (1) P(b) = {x R n : Ax =
More informationDecision Aid Methodologies In Transportation Lecture 3: Integer programming
Decision Aid Methodologies In Transportation Lecture 3: Integer programming Chen Jiang Hang Transportation and Mobility Laboratory May 13, 2013 Chen Jiang Hang (Transportation and Mobility Decision Laboratory)
More informationTopic: Bin Packing and Euclidean TSP Date: 2/6/2007
CS880: Approximations Algorithms Scribe: Tom Watson Lecturer: Shuchi Chawla Topic: Bin Packing and Euclidean TSP Date: 2/6/2007 In the previous lecture, we saw how dynamic programming could be employed
More informationChapter 4. Greedy Algorithms
Chapter 4 Greedy Algorithms Slides by Kevin Wayne. Copyright 2005 PearsonAddison Wesley. All rights reserved. Acknowledgement: This lecture slide is revised and authorized from Prof. Kevin Wayne s Class
More informationMATH3902 Operations Research II: Integer Programming Topics
MATH3902 Operations Research II Integer Programming p.0 MATH3902 Operations Research II: Integer Programming Topics 1.1 Introduction.................................................................. 1
More informationCOP 4531 Complexity & Analysis of Data Structures & Algorithms
COP 4531 Complexity & Analysis of Data Structures & Algorithms Greedy Algorithms Thanks to the text authors and K Wayne and Piyush Kumar who contributed to these slides What are greedy algorithms? Similar
More informationINTEGER LINEAR PROGRAMMING  INTRODUCTION
INTEGER LINEAR PROGRAMMING  INTRODUCTION Integer Linear Programming a 11 x 1 + a 12 x 2 + + a 1n x n apple b 1 Integrality Constraint max c 1 x 1 +c 2 x 2 + + c n x n s.t. a 11 x 1 +a 12 x 2 + + a 1n
More informationChapter 2 Integer Programming. Paragraph 1 Total Unimodularity
Chapter 2 Integer Programming Paragraph 1 Total Unimodularity What we did so far We studied algorithms for solving linear programs Simplex (primal, dual, and primaldual) Ellipsoid Method (proving LP is
More informationChapter 11. Greedy Algorithms Problems and Terminology Problem Types. CS 473: Fundamental Algorithms, Spring 2011 March 1, 2011
Chapter 11 Greedy Algorithms CS 473: Fundamental Algorithms, Spring 2011 March 1, 2011 11.1 Problems and Terminology 11.2 Problem Types 11.2.0.1 Problem Types (A) Decision Problem: Is the input a YES or
More informationLPT rule: Whenever a machine becomes free for assignment, assign that job whose. processing time is the largest among those jobs not yet assigned.
LPT rule Whenever a machine becomes free for assignment, assign that job whose processing time is the largest among those jobs not yet assigned. Example m1 m2 m3 J3 Ji J1 J2 J3 J4 J5 J6 6 5 3 3 2 1 3 5
More informationCMPSCI611: Approximating MAXCUT Lecture 20
CMPSCI611: Approximating MAXCUT Lecture 20 For the next two lectures we ll be seeing examples of approximation algorithms for interesting NPhard problems. Today we consider MAXCUT, which we proved to
More informationInteger Programming: Large Scale Methods Part I (Chapter 16)
Integer Programming: Large Scale Methods Part I (Chapter 16) University of Chicago Booth School of Business Kipp Martin November 9, 2016 1 / 55 Outline List of Files Key Concepts Column Generation/Decomposition
More informationLinear Programming. Widget Factory Example. Linear Programming: Standard Form. Widget Factory Example: Continued.
Linear Programming Widget Factory Example Learning Goals. Introduce Linear Programming Problems. Widget Example, Graphical Solution. Basic Theory:, Vertices, Existence of Solutions. Equivalent formulations.
More informationprinceton univ. F 13 cos 521: Advanced Algorithm Design Lecture 6: Provable Approximation via Linear Programming Lecturer: Sanjeev Arora
princeton univ. F 13 cos 521: Advanced Algorithm Design Lecture 6: Provable Approximation via Linear Programming Lecturer: Sanjeev Arora Scribe: One of the running themes in this course is the notion of
More information6.854 Advanced Algorithms. Scribes: Jay Kumar Sundararajan. Duality
Lecture 1: 10/13/2004 6.854 Advanced Algorithms Scribes: Jay Kumar Sundararajan Lecturer: David Karger Duality This lecture covers weak and strong duality, and also explains the rules for finding the dual
More informationQuick and dirty approach to solving optimization problems Not necessarily ends up with an optimal solution
Greedy Algorithms Greedy Technique What is it? How does it work? Quick and dirty approach to solving optimization problems Not necessarily ends up with an optimal solution Problems explored Coin changing
More informationMinimum Spanning Tree
Minimum Spanning Tree Minimum spanning tree. Given a connected graph G = (V, E) with real valued edge weights c e, an MST is a subset of the edges T E such that T is a spanning tree whose sum of edge
More informationLecture # 24: Clustering
COMPSCI 634: Geometric Algorithms April 10, 2014 Lecturer: Pankaj Agarwal Lecture # 24: Clustering Scribe: Nat Kell 1 Overview In this lecture, we briefly survey problems in geometric clustering. We define
More informationDuality. Uri Feige. November 17, 2011
Duality Uri Feige November 17, 2011 1 Linear programming duality 1.1 The diet problem revisited Recall the diet problem from Lecture 1. There are n foods, m nutrients, and a person (the buyer) is required
More informationFairness in Routing and Load Balancing
Fairness in Routing and Load Balancing Jon Kleinberg Yuval Rabani Éva Tardos Abstract We consider the issue of network routing subject to explicit fairness conditions. The optimization of fairness criteria
More informationScheduling Parallel Machine Scheduling. Tim Nieberg
Scheduling Parallel Machine Scheduling Tim Nieberg Problem P C max : m machines n jobs with processing times p 1,..., p n Problem P C max : m machines n jobs with processing times p 1,..., p { n 1 if job
More informationSCHEDULING WITH NONRENEWABLE RESOURCES
SCHEDULING WITH NONRENEWABLE RESOURCES by Péter Györgyi Supervisor: Tamás Kis Operations Research Department Eötvös Loránd University Budapest, 2013. Contents Acknowledgement 2 Notations 3 1 Introduction
More informationLecture 9 and 10: Iterative rounding II
Approximation Algorithms and Hardness of Approximation March 19, 013 Lecture 9 and 10: Iterative rounding II Lecturer: Ola Svensson Scribes: Olivier Blanvillain In the last lecture we saw a framework for
More informationAlgorithms for Integer Programming
Algorithms for Integer Programming Laura Galli December 18, 2014 Unlike linear programming problems, integer programming problems are very difficult to solve. In fact, no efficient general algorithm is
More informationApproximation Schemes for Scheduling on Parallel Machines with GoS Levels
The Eighth International Symposium on Operations Research and Its Applications (ISORA 09) Zhangjiajie, China, September 20 22, 2009 Copyright 2009 ORSC & APORC, pp. 53 60 Approximation Schemes for Scheduling
More information8.1 Makespan Scheduling
600.469 / 600.669 Approximation Algorithms Lecturer: Michael Dinitz Topic: Dynamic Programing: MinMakespan and Bin Packing Date: 2/19/15 Scribe: Gabriel Kaptchuk 8.1 Makespan Scheduling Consider an instance
More informationN PHard and N PComplete Problems
N PHard and N PComplete Problems Basic concepts Solvability of algorithms There are algorithms for which there is no known solution, for example, Turing s Halting Problem Decision problem Given an arbitrary
More informationLecture 9: Algorithm complexity and Dynamic programming
Linear and Combinatorial Optimization Fredrik Kahl Matematikcentrum Lecture 9: Algorithm complexity and Dynamic programming Algorithm complexity. The simplex method not polynomial. LP polynomial. Interior
More informationIntroduction to Optimization
Introduction to Optimization Komei Fukuda Institute for Operations Research, and Institute of Theoretical Computer Science ETH Zurich, Switzerland fukuda@ifor.math.ethz.ch Autumn Semester 2011 Contents
More information13.1 SemiDefinite Programming
CS787: Advanced Algorithms Lecture 13: Semidefinite programming In this lecture we will study an extension of linear programming, namely semidefinite programming. Semidefinite programs are much more
More information56:272 Integer Programming & Network Flows Final Exam  Fall 99
56:272 Integer Programming & Network Flows Final Exam  Fall 99 Write your name on the first page, and initial the other pages. Answer all the multiplechoice questions and X of the remaining questions.
More informationDominating Set. Chapter 7
Chapter 7 Dominating Set In this chapter we present another randomized algorithm that demonstrates the power of randomization to break symmetries. We study the problem of finding a small dominating set
More informationLecture 5. Combinatorial Algorithm for maximum cardinality matching (König s Theorem and Hall s Theorem)
Topics in Theoretical Computer Science March 16, 2015 Lecture 5 Lecturer: Ola Svensson Scribes: Mateusz Golebiewski, Maciej Duleba These lecture notes are partly based on http://theory.epfl.ch/courses/topicstcs/lecture10.pdf
More informationLINEAR PROGRAMMING. Vazirani Chapter 12 Introduction to LPDuality. George Alexandridis(NTUA)
LINEAR PROGRAMMING Vazirani Chapter 12 Introduction to LPDuality George Alexandridis(NTUA) gealexan@mail.ntua.gr 1 LINEAR PROGRAMMING What is it? A tool for optimal allocation of scarce resources, among
More information