LINEAR PROGRAMMING. Vazirani Chapter 12: Introduction to LP-Duality. George Alexandridis (NTUA)
1 LINEAR PROGRAMMING. Vazirani Chapter 12: Introduction to LP-Duality. George Alexandridis (NTUA)
2 LINEAR PROGRAMMING
What is it? A tool for the optimal allocation of scarce resources among a number of competing activities. It is a powerful and general problem-solving method that encompasses: shortest path, network flow, MST, matching, Ax = b, 2-person zero-sum games.
Definition: Linear Programming is the problem of optimizing (i.e. minimizing or maximizing) a linear function subject to linear inequality constraints. The function being optimized is called the objective function.
Example:
minimize 7x₁ + x₂ + 5x₃ (the objective function)
subject to (the constraints)
x₁ - x₂ + 3x₃ ≥ 10
5x₁ + 2x₂ - x₃ ≥ 6
x₁, x₂, x₃ ≥ 0
Any setting of the variables in this linear program that satisfies all the constraints is said to be a feasible solution.
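As a quick check of this example, the program can be handed to an off-the-shelf LP solver. The sketch below uses SciPy's linprog (the solver choice is an assumption of this note, not something from the slides), with the constraint data exactly as written above:

```python
from scipy.optimize import linprog

# minimize 7x1 + x2 + 5x3
c = [7, 1, 5]

# ">=" constraints are negated into the "<=" form that linprog expects:
#    x1 -  x2 + 3x3 >= 10   ->   -x1 +  x2 - 3x3 <= -10
#   5x1 + 2x2 -  x3 >=  6   ->  -5x1 - 2x2 +  x3 <=  -6
A_ub = [[-1,  1, -3],
        [-5, -2,  1]]
b_ub = [-10, -6]

# x1, x2, x3 >= 0 is linprog's default bound on every variable
res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")
print(res.x, res.fun)  # for these data: roughly (1.75, 0, 2.75) with value 26.0
```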
3 History
1939, Leonid Vitaliyevich Kantorovich, Soviet mathematician and economist. He came up with the technique of Linear Programming after having been assigned the task of optimizing production in a plywood industry. He was awarded the Nobel Prize in Economics in 1975 for contributions to the theory of the optimum allocation of resources.
1947, George Bernard Dantzig, American mathematician. He developed the simplex algorithm for solving linear programming problems. One of the first LPs to be solved by hand using the simplex method was the Berlin Airlift linear program.
1947, John von Neumann. He developed the theory of Linear Programming Duality.
1979, Leonid Genrikhovich Khachiyan, Armenian mathematician. He developed the Ellipsoid Algorithm, which was the first to solve LP in polynomial time.
1984, Narendra K. Karmarkar, Indian mathematician. He developed the Karmarkar Algorithm, which also solves LP in polynomial time.
4 Applications
Telecommunications: network design, Internet routing.
Computer science: compiler register allocation, data mining.
Electrical engineering: VLSI design, optimal clocking.
Energy: blending petroleum products.
Economics: equilibrium theory, two-person zero-sum games.
Logistics: supply-chain management.
5 Example: Profit Maximization (Dasgupta-Papadimitriou-Vazirani: Algorithms, Chapter 7)
A boutique chocolatier produces two types of chocolate: Pyramide ($1 profit apiece) and Pyramide Nuit ($6 profit apiece). How much of each should it produce to maximize profit, given that:
the daily demand for these chocolates is limited to at most 200 boxes of Pyramide and 300 boxes of Pyramide Nuit;
the current workforce can produce a total of at most 400 boxes of chocolate.
Let's assume that the current daily production is x₁ boxes of Pyramide and x₂ boxes of Pyramide Nuit.
6 Profit Maximization as a Linear Program
Objective function:
max x₁ + 6x₂
Constraints:
x₁ ≤ 200
x₂ ≤ 300
x₁ + x₂ ≤ 400
x₁, x₂ ≥ 0
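The same kind of solver sketch applies here (again assuming SciPy; linprog minimizes, so the objective is negated):

```python
from scipy.optimize import linprog

# maximize x1 + 6x2  ==  minimize -(x1 + 6x2)
c = [-1, -6]

A_ub = [[1, 0],   # x1       <= 200
        [0, 1],   #      x2  <= 300
        [1, 1]]   # x1 + x2  <= 400
b_ub = [200, 300, 400]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")  # x1, x2 >= 0 by default
print(res.x)     # expected optimum: [100. 300.]
print(-res.fun)  # expected profit: 1900.0
```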
7 The Simplex Algorithm
Let v be any vertex of the feasible region.
While there is a neighbor v' of v with a better objective value: set v = v'.
This local test implies global optimality.
8 Introduction to Duality
Simplex outputs (x₁, x₂) = (100, 300) as the optimum solution, with an objective value of 1900. Can this answer be verified?
Multiply the second inequality by 6 and add it to the first: x₁ + 6x₂ ≤ 2000. The objective function cannot have a value of more than 2000!
An even better bound is obtained if the second inequality is multiplied by 5 and then added to the third: x₁ + 6x₂ ≤ 1900.
Therefore, the multipliers (0, 5, 1) constitute a certificate of optimality for our solution!
Do such certificates exist for other LPs as well? If they do, is there any systematic way of obtaining them?
9 The Dual
Let's define multipliers y₁, y₂, y₃ for the three constraints of our problem. They must be non-negative in order to maintain the direction of the inequalities. After the multiplication and addition steps, the following bound is obtained:
(y₁ + y₃)x₁ + (y₂ + y₃)x₂ ≤ 200y₁ + 300y₂ + 400y₃
The left-hand side of the inequality must cover the objective function term by term. Therefore:
x₁ + 6x₂ ≤ 200y₁ + 300y₂ + 400y₃, provided that y₁, y₂, y₃ ≥ 0, y₁ + y₃ ≥ 1 and y₂ + y₃ ≥ 6.
Finding the set of multipliers that gives the best upper bound on the original LP is equivalent to solving a new LP:
min 200y₁ + 300y₂ + 400y₃
subject to
y₁ + y₃ ≥ 1
y₂ + y₃ ≥ 6
y₁, y₂, y₃ ≥ 0
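Solving this dual with the same kind of sketch (SciPy assumed, as before) recovers the certificate (0, 5, 1) found by hand on the previous slide:

```python
from scipy.optimize import linprog

# minimize 200 y1 + 300 y2 + 400 y3
c = [200, 300, 400]

# y1 + y3 >= 1  and  y2 + y3 >= 6, rewritten as "<=" constraints
A_ub = [[-1,  0, -1],
        [ 0, -1, -1]]
b_ub = [-1, -6]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")  # y >= 0 by default
print(res.x)    # expected: [0. 5. 1.]
print(res.fun)  # expected: 1900.0
```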
10 Primal Dual
Any feasible value of this dual LP is an upper bound on every feasible value of the original primal LP (and, conversely, any feasible primal value is a lower bound on every feasible dual value).
If a pair of primal and dual feasible values are equal, then they are both optimal.
In our example both (x₁, x₂) = (100, 300) and (y₁, y₂, y₃) = (0, 5, 1) have value 1900 and therefore certify each other's optimality.
11 Formal Definition of the Primal and Dual Problem
Here, the primal problem is a minimization problem.
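The slide refers to the standard form of the pair; for reference, with the primal as a minimization problem, it can be written as follows (this rendering is the textbook-standard one and is supplied by this note):

\[
\begin{aligned}
\textbf{Primal:}\quad & \min \sum_{j=1}^{n} c_j x_j & \qquad \textbf{Dual:}\quad & \max \sum_{i=1}^{m} b_i y_i \\
\text{s.t.}\quad & \sum_{j=1}^{n} a_{ij} x_j \ge b_i, \;\; i = 1,\dots,m & \text{s.t.}\quad & \sum_{i=1}^{m} a_{ij} y_i \le c_j, \;\; j = 1,\dots,n \\
& x_j \ge 0, \;\; j = 1,\dots,n & & y_i \ge 0, \;\; i = 1,\dots,m
\end{aligned}
\]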
12 LP-Duality Theorem
The LP-duality theorem is a min-max relation.
Corollary 1: LP is well characterized.
Corollary 2: LP is in NP ∩ co-NP. Feasible solutions to the primal (dual) provide Yes (No) certificates for the question: "Is the optimum value less than or equal to α?"
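For reference, in the notation of the primal-dual pair above, the theorem states: the primal program has a finite optimum if and only if its dual has a finite optimum; moreover, if \(x^* = (x_1^*, \dots, x_n^*)\) and \(y^* = (y_1^*, \dots, y_m^*)\) are optimal solutions of the primal and dual respectively, then

\[
\sum_{j=1}^{n} c_j x_j^* \;=\; \sum_{i=1}^{m} b_i y_i^* .
\]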
13 Weak Duality Theorem
LP-Duality Theorem: the basis of several exact algorithms.
Weak Duality Theorem: the basis of approximation algorithms.
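The weak duality theorem itself, stated here for completeness in the notation above: if x is a feasible solution of the primal and y a feasible solution of the dual, then

\[
\sum_{j=1}^{n} c_j x_j \;\ge\; \sum_{i=1}^{m} b_i y_i .
\]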
14 Complementary Slackness Conditions
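The conditions referred to here are the standard ones for the primal-dual pair above (spelled out by this note for reference). Let x and y be primal- and dual-feasible solutions. Then x and y are both optimal if and only if all of the following hold:

\[
\begin{aligned}
&\text{Primal conditions: for each } 1 \le j \le n, \quad x_j = 0 \;\text{ or }\; \sum_{i=1}^{m} a_{ij} y_i = c_j ; \\
&\text{Dual conditions: for each } 1 \le i \le m, \quad y_i = 0 \;\text{ or }\; \sum_{j=1}^{n} a_{ij} x_j = b_i .
\end{aligned}
\]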
15 MAX-FLOW as LP: Primal Problem
Introduce a fictitious arc from sink t to source s, carrying flow f_ts: the flow is now converted to a circulation, and the objective is to maximize the flow on this arc.
The second set of inequalities implies flow conservation at each node: if this inequality holds at every node, then in fact it must hold with equality at every node.
This tricky notation is used so as to have an LP in standard form.
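A sketch of the formulation just described, for a network G = (V, E) with capacities c_ij and the fictitious arc (t, s) added (the exact notation is an assumption of this note, but it matches the description above):

\[
\begin{aligned}
\max\quad & f_{ts} \\
\text{s.t.}\quad & f_{ij} \le c_{ij}, && (i,j) \in E \\
& \sum_{j:\,(j,i) \in E \cup \{(t,s)\}} f_{ji} \;-\; \sum_{j:\,(i,j) \in E \cup \{(t,s)\}} f_{ij} \;\le\; 0, && i \in V \\
& f_{ij} \ge 0 && \text{for every arc, including } (t,s)
\end{aligned}
\]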
16 MAX-FLOW as LP: Dual Problem
The variables d_ij and p_i correspond to the two types of inequalities in the primal:
d_ij: distance labels on arcs,
p_i: potentials on nodes.
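With that correspondence, the dual takes the following shape (again a reference rendering consistent with the next slides; E is the original arc set):

\[
\begin{aligned}
\min\quad & \sum_{(i,j) \in E} c_{ij}\, d_{ij} \\
\text{s.t.}\quad & d_{ij} - p_i + p_j \ge 0, && (i,j) \in E \\
& p_s - p_t \ge 1 \\
& d_{ij} \ge 0, \;\; p_i \ge 0, && (i,j) \in E, \; i \in V
\end{aligned}
\]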
17 MAX-FLOW as LP: Transformation of the Dual to an Integer Program
The integer program seeks 0/1 solutions.
If (d*, p*) is an optimal solution to the IP, then the second inequality is satisfied only if p*_s = 1 and p*_t = 0, thus defining an s-t cut: S is the set of nodes with potential 1 and V \ S the set of nodes with potential 0.
If the endpoints of an arc (i, j) belong to different sets (i ∈ S, j ∈ V \ S), then by the first inequality d*_ij = 1.
The distance label of each remaining arc may be set to 0 or 1 without violating the constraints; it will be set to 0 in order to minimize the objective function.
Therefore the objective function will be equal to the capacity of the cut, thus defining a minimum s-t cut.
18 MAX-FLOW as LP: LP-Relaxation
The previous IP is a formulation of the minimum s-t cut problem!
The dual program may be seen as a relaxation of the IP:
the integrality constraint is dropped: 1 ≥ d_ij ≥ 0 for every arc of the graph and 1 ≥ p_i ≥ 0 for every node of the graph;
the upper-bound constraints on the variables are redundant: their omission cannot give a better solution.
The dual program is said to be an LP-relaxation of the IP.
Any feasible solution to the dual problem is considered to be a fractional s-t cut: indeed, the distance labels on any s-t path add up to at least 1.
The capacity of the fractional s-t cut is then defined to be the dual objective function value achieved by it.
19 The Max-Flow Min-Cut Theorem as a Special Case of LP-Duality
A polyhedron defines the set of feasible solutions to the dual program.
A feasible solution is said to be an extreme point solution if it is a vertex of the polyhedron, i.e. it cannot be expressed as a convex combination of two other feasible solutions.
LP theory: there is always an extreme point solution that is optimal.
It can further be proven that each extreme point solution of this polyhedron is integral, with each coordinate being 0 or 1. Hence the dual problem always has an integral optimal solution.
By the LP-duality theorem, the maximum flow in G must equal the capacity of a minimum fractional s-t cut. Since the latter equals the capacity of a minimum s-t cut, we get the max-flow min-cut theorem.
20 Another Consequence of the LP-Duality Theorem: Farkas' Lemma
Ax = b, x ≥ 0 has a solution iff there is no vector y satisfying Aᵀy ≤ 0 and bᵀy > 0.
Primal: min 0ᵀx subject to Ax = b, x ≥ 0.
Dual: max bᵀy subject to Aᵀy ≤ 0 (y unrestricted, since the primal constraints are equalities).
If the primal is feasible, it has an optimal point of cost 0.
y = 0 is feasible in the dual, and therefore the dual is either unbounded or has an optimal point.
First direction: if Ax = b has a non-negative solution, then the primal is feasible and its optimal cost is 0. Therefore the dual's optimal cost is also 0, and there can be no vector y satisfying the dual's constraint Aᵀy ≤ 0 together with bᵀy > 0.
Second direction: if there is no vector y satisfying Aᵀy ≤ 0 and bᵀy > 0, then the dual is not unbounded; since it is feasible, it has an optimal point. As a consequence the primal also has an optimal point and is therefore feasible.
21 LP-Duality in 2-Person Zero-Sum Games
Payoff matrix (entries are payoffs to Row; Column's pure strategies are m and t, Row's are e and s):
      m    t
 e    3   -1
 s   -2    1
This is a non-symmetric game. In this scenario, if Row announces a mixed strategy x = (x₁, x₂), there is always a pure strategy that is optimal for Column:
m, with payoff 3x₁ - 2x₂ to Row, or
t, with payoff -x₁ + x₂ to Row.
Any mixed strategy y for Column is a weighted average of these pure strategies and therefore cannot be better (for Column) than the better of the two.
If Row is forced to announce her strategy first, she wants to defensively pick an x that maximizes her payoff against Column's best response:
pick (x₁, x₂) that maximizes min{3x₁ - 2x₂, -x₁ + x₂} (which is the payoff resulting from Column's best response to x).
22 LP of the 2-Person Zero-Sum Game
If Row announces her strategy first, she needs to pick x₁ and x₂ to maximize z = min{3x₁ - 2x₂, -x₁ + x₂}:
max z
z ≤ 3x₁ - 2x₂
z ≤ -x₁ + x₂
In LP form:
max z
-3x₁ + 2x₂ + z ≤ 0
x₁ - x₂ + z ≤ 0
x₁ + x₂ = 1
x₁, x₂ ≥ 0
If Column announces his strategy first, he needs to pick y₁ and y₂ to minimize w = max{3y₁ - y₂, -2y₁ + y₂}:
min w
w ≥ 3y₁ - y₂
w ≥ -2y₁ + y₂
In LP form:
min w
-3y₁ + y₂ + w ≥ 0
2y₁ - y₂ + w ≥ 0
y₁ + y₂ = 1
y₁, y₂ ≥ 0
These two LPs are dual to each other and have the same optimum V.
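Row's LP can be solved with the same solver sketch as before (SciPy is assumed; variables are ordered x₁, x₂, z, and z is free in sign). Solving Column's LP the same way should return the same optimum V, illustrating the duality.

```python
from scipy.optimize import linprog

# maximize z  ==  minimize -z; variable order: x1, x2, z
c = [0, 0, -1]

A_ub = [[-3,  2, 1],   # -3x1 + 2x2 + z <= 0
        [ 1, -1, 1]]   #   x1 -  x2 + z <= 0
b_ub = [0, 0]

A_eq = [[1, 1, 0]]     # x1 + x2 = 1
b_eq = [1]

bounds = [(0, None), (0, None), (None, None)]  # x1, x2 >= 0; z free

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=bounds, method="highs")
print(res.x)  # expected: x = (3/7, 4/7), z = 1/7 (the value of the game)
```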
23 The Min-Max Theorem of Game Theory
By solving an LP, the maximizer (Row) can determine a strategy for herself that guarantees an expected outcome of at least V, no matter what Column does.
The minimizer (Column), by solving the dual LP, can guarantee an expected outcome of at most V, no matter what Row does.
This is the uniquely defined optimal play, and V is the value of the game. It wasn't a priori certain that such a play existed.
This example can be generalized to arbitrary games: it proves the existence of mixed strategies that are optimal for both players, and both players achieve the same value V. This is the min-max theorem of game theory.
24 Linear Programming in Approximation Algorithms
Many combinatorial optimization problems can be stated as integer programs. The linear relaxation of such a program then provides a lower bound on the cost of the optimal solution.
For NP-hard problems, the polyhedron of the relaxation does not, in general, have only integer vertices, so a near-optimal solution is sought instead.
Two basic techniques for obtaining approximation algorithms using LP: LP-rounding and the primal-dual schema.
LP-duality theory has also been used in combinatorially obtained approximation algorithms, via the method of dual fitting (Chapter 13).
25 LP-Rounding and the Primal-Dual Schema
LP-rounding: solve the LP and convert the fractional solution obtained into an integral solution, ensuring in the process that the cost does not increase much.
Primal-dual schema: use the LP-relaxation (which plays the role of the primal) and its dual in the design of the algorithm. An integral solution to the primal and a feasible solution to the dual are constructed iteratively; any feasible solution to the dual provides a lower bound on OPT.
These techniques are illustrated on SET COVER in Chapters 14 and 15 of the book.
26 The Integrality Gap of an LP-Relaxation
Given an LP-relaxation of a minimization problem Π, let OPT_f(I) be the cost of an optimal fractional solution to instance I. The integrality gap is then defined to be the supremum, over all instances I, of the ratio of the optimal integral solution to the optimal fractional solution. (For a maximization problem, it would be the infimum of this ratio.)
If the cost of the solution found by the algorithm is compared directly with the cost of an optimal fractional solution, then the best approximation factor achievable in this way is the integrality gap of the relaxation.
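In symbols, writing OPT(I) for the cost of an optimal integral solution to instance I, the definition for a minimization problem reads:

\[
\text{integrality gap} \;=\; \sup_{I} \frac{\mathrm{OPT}(I)}{\mathrm{OPT}_f(I)} .
\]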
27 Running Times of the Two Techniques
LP-rounding needs to find an optimal solution to the linear programming relaxation. LP is in P, so this can be done in polynomial time if the relaxation has polynomially many constraints.
Even if the relaxation has exponentially many constraints, it may still be solved in polynomial time if a polynomial-time separation oracle can be constructed: a polynomial-time algorithm that, given a point in Rⁿ (n: the number of variables in the relaxation), either confirms that it is a feasible solution or outputs a violated constraint.
The primal-dual schema may exploit the special combinatorial structure of individual problems and is able to yield algorithms with good running times. Once a basic problem is solved, variants and generalizations of the basic problem can be solved too.