
ANDREW TULLOCH

MATHEMATICS OF OPERATIONS RESEARCH

TRINITY COLLEGE, THE UNIVERSITY OF CAMBRIDGE


Contents

1 Generalities
2 Constrained Optimization
  2.1 Lagrangian Multipliers
  2.2 Lagrange Dual
  2.3 Supporting Hyperplanes
3 Linear Programming
  3.1 Convexity and Strong Duality
  3.2 Linear Programs
  3.3 Linear Program Duality
  3.4 Complementary Slackness
4 Simplex Method
  4.1 Basic Solutions
  4.2 Extreme Points and Optimal Solutions
  4.3 The Simplex Tableau
  4.4 The Simplex Method in Tableau Form
5 Advanced Simplex Procedures
  5.1 The Two-Phase Simplex Method
  5.2 Gomory's Cutting Plane Method
6 Complexity of Problems and Algorithms
  6.1 Asymptotic Complexity
7 The Complexity of Linear Programming
  7.1 A Lower Bound for the Simplex Method
  7.2 The Idea for a New Method
8 Graphs and Flows
  8.1 Introduction
  8.2 Minimal Cost Flows
9 Transportation and Assignment Problems
10 Non-Cooperative Games
11 Strategic Equilibrium
12 Bibliography

1 Generalities


2 Constrained Optimization

Minimize f(x) subject to h(x) = b, x ∈ X, where:

- f : R^n → R is the objective function;
- x ∈ R^n is the vector of decision variables;
- h : R^n → R^m and b ∈ R^m give the functional constraint;
- X ⊆ R^n gives the regional constraint.

Definition 2.1. The feasible set is X(b) = {x ∈ X : h(x) = b}.

An inequality constraint g(x) ≤ b can be written as g(x) + z = b, where z ∈ R^m is called a slack variable, with regional constraint z ≥ 0.

2.1 Lagrangian Multipliers

Definition 2.2. Define the Lagrangian of a problem as

    L(x, λ) = f(x) − λ^T (h(x) − b)    (2.1)

where λ ∈ R^m is a vector of Lagrange multipliers.

Theorem 2.3 (Lagrangian Sufficiency Theorem). Let x ∈ X and λ ∈ R^m be such that h(x) = b and

    L(x, λ) = inf_{x′ ∈ X} L(x′, λ).    (2.2)

Then x is optimal for P.

Proof.

    min_{x′ ∈ X(b)} f(x′) = min_{x′ ∈ X(b)} [f(x′) − λ^T (h(x′) − b)]    (2.3)
                          ≥ min_{x′ ∈ X} [f(x′) − λ^T (h(x′) − b)]    (2.4)
                          = f(x) − λ^T (h(x) − b)    (2.5)
                          = f(x).    (2.6)

2.2 Lagrange Dual

Definition 2.4. Let

    φ(b) = inf_{x ∈ X(b)} f(x).    (2.7)

Define the Lagrange dual function g : R^m → R by

    g(λ) = inf_{x ∈ X} L(x, λ).    (2.8)

Then, for all λ ∈ R^m,

    inf_{x ∈ X(b)} f(x) = inf_{x ∈ X(b)} L(x, λ) ≥ inf_{x ∈ X} L(x, λ) = g(λ).    (2.9)

That is, g(λ) is a lower bound on the optimal value of the primal. This motivates the dual problem: maximize g(λ) subject to λ ∈ Y, where Y = {λ ∈ R^m : g(λ) > −∞}.

Theorem 2.5 (Weak Duality). From (2.9), the optimal value of the primal is always at least the optimal value of the dual.

2.3 Supporting Hyperplanes

Fix b ∈ R^m and consider φ as a function of c ∈ R^m. Further consider the hyperplane given by α : R^m → R with

    α(c) = β − λ^T (b − c).    (2.10)

Now, try to find φ(b) as follows.

(i) For each λ, find

    β_λ = max{β : α(c) ≤ φ(c) for all c ∈ R^m}.    (2.11)

(ii) Choose λ to maximize β_λ.

Definition 2.6. Call α : R^m → R a supporting hyperplane to φ at b if

    α(c) = φ(b) − λ^T (b − c)    (2.12)

and

    φ(c) ≥ φ(b) − λ^T (b − c)    (2.13)

for all c ∈ R^m.

Theorem 2.7. The following are equivalent:

(i) there exists a (non-vertical) supporting hyperplane to φ at b;
(ii) XXXXXXXXXXXXXXXXXXXXXXXXXX
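The Lagrangian machinery of this chapter can be sanity-checked numerically. Below is a minimal sketch on a toy problem chosen for illustration (it is not from the notes): minimize f(x) = x₁² + x₂² subject to x₁ + x₂ = 1 with X = R². Minimizing L(x, λ) over x gives x_i = λ/2, so g(λ) = −λ²/2 + λ in closed form.

```python
# Toy problem (illustrative): minimize x1^2 + x2^2 s.t. x1 + x2 = 1, X = R^2.

def f(x):
    return x[0] ** 2 + x[1] ** 2

def h(x):
    return x[0] + x[1]

def lagrangian(x, lam, b=1.0):
    return f(x) - lam * (h(x) - b)

def g(lam):
    # Closed-form dual function for this problem: inf_x L(x, lam).
    return -lam ** 2 / 2.0 + lam

# Lagrangian sufficiency (Theorem 2.3): for lam = 1, x* = (1/2, 1/2)
# minimizes L(., lam) (checked on a grid of candidates) and is feasible,
# hence it is optimal.
lam, x_star = 1.0, (0.5, 0.5)
grid = [(i / 10.0, j / 10.0) for i in range(-20, 21) for j in range(-20, 21)]
assert all(lagrangian(x_star, lam) <= lagrangian(x, lam) + 1e-12 for x in grid)
assert abs(h(x_star) - 1.0) < 1e-12

# Weak duality (2.9): g lower-bounds the primal optimum f(x*) = 1/2 for
# every lam, and the bound is tight at lam = 1 (zero duality gap here).
lams = [k / 10.0 for k in range(-50, 51)]
assert all(g(lm) <= f(x_star) + 1e-12 for lm in lams)
assert g(1.0) == f(x_star) == 0.5
```

Here the supporting-hyperplane picture of Section 2.3 is exact: the dual optimum λ = 1 gives a tight lower bound, so strong duality holds for this instance.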


3 Linear Programming

3.1 Convexity and Strong Duality

Definition 3.1. Let S ⊆ R^n. S is convex if for all δ ∈ [0, 1], x, y ∈ S implies that δx + (1 − δ)y ∈ S. A function f : S → R is convex if for all x, y ∈ S and δ ∈ [0, 1],

    δ f(x) + (1 − δ) f(y) ≥ f(δx + (1 − δ)y).

Visually, the region above the graph of the function (its epigraph) is a convex set.

Definition 3.2. x ∈ S is an interior point of S if there exists ε > 0 such that {y : ‖y − x‖₂ ≤ ε} ⊆ S. x ∈ S is an extreme point of S if for all y, z ∈ S and δ ∈ (0, 1), x = δy + (1 − δ)z implies x = y = z.

Theorem 3.3 (Supporting Hyperplane). Suppose that φ is convex and b lies in the interior of the set of points where φ is finite. Then there is a (non-vertical) supporting hyperplane to φ at b.

Theorem 3.4. Let X(b) = {x ∈ X : h(x) ≤ b} and φ(b) = inf_{x ∈ X(b)} f(x). Then φ is convex if X, f, and h are convex.

Proof. Let b₁, b₂ ∈ R^m be such that φ(b₁), φ(b₂) are defined. Let δ ∈ [0, 1] and b = δb₁ + (1 − δ)b₂. Consider x₁ ∈ X(b₁), x₂ ∈ X(b₂) and let x = δx₁ + (1 − δ)x₂.

By convexity of X, x ∈ X. By convexity of h,

    h(x) = h(δx₁ + (1 − δ)x₂) ≤ δh(x₁) + (1 − δ)h(x₂) ≤ δb₁ + (1 − δ)b₂ = b.

Thus x ∈ X(b), and by convexity of f,

    φ(b) ≤ f(x) = f(δx₁ + (1 − δ)x₂) ≤ δ f(x₁) + (1 − δ) f(x₂).

Taking the infimum over x₁ ∈ X(b₁) and x₂ ∈ X(b₂) gives φ(b) ≤ δφ(b₁) + (1 − δ)φ(b₂).

3.2 Linear Programs

Definition 3.5. The general form of a linear program is

    min{c^T x : Ax ≥ b, x ≥ 0}.    (3.1)

The standard form of a linear program is

    min{c^T x : Ax = b, x ≥ 0}.    (3.2)

3.3 Linear Program Duality

Introduce slack variables to turn problem P into the form

    min{c^T x : Ax − z = b, x, z ≥ 0}.    (3.3)

We have X = {(x, z) : x, z ≥ 0} ⊆ R^{m+n}. The Lagrangian is

    L((x, z), λ) = c^T x − λ^T (Ax − z − b)    (3.4)
                 = (c^T − λ^T A)x + λ^T z + λ^T b    (3.5)

and has a finite minimum if and only if

    λ ∈ Y = {λ : c^T − λ^T A ≥ 0, λ ≥ 0}.    (3.6)

For a fixed λ ∈ Y, the minimum of L is attained when (c^T − λ^T A)x = 0 and λ^T z = 0, and thus

    g(λ) = inf_{(x,z) ∈ X} L((x, z), λ) = λ^T b.    (3.7)

We obtain the dual problem

    max{λ^T b : A^T λ ≤ c, λ ≥ 0},    (3.8)

and it can be shown (exercise) that the dual of the dual of a linear program is the original program.

3.4 Complementary Slackness

Theorem 3.6. Let x and λ be feasible for P and its dual. Then x and λ are optimal if and only if

    (c^T − λ^T A)x = 0    (3.9)
    λ^T (Ax − b) = 0.    (3.10)

Proof. If x and λ are optimal, then

    c^T x = λ^T b    (by strong duality)    (3.11)
          = inf_{x′ ∈ X} (c^T x′ − λ^T (Ax′ − b))
          ≤ c^T x − λ^T (Ax − b)
          ≤ c^T x.    (by primal and dual feasibility)    (3.12)

Then the inequalities must be equalities. Thus

    λ^T b = c^T x − λ^T (Ax − b) = (c^T − λ^T A)x + λ^T b,    (3.13)

so (c^T − λ^T A)x = 0, and

    c^T x − λ^T (Ax − b) = c^T x,    (3.14)

so λ^T (Ax − b) = 0.

Conversely, if (c^T − λ^T A)x = 0 and λ^T (Ax − b) = 0, then

    c^T x = c^T x − λ^T (Ax − b) = (c^T − λ^T A)x + λ^T b = λ^T b,    (3.15)

and so by weak duality, x and λ are optimal.
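Theorem 3.6 can be checked on a small instance. The following sketch uses an illustrative LP (not from the notes): primal min 2x₁ + 3x₂ subject to x₁ + 2x₂ ≥ 4, x ≥ 0, with dual max 4λ subject to λ ≤ 2, 2λ ≤ 3, λ ≥ 0; the optimal pair is x* = (0, 2), λ* = 3/2.

```python
# Illustrative LP: c^T x with A x >= b, x >= 0, and its dual.
c = [2.0, 3.0]
A = [[1.0, 2.0]]   # a single functional constraint: A x >= b
b = [4.0]

x_star = [0.0, 2.0]   # claimed primal optimum, value 6
lam_star = [1.5]      # claimed dual optimum, value 6

# Feasibility of both solutions.
assert all(xi >= 0 for xi in x_star)
assert A[0][0] * x_star[0] + A[0][1] * x_star[1] >= b[0]
assert lam_star[0] >= 0
assert lam_star[0] * A[0][0] <= c[0] and lam_star[0] * A[0][1] <= c[1]

# Complementary slackness: (c^T - lam^T A)_i x_i = 0 for every i, and
# lam_j (A x - b)_j = 0 for every j.
for i in range(2):
    reduced = c[i] - lam_star[0] * A[0][i]
    assert reduced * x_star[i] == 0
slack = A[0][0] * x_star[0] + A[0][1] * x_star[1] - b[0]
assert lam_star[0] * slack == 0

# Objective values agree, so by Theorem 3.6 the pair is optimal.
assert c[0] * x_star[0] + c[1] * x_star[1] == lam_star[0] * b[0] == 6.0
```

Note how each condition pairs a zero: the binding primal constraint has a positive multiplier, and the variable with positive value has zero reduced cost.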

4 Simplex Method

4.1 Basic Solutions

Maximize c^T x subject to Ax = b, x ≥ 0, with A ∈ R^{m×n}. Call a solution x ∈ R^n of Ax = b basic if it has at most m non-zero entries, that is, if there exists B ⊆ {1, ..., n} with |B| = m such that x_i = 0 if i ∉ B. A basic solution x with x ≥ 0 is called a basic feasible solution (bfs).

4.2 Extreme Points and Optimal Solutions

We make the following assumptions:

(i) the rows of A are linearly independent;
(ii) every set of m columns of A is linearly independent;
(iii) every basic solution is non-degenerate, that is, it has exactly m non-zero entries.

Theorem 4.1. x is a bfs of Ax = b if and only if it is an extreme point of the set X(b) = {x : Ax = b, x ≥ 0}.

Theorem 4.2. If the problem has a finite optimum (that is, it is feasible and bounded), then it has an optimal solution that is a bfs.
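Theorem 4.2 can be illustrated by brute force: enumerate all bases, keep the feasible ones, and compare objective values. The problem below is made up for illustration: maximize x₁ + x₂ subject to x₁ + 2x₂ + x₃ = 4, 2x₁ + x₂ + x₄ = 4, x ≥ 0 (so m = 2, n = 4, with x₃, x₄ slack variables); its optimum 8/3 is attained at the bfs (4/3, 4/3, 0, 0).

```python
from itertools import combinations

A = [[1.0, 2.0, 1.0, 0.0],
     [2.0, 1.0, 0.0, 1.0]]
b = [4.0, 4.0]
c = [1.0, 1.0, 0.0, 0.0]

def solve_2x2(M, rhs):
    """Solve a 2x2 linear system by Cramer's rule; None if singular."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    if abs(det) < 1e-12:
        return None
    return [(rhs[0] * M[1][1] - M[0][1] * rhs[1]) / det,
            (M[0][0] * rhs[1] - rhs[0] * M[1][0]) / det]

# Enumerate all choices of m = 2 basic columns and keep the feasible ones.
basic_feasible = []
for B in combinations(range(4), 2):
    M = [[A[i][j] for j in B] for i in range(2)]
    xB = solve_2x2(M, b)
    if xB is not None and all(v >= -1e-9 for v in xB):
        x = [0.0] * 4
        x[B[0]], x[B[1]] = xB
        basic_feasible.append(x)

values = [sum(ci * xi for ci, xi in zip(c, x)) for x in basic_feasible]
best = max(values)

# The optimum 8/3 is attained at the bfs x = (4/3, 4/3, 0, 0).
assert abs(best - 8.0 / 3.0) < 1e-9
```

By Theorem 4.1 these basic feasible solutions are exactly the extreme points of the feasible polytope, so checking them suffices.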

4.3 The Simplex Tableau

Let A ∈ R^{m×n} and b ∈ R^m. Let B be a basis (in the bfs sense) and x ∈ R^n be such that Ax = b. Then

    A_B x_B + A_N x_N = b,    (4.1)

where A_B ∈ R^{m×m} and A_N ∈ R^{m×(n−m)} respectively consist of the columns of A indexed by B and those not indexed by B. Moreover, if x is a basic solution, then there is a basis B such that x_N = 0 and A_B x_B = b, and if x is a basic feasible solution, there is a basis B such that x_N = 0, A_B x_B = b, and x_B ≥ 0.

For every x with Ax = b and every basis B, we have

    x_B = A_B^{-1} (b − A_N x_N),    (4.2)

as we assume that A_B has full rank. Thus

    f(x) = c^T x = c_B^T x_B + c_N^T x_N    (4.3)
         = c_B^T A_B^{-1} (b − A_N x_N) + c_N^T x_N    (4.4)
         = c_B^T A_B^{-1} b + (c_N^T − c_B^T A_B^{-1} A_N) x_N.    (4.5)

Assume we can guarantee that A_B^{-1} b ≥ 0. Then x* with x*_B = A_B^{-1} b and x*_N = 0 is a bfs, with

    f(x*) = c_B^T A_B^{-1} b.    (4.6)

Assume that we are maximizing c^T x. There are two different cases:

(i) If c_N^T − c_B^T A_B^{-1} A_N ≤ 0, then f(x) ≤ c_B^T A_B^{-1} b for every feasible x, so x* is optimal.
(ii) If (c_N^T − c_B^T A_B^{-1} A_N)_i > 0, then we can increase the objective value by increasing the corresponding entry (x_N)_i.

4.4 The Simplex Method in Tableau Form
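The reduced-cost computation of Section 4.3 can be sketched numerically on illustrative data (not from the notes): maximize x₁ + x₂ subject to x₁ + 2x₂ + x₃ = 4, 2x₁ + x₂ + x₄ = 4, x ≥ 0, with basis B = {1, 2}.

```python
# Data for the illustrative problem, split by basis B = {1, 2}.
c_B = [1.0, 1.0]
c_N = [0.0, 0.0]
A_B = [[1.0, 2.0], [2.0, 1.0]]
A_N = [[1.0, 0.0], [0.0, 1.0]]   # the slack columns
b = [4.0, 4.0]

# Invert the 2x2 basis matrix A_B.
det = A_B[0][0] * A_B[1][1] - A_B[0][1] * A_B[1][0]
A_B_inv = [[A_B[1][1] / det, -A_B[0][1] / det],
           [-A_B[1][0] / det, A_B[0][0] / det]]

# x_B = A_B^{-1} b must be >= 0 for B to define a bfs; here x_B = (4/3, 4/3).
x_B = [sum(A_B_inv[i][j] * b[j] for j in range(2)) for i in range(2)]
assert all(v >= 0 for v in x_B)

# y^T = c_B^T A_B^{-1}, then reduced costs c_N^T - y^T A_N (as in (4.5)).
y = [sum(c_B[i] * A_B_inv[i][j] for i in range(2)) for j in range(2)]
reduced = [c_N[j] - sum(y[i] * A_N[i][j] for i in range(2)) for j in range(2)]

# All reduced costs are <= 0, so (maximizing) this bfs is optimal, with
# objective value c_B^T A_B^{-1} b = 8/3 -- case (i) above.
assert all(r <= 1e-12 for r in reduced)
value = sum(c_B[i] * x_B[i] for i in range(2))
assert abs(value - 8.0 / 3.0) < 1e-9
```

In tableau form these same quantities (x_B, the reduced costs, and the objective value) are what each pivot updates.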

5 Advanced Simplex Procedures

5.1 The Two-Phase Simplex Method

Finding an initial bfs is easy if the constraints have the form Ax ≤ b with b ≥ 0: adding slack variables gives Ax + z = b, and

    (x, z) = (0, b)    (5.1)

is a bfs.

5.2 Gomory's Cutting Plane Method

This method is used in integer programming (IP): a linear program in which, in addition, some of the variables are required to be integral. Assume that for a given integer program we have found an optimal fractional solution x* with basis B, and let a_{ij} = (A_B^{-1} A_j)_i and a_{i0} = (A_B^{-1} b)_i be the entries of the final tableau. If x* is not integral, then for some row i, a_{i0} is not integral. For every feasible solution x,

    x_i + Σ_{j ∈ N} ⌊a_{ij}⌋ x_j ≤ x_i + Σ_{j ∈ N} a_{ij} x_j = a_{i0}.    (5.2)

If x is integral, then the left-hand side is integral as well, and the inequality must still hold if the right-hand side is rounded down. Thus

    x_i + Σ_{j ∈ N} ⌊a_{ij}⌋ x_j ≤ ⌊a_{i0}⌋.    (5.3)

We can then add this constraint to the problem and solve the augmented program. One can show that this procedure converges in a finite number of steps.
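The cut (5.3) is mechanical to derive from a tableau row. A small sketch with made-up numbers: suppose the final tableau row for basic variable x_i reads x_i + 1.5·x₃ + 0.25·x₄ = 3.75, with x₃, x₄ non-basic.

```python
from math import floor

# Tableau row data (illustrative): a_ij for j in N, and a_i0.
a = {3: 1.5, 4: 0.25}
a_i0 = 3.75

# Round every coefficient and the right-hand side down, as in (5.3).
cut_coeffs = {j: floor(v) for j, v in a.items()}
cut_rhs = floor(a_i0)
assert cut_coeffs == {3: 1, 4: 0} and cut_rhs == 3

# Every integral feasible point satisfies the cut x_i + x3 <= 3, but the
# current fractional solution (x_i = 3.75, x3 = x4 = 0) violates it, so
# the cut separates it from the integer hull.
x_i, x3, x4 = 3.75, 0.0, 0.0
assert x_i + cut_coeffs[3] * x3 + cut_coeffs[4] * x4 > cut_rhs
```

Adding the cut and re-solving, as the text describes, repeats until the LP optimum is integral.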


6 Complexity of Problems and Algorithms

6.1 Asymptotic Complexity

We measure complexity as a function of input size. The input of a linear programming problem, c ∈ R^n, A ∈ R^{m×n}, b ∈ R^m, can be represented in (n + mn + m)k bits if we represent each number using k bits.

For two functions f : R → R and g : R → R, write

    f(n) = O(g(n))    (6.1)

if there exist c, n₀ such that for all n ≥ n₀,

    f(n) ≤ c · g(n),    (6.2)

and similarly for Ω and Θ (where Θ means both Ω and O).
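The definition is a statement about witnesses (c, n₀), which can be made concrete. A small sketch with example functions chosen for illustration: f(n) = 3n² + 5n is O(n²), witnessed by c = 4 and n₀ = 5.

```python
def f(n):
    return 3 * n ** 2 + 5 * n

def g(n):
    return n ** 2

# f(n) <= 4 g(n) holds exactly when 5n <= n^2, i.e. n >= 5.
c, n0 = 4, 5
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))

# The choice of witness matters: c = 3 never works, since
# f(n) - 3 g(n) = 5n > 0 for all n >= 1.
assert all(f(n) > 3 * g(n) for n in range(1, 100))
```

The same template, with the inequality reversed (or two-sided), checks Ω (and Θ) claims.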


7 The Complexity of Linear Programming

7.1 A Lower Bound for the Simplex Method

Theorem 7.1. There exists an LP of size O(n²), a pivoting rule, and an initial bfs such that the simplex method requires 2^n − 1 iterations.

Proof. Consider the unit cube in R^n, given by the constraints 0 ≤ x_i ≤ 1 for i = 1, ..., n. Define a spanning path inductively as follows. In dimension 1, go from x₁ = 0 to x₁ = 1. In dimension k, set x_k = 0 and follow the path for dimensions 1, ..., k − 1; then set x_k = 1 and follow the path for dimensions 1, ..., k − 1 backwards. Along this path the objective x_n increases only once. Instead, consider the perturbed unit cube given by the constraints ε ≤ x₁ ≤ 1 and εx_{i−1} ≤ x_i ≤ 1 − εx_{i−1} for i = 2, ..., n, with ε ∈ (0, 1/2).

7.2 The Idea for a New Method

    min{c^T x : Ax = b, x ≥ 0}    (7.1)
    max{b^T λ : A^T λ ≤ c}    (7.2)

By strong duality, each of these problems has a bounded optimal

solution if and only if the following set of constraints is feasible:

    c^T x = b^T λ    (7.3)
    Ax = b    (7.4)
    x ≥ 0    (7.5)
    A^T λ ≤ c    (7.6)

It is thus enough to decide, for a given A ∈ R^{m×n} and b ∈ R^m, whether {x ∈ R^n : Ax ≥ b} = ∅.

Definition 7.2. A symmetric matrix D ∈ R^{n×n} is called positive definite if x^T D x > 0 for every x ∈ R^n with x ≠ 0. Equivalently, all eigenvalues of the matrix are strictly positive.

Definition 7.3. A set E ⊆ R^n given by

    E = E(z, D) = {x ∈ R^n : (x − z)^T D^{-1} (x − z) ≤ 1}    (7.7)

for a positive definite symmetric D ∈ R^{n×n} and z ∈ R^n is called an ellipsoid with centre z.

Let P = {x ∈ R^n : Ax ≥ b} for some A ∈ R^{m×n} and b ∈ R^m. To decide whether P = ∅, we generate a sequence {E_t} of ellipsoids E_t with centres x_t. If x_t ∈ P, then P is non-empty and the method stops. If x_t ∉ P, then one of the constraints is violated, so there exists a row j of A such that a_j^T x_t < b_j. Therefore P is contained in the half-space {x ∈ R^n : a_j^T x ≥ a_j^T x_t}, and in particular in the intersection of this half-space with E_t, which we will call a half-ellipsoid.

Theorem 7.4. Let E = E(z, D) be an ellipsoid in R^n and a ∈ R^n with a ≠ 0. Consider the half-space H = {x ∈ R^n : a^T x ≥ a^T z}, and let

    z′ = z + (1/(n + 1)) · Da / √(a^T D a),    (7.8)
    D′ = (n²/(n² − 1)) · (D − (2/(n + 1)) · D a a^T D / (a^T D a)).    (7.9)

Then D′ is symmetric and positive definite, and therefore E′ = E(z′, D′) is an ellipsoid. Moreover, E ∩ H ⊆ E′ and Vol(E′) < e^{−1/(2(n+1))} Vol(E).
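One update step of Theorem 7.4 can be checked directly in R², using the convention E(z, D) = {x : (x − z)^T D⁻¹ (x − z) ≤ 1} assumed to match the update formulas, and the simplest possible data (chosen for illustration): E the unit disk, so z = 0 and D = I, cut by the half-space H = {x : x₁ ≥ 0}, i.e. a = (1, 0).

```python
from math import sqrt, exp

n = 2
z = [0.0, 0.0]
D = [[1.0, 0.0], [0.0, 1.0]]
a = [1.0, 0.0]

# Here a^T D a = 1 and D a = a, so (7.8) and (7.9) simplify.
aDa = 1.0
z_new = [z[i] + (1.0 / (n + 1)) * a[i] / sqrt(aDa) for i in range(2)]
factor = n ** 2 / (n ** 2 - 1.0)
D_new = [[factor * (D[i][j] - (2.0 / (n + 1)) * a[i] * a[j] / aDa)
          for j in range(2)] for i in range(2)]
assert z_new == [1.0 / 3.0, 0.0]
assert abs(D_new[0][0] - 4.0 / 9.0) < 1e-12
assert abs(D_new[1][1] - 4.0 / 3.0) < 1e-12

# In 2D, Vol(E(z, D)) = pi * sqrt(det D); the guaranteed shrink factor holds.
det_old = D[0][0] * D[1][1] - D[0][1] * D[1][0]
det_new = D_new[0][0] * D_new[1][1] - D_new[0][1] * D_new[1][0]
assert sqrt(det_new / det_old) < exp(-1.0 / (2 * (n + 1)))

# The half-ellipsoid E n H is contained in E': check on a grid of points
# of the unit disk with x1 >= 0 (D_new is diagonal, so its inverse is too).
inv_new = [9.0 / 4.0, 3.0 / 4.0]
for i in range(0, 21):
    for j in range(-20, 21):
        x = [i / 20.0, j / 20.0]
        if x[0] ** 2 + x[1] ** 2 <= 1.0:
            q = sum(inv_new[k] * (x[k] - z_new[k]) ** 2 for k in range(2))
            assert q <= 1.0 + 1e-9
```

Iterating this step is exactly the ellipsoid method sketched above: each iteration either finds a point of P or shrinks the volume by a constant factor.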

8 Graphs and Flows

8.1 Introduction

Consider a directed graph (network) G = (V, E), with V the set of vertices and E ⊆ V × V a set of edges. The graph is undirected if E is symmetric.

8.2 Minimal Cost Flows

Fill in this stuff from lectures
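A minimal sketch of the definition in Section 8.1, with an example graph chosen for illustration: a directed graph as a vertex set and an edge set E ⊆ V × V, which is undirected precisely when E is symmetric.

```python
V = {1, 2, 3}
E = {(1, 2), (2, 3), (3, 1)}   # a directed 3-cycle

def is_symmetric(edges):
    """E is symmetric iff (v, u) is an edge whenever (u, v) is."""
    return all((v, u) in edges for (u, v) in edges)

# The directed cycle is not symmetric, so G = (V, E) is directed.
assert not is_symmetric(E)

# Adding every reverse edge yields the corresponding undirected graph.
E_undirected = E | {(v, u) for (u, v) in E}
assert is_symmetric(E_undirected)
```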


9 Transportation and Assignment Problems


10 Non-Cooperative Games

Theorem 10.1 (von Neumann, 1928). Let P ∈ R^{m×n}. Then

    max_{x ∈ X} min_{y ∈ Y} p(x, y) = min_{y ∈ Y} max_{x ∈ X} p(x, y).    (10.1)
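The minimax identity (10.1) can be checked numerically on the standard 2×2 matching-pennies matrix (an illustrative example, not from the notes), with X and Y the players' sets of mixed strategies, approximated here by a grid.

```python
P = [[1.0, -1.0], [-1.0, 1.0]]   # matching pennies

def payoff(p, q):
    """Expected payoff p(x, y) for mixed strategies x = (p, 1-p), y = (q, 1-q)."""
    x, y = [p, 1 - p], [q, 1 - q]
    return sum(x[i] * P[i][j] * y[j] for i in range(2) for j in range(2))

grid = [k / 100.0 for k in range(101)]

# max over x of (min over y), and min over y of (max over x).
maximin = max(min(payoff(p, q) for q in grid) for p in grid)
minimax = min(max(payoff(p, q) for p in grid) for q in grid)

# Both equal the value of the game, 0, attained at p = q = 1/2; with pure
# strategies only, the two sides would differ (-1 vs. 1).
assert abs(maximin) < 1e-9
assert abs(minimax) < 1e-9
```

Mixing is essential here: over pure strategies alone, max min = −1 < 1 = min max, and only the passage to mixed strategies restores equality.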


11 Strategic Equilibrium

Definition 11.1. x ∈ X is a best response to y ∈ Y if p(x, y) = max_{x′ ∈ X} p(x′, y). (x, y) ∈ X × Y is a Nash equilibrium if x is a best response to y and y is a best response to x.

Theorem 11.2. (x, y) ∈ X × Y is an equilibrium of the matrix game P if and only if

    min_{y′ ∈ Y} p(x, y′) = max_{x′ ∈ X} min_{y′ ∈ Y} p(x′, y′)    (11.1)

and

    max_{x′ ∈ X} p(x′, y) = min_{y′ ∈ Y} max_{x′ ∈ X} p(x′, y′).    (11.2)

Theorem 11.3. Let (x, y), (x′, y′) ∈ X × Y be equilibria of the matrix game with payoff matrix P. Then p(x, y) = p(x′, y′), and (x, y′) and (x′, y) are equilibria as well.

Theorem 11.4 (Nash, 1951). Every bimatrix game has an equilibrium.
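Definition 11.1's best-response conditions can be checked by enumeration for pure-strategy equilibria. The sketch below uses the standard prisoner's dilemma bimatrix (an illustrative example, not from the notes), with strategy 0 = cooperate, 1 = defect; A holds the row player's payoffs and B the column player's.

```python
A = [[3, 0], [5, 1]]   # row player's payoffs
B = [[3, 5], [0, 1]]   # column player's payoffs

def is_equilibrium(i, j):
    """Check both best-response conditions of Definition 11.1 at (i, j)."""
    row_best = all(A[i][j] >= A[k][j] for k in range(2))
    col_best = all(B[i][j] >= B[i][k] for k in range(2))
    return row_best and col_best

# (defect, defect) is the unique pure Nash equilibrium.
equilibria = [(i, j) for i in range(2) for j in range(2) if is_equilibrium(i, j)]
assert equilibria == [(1, 1)]
```

Theorem 11.4 guarantees an equilibrium for every bimatrix game, though in general it may require mixed strategies rather than a pure pair as here.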


12 Bibliography


More information

Math 5593 Linear Programming Midterm Exam

Math 5593 Linear Programming Midterm Exam Math 5593 Linear Programming Midterm Exam University of Colorado Denver, Fall 2011 Solutions (October 13, 2011) Problem 1 (Mathematical Problem Solving) [10 points] List the five major stages when solving

More information

A Brief Introduction to Duality Theory Paul J. Atzberger

A Brief Introduction to Duality Theory Paul J. Atzberger A Brief Introduction to Duality Theory Paul J. Atzberger Comments should be sent to: atzberg@math.ucsb.edu Abstract These notes give an introduction to duality theory in the context of linear and positive

More information

MATH3902 Operations Research II: Integer Programming Topics

MATH3902 Operations Research II: Integer Programming Topics MATH3902 Operations Research II Integer Programming p.0 MATH3902 Operations Research II: Integer Programming Topics 1.1 Introduction.................................................................. 1

More information

Constrained Optimization Using Lagrange Multipliers CEE 201L. Uncertainty, Design, and Optimization

Constrained Optimization Using Lagrange Multipliers CEE 201L. Uncertainty, Design, and Optimization Constrained Optimization Using Lagrange Multipliers CEE L. Uncertainty, Design, and Optimization Department of Civil and Environmental Engineering Duke University Henri P. Gavin and Jeffrey T. Scruggs

More information

Constrained Optimization

Constrained Optimization 1 / 22 Constrained Optimization ME555 Lecture (Max) Yi Ren Department of Mechanical Engineering, University of Michigan March 23, 2014 2 / 22 1. Equality constraints only 1.1 Reduced gradient 1.2 Lagrange

More information

max cx s.t. Ax c where the matrix A, cost vector c and right hand side b are given and x is a vector of variables. For this example we have x

max cx s.t. Ax c where the matrix A, cost vector c and right hand side b are given and x is a vector of variables. For this example we have x Linear Programming Linear programming refers to problems stated as maximization or minimization of a linear function subject to constraints that are linear equalities and inequalities. Although the study

More information

The simplex method and the diameter of a 0-1 polytope

The simplex method and the diameter of a 0-1 polytope Department of Industrial Engineering and Management Techniicall Report No. 2012-3 The simplex method and the diameter of a 0-1 polytope Tomonari Kitahara and Shinji Mizuno May, 2012 Tokyo Institute of

More information

Primal-dual interior-point methods part I. Javier Peña (guest lecturer) Convex Optimization /36-725

Primal-dual interior-point methods part I. Javier Peña (guest lecturer) Convex Optimization /36-725 Primal-dual interior-point methods part I Javier Peña (guest lecturer) Convex Optimization 10-725/36-725 Last time: barrier method Approximate min subject to f(x) h i (x) 0, i = 1,... m Ax = b with min

More information

Lecture 3: Linear Programming Relaxations and Rounding

Lecture 3: Linear Programming Relaxations and Rounding Lecture 3: Linear Programming Relaxations and Rounding 1 Approximation Algorithms and Linear Relaxations For the time being, suppose we have a minimization problem. Many times, the problem at hand can

More information

Numerical Methods I Mathematical Programming (Optimization)

Numerical Methods I Mathematical Programming (Optimization) Numerical Methods I Mathematical Programming (Optimization) Aleksandar Donev Courant Institute, NYU 1 donev@courant.nyu.edu 1 MATH-GA 2011.003 / CSCI-GA 2945.003, Fall 2014 October 23rd, 2014 A. Donev

More information

Discrete Optimization 2010 Lecture 4 Minimum-Cost Flows

Discrete Optimization 2010 Lecture 4 Minimum-Cost Flows Discrete Optimization 2010 Lecture 4 Minimum-Cost Flows Marc Uetz University of Twente m.uetz@utwente.nl Lecture 4: sheet 1 / 31 Marc Uetz Discrete Optimization Outline 1 Remarks on Max-Flow and Min-Cut

More information

13.1 Semi-Definite Programming

13.1 Semi-Definite Programming CS787: Advanced Algorithms Lecture 13: Semi-definite programming In this lecture we will study an extension of linear programming, namely semi-definite programming. Semi-definite programs are much more

More information

Linear Programming and Network Optimization

Linear Programming and Network Optimization Linear Programming and Network Optimization Jonathan Turner March 1, 1 Many of the problem we have been studying can be viewed as special cases of the more general linear programming problem (LP). In the

More information

Advanced Topics in Machine Learning (Part II)

Advanced Topics in Machine Learning (Part II) Advanced Topics in Machine Learning (Part II) 3. Convexity and Optimisation February 6, 2009 Andreas Argyriou 1 Today s Plan Convex sets and functions Types of convex programs Algorithms Convex learning

More information

Karush-Kuhn-Tucker conditions. Geoff Gordon & Ryan Tibshirani Optimization /

Karush-Kuhn-Tucker conditions. Geoff Gordon & Ryan Tibshirani Optimization / Karush-Kuhn-Tucker conditions Geoff Gordon & Ryan Tibshirani Optimization 10-725 / 36-725 1 Given a minimization problem Remember duality min x R n we defined the Lagrangian: f(x) subject to h i (x) 0,

More information

1 Polyhedra and Linear Programming

1 Polyhedra and Linear Programming CS 598CSC: Combinatorial Optimization Lecture date: January 21, 2009 Instructor: Chandra Chekuri Scribe: Sungjin Im 1 Polyhedra and Linear Programming In this lecture, we will cover some basic material

More information

1 Solving LPs: The Simplex Algorithm of George Dantzig

1 Solving LPs: The Simplex Algorithm of George Dantzig Solving LPs: The Simplex Algorithm of George Dantzig. Simplex Pivoting: Dictionary Format We illustrate a general solution procedure, called the simplex algorithm, by implementing it on a very simple example.

More information

CSC Linear Programming and Combinatorial Optimization Lecture 9: Ellipsoid Algorithm (Contd.) and Interior Point Methods

CSC Linear Programming and Combinatorial Optimization Lecture 9: Ellipsoid Algorithm (Contd.) and Interior Point Methods CSC2411 - Linear Programming and Combinatorial Optimization Lecture 9: Ellipsoid Algorithm (Contd.) and Interior Point Methods Notes taken by Vincent Cheung March 16, 2005 Summary: The ellipsoid algorithm

More information

Quiz 1 Sample Questions IE406 Introduction to Mathematical Programming Dr. Ralphs

Quiz 1 Sample Questions IE406 Introduction to Mathematical Programming Dr. Ralphs Quiz 1 Sample Questions IE406 Introduction to Mathematical Programming Dr. Ralphs These questions are from previous years and should you give you some idea of what to expect on Quiz 1. 1. Consider the

More information

17. Path-following methods

17. Path-following methods L. Vandenberghe EE236C (Spring 2016) 17. Path-following methods central path short-step barrier method predictor-corrector method 17-1 Introduction Primal-dual pair of conic LPs minimize subject to c T

More information

Lecture 3: The Simplex Method. Reading: Chapter 2

Lecture 3: The Simplex Method. Reading: Chapter 2 Lecture 3: The Simplex Method Reading: Chapter 2 1 The Simplex Tableau We start with the standard equality form LP max z = c 1 x 1 + c 2 x 2 + + c n x n a 11 x 1 + a 12 x 2 + + a 1n x n = b 1 a 21 x 1

More information

Polynomiality of Linear Programming

Polynomiality of Linear Programming Chapter 10 Polynomiality of Linear Programming In the previous section, we presented the Simplex Method. This method turns out to be very efficient for solving linear programmes in practice. While it is

More information

LECTURE 3: GEOMETRY OF LP. 1. Terminologies 2. Background knowledge 3. Graphic method 4. Fundamental theorem of LP

LECTURE 3: GEOMETRY OF LP. 1. Terminologies 2. Background knowledge 3. Graphic method 4. Fundamental theorem of LP LECTURE 3: GEOMETRY OF LP 1. Terminologies 2. Background knowledge 3. Graphic method 4. Fundamental theorem of LP Terminologies Baseline model: Feasible domain Feasible solution Consistency Terminologies

More information

Computational aspects of two-player zero-sum games Course notes for Computational Game Theory Sections 1 and 2 Fall 2010

Computational aspects of two-player zero-sum games Course notes for Computational Game Theory Sections 1 and 2 Fall 2010 Computational aspects of two-player zero-sum games Course notes for Computational Game Theory Sections 1 and 2 Fall 2010 Peter Bro Miltersen November 1, 2010 Version 1.2 1 Introduction The topic of this

More information

6.854 Advanced Algorithms. Scribes: Jay Kumar Sundararajan. Duality

6.854 Advanced Algorithms. Scribes: Jay Kumar Sundararajan. Duality Lecture 1: 10/13/2004 6.854 Advanced Algorithms Scribes: Jay Kumar Sundararajan Lecturer: David Karger Duality This lecture covers weak and strong duality, and also explains the rules for finding the dual

More information

18.310A lecture notes March 17, Linear programming

18.310A lecture notes March 17, Linear programming 18.310A lecture notes March 17, 2015 Linear programming Lecturer: Michel Goemans 1 Basics Linear Programming deals with the problem of optimizing a linear objective function subject to linear equality

More information

Interior-Point Methods for Inequality Constrained Optimisation

Interior-Point Methods for Inequality Constrained Optimisation Interior-Point Methods for Inequality Constrained Optimisation Lecture 10, Numerical Linear Algebra and Optimisation Oxford University Computing Laboratory, MT 2007 Dr Raphael Hauser (hauser@comlab.ox.ac.uk)

More information

E5295/5B5749 Convex optimization with engineering applications. Lecture 9. Interior methods

E5295/5B5749 Convex optimization with engineering applications. Lecture 9. Interior methods E5295/5B5749 Convex optimization with engineering applications Lecture 9 Interior methods A. Forsgren, KTH 1 Lecture 9 Convex optimization 2006/2007 Nonlinearly constrained convex program Consider a convex

More information

III. Solving Linear Programs by Interior-Point Methods

III. Solving Linear Programs by Interior-Point Methods Optimization Methods Draft of August 26, 25 III Solving Linear Programs by Interior-Point Methods Robert Fourer Department of Industrial Engineering and Management Sciences Northwestern University Evanston,

More information

Support vector machines (SVMs) Lecture 2

Support vector machines (SVMs) Lecture 2 Support vector machines (SVMs) Lecture 2 David Sontag New York University Slides adapted from Luke Zettlemoyer, Vibhav Gogate, and Carlos Guestrin Geometry of linear separators (see blackboard) A plane

More information

MATHEMATICS 694 Midterm Problems

MATHEMATICS 694 Midterm Problems MATHEMATICS 694 Midterm Problems. Let A be and m n matrix, b R m, and K R n a convex set. Prove or disprove that the set {x R n Ax b} K is a convex subset in R n. 2. Let f : R n R satisfies the inequality

More information

Good luck, veel succes!

Good luck, veel succes! Final exam Advanced Linear Programming, May 7, 13.00-16.00 Switch off your mobile phone, PDA and any other mobile device and put it far away. No books or other reading materials are allowed. This exam

More information

Simplex-Based Sensitivity Analysis and Dual C H A P T E R 6

Simplex-Based Sensitivity Analysis and Dual C H A P T E R 6 Simplex-Based Sensitivity Analysis and Dual 1 C H A P T E R 6 Range of Optimality of objective function 2 Say O.f: c 1 x 1 + c 2 x 2 The range of optimality for an objective function coefficient is the

More information

Linear Programming is the branch of applied mathematics that deals with solving

Linear Programming is the branch of applied mathematics that deals with solving Chapter 2 LINEAR PROGRAMMING PROBLEMS 2.1 Introduction Linear Programming is the branch of applied mathematics that deals with solving optimization problems of a particular functional form. A linear programming

More information

Nonlinear Programming Methods.S2 Quadratic Programming

Nonlinear Programming Methods.S2 Quadratic Programming Nonlinear Programming Methods.S2 Quadratic Programming Operations Research Models and Methods Paul A. Jensen and Jonathan F. Bard A linearly constrained optimization problem with a quadratic objective

More information

Chapter 2 Theory of Constrained Optimization

Chapter 2 Theory of Constrained Optimization Optimization I; Chapter 2 36 Chapter 2 Theory of Constrained Optimization 2.1 Basic notations and examples We consider nonlinear optimization problems (NLP) of the form minimize f(x) (2.1a) over x lr n

More information

Computing Equilibria for Two-Person Games

Computing Equilibria for Two-Person Games Computing Equilibria for Two-Person Games Bernhard von Stengel ETH Zürich November 25, 1996 (minor corrections added November 11, 1997) Abstract. This paper is a self-contained survey of algorithms for

More information