DANTZIG-WOLFE DECOMPOSITION WITH GAMS
ERWIN KALVELAGEN

Abstract. This document illustrates the Dantzig-Wolfe decomposition algorithm using GAMS.

1. Introduction

Dantzig-Wolfe decomposition [2] is a classic solution approach for structured linear programming problems. In this document we illustrate how Dantzig-Wolfe decomposition can be implemented in a GAMS environment. The GAMS language is rich enough to implement fairly complex algorithms, as is illustrated by GAMS implementations of Benders decomposition [10], cutting-stock column generation [11] and branch-and-bound algorithms [12].

Dantzig-Wolfe decomposition has been an important tool for solving large structured models that could not be solved using a standard simplex algorithm because they exceeded the capacity of those solvers. With the current generation of simplex and interior-point LP solvers, and the enormous progress in standard hardware (both in raw CPU speed and in the availability of large amounts of memory), the Dantzig-Wolfe algorithm has become less popular. Implementations of the Dantzig-Wolfe algorithm are described in [5, 6, 7]. Some renewed interest in decomposition algorithms was inspired by the availability of parallel computer architectures [8, 13]. A recent computational study is [16]. [9] discusses formulation issues when applying decomposition to multi-commodity network problems. Many textbooks on linear programming discuss the principles of Dantzig-Wolfe decomposition [1, 14].

2. Block-angular models

Consider the LP:

(1)    \min\ c^T x,\qquad Ax = b,\qquad x \ge 0

where A has a special structure:

(2)    Ax =
       \begin{pmatrix}
         B_0 & B_1 & B_2 & \cdots & B_K \\
             & A_1 &     &        &     \\
             &     & A_2 &        &     \\
             &     &     & \ddots &     \\
             &     &     &        & A_K
       \end{pmatrix}
       \begin{pmatrix} x_0 \\ x_1 \\ x_2 \\ \vdots \\ x_K \end{pmatrix}
       =
       \begin{pmatrix} b_0 \\ b_1 \\ b_2 \\ \vdots \\ b_K \end{pmatrix}
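As a small hypothetical illustration (not part of the original text), a block-angular system with K = 2 single-constraint blocks, one coupling row and no master-only variables x_0 could look like

\[
\begin{pmatrix}
1 & 1 & 1 & 1 \\
1 & 2 &   &   \\
  &   & 3 & 1
\end{pmatrix}
\begin{pmatrix} x_{1,1} \\ x_{1,2} \\ x_{2,1} \\ x_{2,2} \end{pmatrix}
=
\begin{pmatrix} 10 \\ 4 \\ 6 \end{pmatrix}
\]

so that B_1 = (1\;\;1), B_2 = (1\;\;1), A_1 = (1\;\;2), A_2 = (3\;\;1), b_0 = 10, b_1 = 4 and b_2 = 6. The first row links the two blocks; each remaining row involves the variables of one block only.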
The constraints

(3)    \sum_{k=0}^{K} B_k x_k = b_0

corresponding to the top row of sub-matrices are called the coupling constraints. The idea of the Dantzig-Wolfe approach is to decompose this problem such that we never have to solve a problem containing all the sub-problem constraints A_k x_k = b_k at once. Instead, a master problem is devised which concentrates only on the coupling constraints, and the sub-problems are solved individually. As a result, only a series of smaller problems needs to be solved.

3. Minkowski's Representation Theorem

Consider the feasible region of an LP problem:

(4)    P = \{x \mid Ax = b,\ x \ge 0\}

If P is bounded then we can characterize any point x \in P as a convex combination of its extreme points x^{(k)}:

(5)    x = \sum_k \lambda_k x^{(k)}, \qquad \sum_k \lambda_k = 1, \qquad \lambda_k \ge 0

If the feasible region cannot be assumed to be bounded, we need to introduce the following:

(6)    x = \sum_k \lambda_k x^{(k)} + \sum_i \mu_i r^{(i)}, \qquad \sum_k \lambda_k = 1, \qquad \lambda_k \ge 0, \qquad \mu_i \ge 0

where the r^{(i)} are the extreme rays of P. The above expression for x is sometimes called Minkowski's Representation Theorem [15]. The constraint \sum_k \lambda_k = 1 is also known as the convexity constraint. A more compact formulation is sometimes used:

(7)    x = \sum_k \lambda_k x^{(k)}, \qquad \sum_k \delta_k \lambda_k = 1, \qquad \lambda_k \ge 0

where

(8)    \delta_k = \begin{cases} 1 & \text{if } x^{(k)} \text{ is an extreme point} \\ 0 & \text{if } x^{(k)} \text{ is an extreme ray} \end{cases}
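As a hypothetical worked example (not from the original text), take the bounded region P = \{x \in \mathbb{R}^2 \mid x_1 + x_2 \le 1,\ x \ge 0\}, written in inequality form for brevity. Its extreme points are x^{(1)} = (0,0), x^{(2)} = (1,0) and x^{(3)} = (0,1), and the point x = (0.25, 0.5) \in P is represented in the form of equation (5) by

\[
x = 0.25\,x^{(1)} + 0.25\,x^{(2)} + 0.5\,x^{(3)}, \qquad 0.25 + 0.25 + 0.5 = 1, \qquad \lambda \ge 0.
\]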
That is, we can describe the problem in terms of the variables \lambda_k instead of the original variables x. In practice this reformulation cannot be applied directly, as the number of variables \lambda_k becomes very large.

4. The decomposition

The K sub-problems deal with the constraints

(9)    A_k x_k = b_k, \qquad x_k \ge 0

while the master problem is characterized by the equations:

(10)   \min\ c^T x = \sum_{k=0}^{K} c_k^T x_k
       \sum_{k=0}^{K} B_k x_k = b_0
       x_0 \ge 0

We can substitute equation (7) into (10), resulting in:

(11)   \min\ c_0^T x_0 + \sum_{k=1}^{K} \sum_{j=1}^{p_k} (c_k^T x_k^{(j)})\,\lambda_{k,j}
       B_0 x_0 + \sum_{k=1}^{K} \sum_{j=1}^{p_k} (B_k x_k^{(j)})\,\lambda_{k,j} = b_0
       \sum_{j=1}^{p_k} \delta_{k,j}\,\lambda_{k,j} = 1 \qquad \text{for } k = 1,\dots,K
       x_0 \ge 0, \qquad \lambda_{k,j} \ge 0

This is a huge LP. Although the number of rows is reduced, the number of extreme points and rays x_k^{(j)} of each sub-problem is very large, resulting in an enormous number of variables \lambda_{k,j}. However, many of these variables will be non-basic at zero and need not be part of the problem. The idea is to consider only variables with a promising reduced cost, in what is also known as a delayed column generation algorithm. The model with only a small number of the \lambda variables, compactly written as

(12)   \min\ c_0^T x_0 + \bar c^T \lambda
       B_0 x_0 + \bar B \lambda = b_0
       \Delta \lambda = \mathbf{1}
       x_0 \ge 0, \qquad \lambda \ge 0

where the columns of \bar B are the proposals B_k x_k^{(j)}, the entries of \bar c are the proposal costs c_k^T x_k^{(j)}, and \Delta\lambda = \mathbf{1} collects the K convexity constraints, is called the restricted master problem. The missing variables are fixed at zero. The restricted master problem is not fixed in size: variables will be added to it during execution of the algorithm.
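To make the column generation mechanics concrete (this illustration is not part of the original text), observe from (11) that each proposal x_k^{(j)} of sub-problem k enters the restricted master problem (12) as a single column

\[
\begin{pmatrix} c_k^T x_k^{(j)} \\ B_k x_k^{(j)} \\ \delta_{k,j}\, e_k \end{pmatrix}
\]

consisting of a scalar objective coefficient, the entries for the coupling rows, and a coefficient \delta_{k,j} (1 for an extreme point, 0 for an extreme ray) in the k-th convexity row, where e_k is the k-th unit vector over the convexity rows. Generating a new column therefore requires only the sub-problem solution x_k^{(j)} and a few inner products.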
Figure 1. Communication between the restricted master and the sub-problems: the master sends duals to the sub-problems Sub 1, Sub 2, ..., Sub K, which return new columns (proposals).

The attractiveness of a variable \lambda_{k,j} can be measured by its reduced cost. In general, the reduced cost of a variable x_j is

(13)   \sigma_j = c_j - \pi^T A_j

where A_j is the column of A corresponding to variable x_j, and \pi are the duals. If we denote the dual variables of the coupling constraints B_0 x_0 + \bar B \lambda = b_0 by \pi_1 and those of the convexity constraints \sum_j \delta_{k,j}\lambda_{k,j} = 1 by \pi_2^{(k)}, then the reduced costs for the master problem look like:

(14)   \sigma_{k,j} = (c_k^T x_k^{(j)}) - \pi^T \begin{pmatrix} B_k x_k^{(j)} \\ \delta_{k,j} \end{pmatrix}
              = (c_k^T - \pi_1^T B_k)\, x_k^{(j)} - \pi_2^{(k)} \delta_{k,j}

Assuming the sub-problem to be bounded, the most attractive bfs (basic feasible solution) x_k to enter the master problem is found by minimizing the reduced cost, which gives the following LP:

(15)   \min_{x_k}\ \sigma_k = (c_k^T - \pi_1^T B_k)\, x_k - \pi_2^{(k)}
       A_k x_k = b_k
       x_k \ge 0

The operation of computing these reduced costs is often called pricing. If \sigma_k < 0 we can introduce a new column \lambda_{k,j} into the master, with a cost coefficient of c_k^T x_k.

A basic Dantzig-Wolfe decomposition algorithm can now be formulated:

Dantzig-Wolfe decomposition algorithm.

{initialization}
Choose initial subsets of variables.
while true do
   {Master problem}
   Solve the restricted master problem.
   \pi_1 := duals of the coupling constraints
   \pi_2^{(k)} := duals of the k-th convexity constraint
   {Sub-problems}
   for k = 1,...,K do
      Plug \pi_1 and \pi_2^{(k)} into sub-problem k
      Solve sub-problem k
      if \sigma_k < 0 then
         Add proposal x_k to the restricted master
      end if
   end for
   if no proposals were generated then
      Stop: optimal
   end if
end while

5. Initialization

We have not yet paid attention to the initialization of the decomposition. The first thing we can do is solve each sub-problem:

(16)   \min\ c_k^T x_k
       A_k x_k = b_k
       x_k \ge 0

If any of the sub-problems is infeasible, the original problem is infeasible. Otherwise, we can use the optimal values x_k (or the unbounded rays) to generate an initial set of proposals.

6. Phase I/II algorithm

The initial proposals may violate the coupling constraints. We can formulate a Phase I problem by introducing artificial variables and minimizing those. The use of artificial variables is explained in any textbook on linear programming (e.g. [1, 14]). Note that the reduced costs for the Phase I problem are slightly different from those for the Phase II problem. As an example, consider coupling constraints of the form

(17)   x \le b

We can add an artificial variable x_a \ge 0 as follows:

(18)   x - x_a \le b

The Phase I objective is then:

(19)   \min\ x_a

The reduced cost of a variable is now as in equation (14) but with c_k^T = 0. Note that it is important to remove the artificials once Phase II starts. We do this in the example code by fixing the artificial variables to zero.
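For concreteness, the Phase I restricted master problem used by the implementation below can be sketched as follows (a sketch assembled from equations (12) and (17)-(19); in the GAMS code the single artificial variable x_a is called excess and is shared by all coupling rows):

\[
\begin{aligned}
\min\;\; & x_a \\
\text{s.t.}\;\; & \sum_{k}\sum_{j} (B_k x_k^{(j)})\,\lambda_{k,j} \le b_0 + x_a \mathbf{1} \\
& \sum_{j} \delta_{k,j}\,\lambda_{k,j} = 1 \qquad k = 1,\dots,K \\
& \lambda \ge 0, \quad x_a \ge 0
\end{aligned}
\]

Once the Phase I objective is (numerically) zero, the proposals at hand satisfy the coupling constraints, x_a is fixed to zero, and the algorithm continues with the Phase II objective \sum_k \sum_j (c_k^T x_k^{(j)})\,\lambda_{k,j}.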
7. Example: multi-commodity network flow

The multi-commodity network flow (MCNF) problem can be stated as:

(20)   \min\ \sum_{k \in K} \sum_{(i,j) \in A} c_{i,j}^k x_{i,j}^k
       \sum_{(i,j) \in A} x_{i,j}^k - \sum_{(j,i) \in A} x_{j,i}^k = b_i^k
       \sum_{k \in K} x_{i,j}^k \le u_{i,j}
       x_{i,j}^k \ge 0

This is sometimes called the node-arc formulation. Dantzig-Wolfe decomposition is a well-known solution strategy for this type of problem. For each commodity a sub-problem is created. We consider here a multi-commodity transportation problem:

(21)   \min\ \sum_{k \in K} \sum_{(i,j)} c_{i,j}^k x_{i,j}^k
       \sum_j x_{i,j}^k = supply_i^k
       \sum_i x_{i,j}^k = demand_j^k
       \sum_{k \in K} x_{i,j}^k \le u_{i,j}
       x_{i,j}^k \ge 0

with data from [4]. A similar Dantzig-Wolfe decomposition algorithm written in AMPL can be found in [3].

Model dw.gms.

$ontext

   Dantzig-Wolfe Decomposition with GAMS

   Reference:

   Erwin Kalvelagen, April 2003

$offtext

sets
   i 'origins'      /GARY, CLEV, PITT/
   j 'destinations' /FRA, DET, LAN, WIN, STL, FRE, LAF/
   p 'products'     /bands, coils, plate/
;

table supply(p,i)
           GARY   CLEV   PITT
   bands
   coils
   plate
table demand(p,j)
           FRA    DET    LAN    WIN    STL    FRE    LAF
   bands
   coils
   plate

parameter limit(i,j);
limit(i,j) = 625;

table cost(p,i,j) 'unit cost'
               FRA    DET    LAN    WIN    STL    FRE    LAF
   bands.GARY
   bands.CLEV
   bands.PITT
   coils.GARY
   coils.CLEV
   coils.PITT
   plate.GARY
   plate.CLEV
   plate.PITT

*
* direct LP formulation
*

positive variable x(i,j,p) 'shipments';
variable z 'objective variable';

equations
   obj
   supplyc(i,p)
   demandc(j,p)
   limitc(i,j)
;

obj..           z =e= sum((i,j,p), cost(p,i,j)*x(i,j,p));
supplyc(i,p)..  sum(j, x(i,j,p)) =e= supply(p,i);
demandc(j,p)..  sum(i, x(i,j,p)) =e= demand(p,j);
limitc(i,j)..   sum(p, x(i,j,p)) =l= limit(i,j);

model m /all/;
solve m minimizing z using lp;

*
* subproblems
*

positive variables xsub(i,j);
variables zsub;

parameters
   s(i)     'supply'
   d(j)     'demand'
   c(i,j)   'cost coefficients'
   pi1(i,j) 'dual of limit'
   pi2(p)   'dual of convexity constraint'
   pi2p     'dual of convexity constraint for current product'
;

equations
   supply_sub(i) 'supply equation for single product'
   demand_sub(j) 'demand equation for single product'
   rc1_sub       'phase 1 objective'
   rc2_sub       'phase 2 objective'
;

supply_sub(i)..  sum(j, xsub(i,j)) =e= s(i);
demand_sub(j)..  sum(i, xsub(i,j)) =e= d(j);
rc1_sub..        zsub =e= sum((i,j), -pi1(i,j)*xsub(i,j)) - pi2p;
rc2_sub..        zsub =e= sum((i,j), (c(i,j)-pi1(i,j))*xsub(i,j)) - pi2p;

model sub1 'phase 1 subproblem' /supply_sub, demand_sub, rc1_sub/;
model sub2 'phase 2 subproblem' /supply_sub, demand_sub, rc2_sub/;

*
* master problem
*

set k 'proposal count' /proposal1*proposal1000/;
set pk(p,k);
pk(p,k) = no;

parameter proposal(i,j,p,k);
parameter proposalcost(p,k);
proposal(i,j,p,k) = 0;
proposalcost(p,k) = 0;

positive variables
   lambda(p,k)
   excess 'artificial variable'
;
variable zmaster;

equations
   obj1_master       'phase 1 objective'
   obj2_master       'phase 2 objective'
   limit_master(i,j)
   convex_master(p)
;

obj1_master..        zmaster =e= excess;
obj2_master..        zmaster =e= sum(pk(p,k), proposalcost(p,k)*lambda(p,k));
limit_master(i,j)..  sum(pk(p,k), proposal(i,j,p,k)*lambda(p,k)) =l= limit(i,j) + excess;
convex_master(p)..   sum(pk(p,k), lambda(p,k)) =e= 1;

model master1 'phase 1 master' /obj1_master, limit_master, convex_master/;
model master2 'phase 2 master' /obj2_master, limit_master, convex_master/;

*
* options to reduce solver output
*
option limrow = 0;
option limcol = 0;
master1.solprint = 2;
master2.solprint = 2;
sub1.solprint = 2;
sub2.solprint = 2;

*
* options to speed up solver execution
*
master1.solvelink = 2;
master2.solvelink = 2;
sub1.solvelink = 2;
sub2.solvelink = 2;

*----------------------------------------------------------------------
* DANTZIG-WOLFE INITIALIZATION PHASE
*   test subproblems for feasibility
*   create initial set of proposals
*----------------------------------------------------------------------

display "--------------------------------------------------",
        "INITIALIZATION PHASE",
        "--------------------------------------------------";

set kk(k) 'current proposal';
kk('proposal1') = yes;

loop(p,

*  solve subproblem, check feasibility
   c(i,j) = cost(p,i,j);
   s(i) = supply(p,i);
   d(j) = demand(p,j);
   pi1(i,j) = 0;
   pi2p = 0;
   solve sub2 using lp minimizing zsub;
   abort$(sub2.modelstat = 4) "SUBPROBLEM IS INFEASIBLE: ORIGINAL MODEL IS INFEASIBLE";
   abort$(sub2.modelstat <> 1) "SUBPROBLEM NOT SOLVED TO OPTIMALITY";

*  proposal generation
   proposal(i,j,p,kk) = xsub.l(i,j);
   proposalcost(p,kk) = sum((i,j), c(i,j)*xsub.l(i,j));
   pk(p,kk) = yes;
   kk(k) = kk(k-1);
);

option proposal:2:2:2;
display proposal;

*----------------------------------------------------------------------
* DANTZIG-WOLFE ALGORITHM
*
*    while (true) do
*       solve restricted master
*       solve subproblems
*    until no more proposals
*----------------------------------------------------------------------

set iter 'maximum iterations' /iter1*iter15/;
scalar done /0/;
scalar count /0/;
scalar phase /1/;
scalar iteration;

loop(iter$(not done),

   iteration = ord(iter);
   display "--------------------------------------------------",
           iteration,
           "--------------------------------------------------";
*
*  solve master problem to get duals
*
   if (phase = 1,
      solve master1 minimizing zmaster using lp;
      abort$(master1.modelstat <> 1) "MASTERPROBLEM NOT SOLVED TO OPTIMALITY";
*     switch to phase 2 when the artificial is (nearly) zero;
*     0.0001 is an assumed tolerance
      if (excess.l < 0.0001,
         display "Switching to phase 2";
         phase = 2;
         excess.fx = 0;
      );
   );
   if (phase = 2,
      solve master2 minimizing zmaster using lp;
      abort$(master2.modelstat <> 1) "MASTERPROBLEM NOT SOLVED TO OPTIMALITY";
   );

   pi1(i,j) = limit_master.m(i,j);
   pi2(p) = convex_master.m(p);

   count = 0;
   loop(p$(not done),

*
*     solve each subproblem
*
      c(i,j) = cost(p,i,j);
      s(i) = supply(p,i);
      d(j) = demand(p,j);
      pi2p = pi2(p);

      if (phase = 1,
         solve sub1 using lp minimizing zsub;
         abort$(sub1.modelstat = 4) "SUBPROBLEM IS INFEASIBLE: ORIGINAL MODEL IS INFEASIBLE";
         abort$(sub1.modelstat <> 1) "SUBPROBLEM NOT SOLVED TO OPTIMALITY";
      else
         solve sub2 using lp minimizing zsub;
         abort$(sub2.modelstat = 4) "SUBPROBLEM IS INFEASIBLE: ORIGINAL MODEL IS INFEASIBLE";
         abort$(sub2.modelstat <> 1) "SUBPROBLEM NOT SOLVED TO OPTIMALITY";
      );

*
*     proposal generation when the reduced cost is negative;
*     -0.0001 is an assumed tolerance
*
      if (zsub.l < -0.0001,
         count = count + 1;
         display "new proposal", count, xsub.l;
         proposal(i,j,p,kk) = xsub.l(i,j);
         proposalcost(p,kk) = sum((i,j), c(i,j)*xsub.l(i,j));
         pk(p,kk) = yes;
         kk(k) = kk(k-1);
      );
   );

*
*  no new proposals?
*
   abort$(count = 0 and phase = 1) "PROBLEM IS INFEASIBLE";
   done$(count = 0 and phase = 2) = 1;

);

abort$(not done) "Out of iterations";

*
* recover solution
*
parameter xsol(i,j,p);
xsol(i,j,p) = sum(pk(p,k), proposal(i,j,p,k)*lambda.l(p,k));
display xsol;

parameter totalcost;
totalcost = sum((i,j,p), cost(p,i,j)*xsol(i,j,p));
display totalcost;

The reported solution is:

----     PARAMETER xsol

               bands    coils    plate

GARY.STL
GARY.FRE
GARY.LAF
CLEV.FRA
CLEV.DET
CLEV.LAN
CLEV.WIN
CLEV.STL
CLEV.FRE
CLEV.LAF
PITT.FRA
PITT.DET
PITT.LAN
PITT.WIN
PITT.STL
PITT.FRE
PITT.LAF

----     PARAMETER totalcost =

References

1. V. Chvátal, Linear Programming, Freeman.
2. G. B. Dantzig and P. Wolfe, Decomposition principle for linear programs, Operations Research 8 (1960).
3. R. Fourer and D. Gay, Looping with AMPL.
4. R. Fourer, D. Gay, and B. Kernighan, AMPL: A Modeling Language for Mathematical Programming, Boyd & Fraser.
5. J. K. Ho and E. Loute, An advanced implementation of the Dantzig-Wolfe decomposition algorithm for linear programming, Mathematical Programming 20 (1981).
6. J. K. Ho and E. Loute, Computational experience with advanced implementation of decomposition algorithms for linear programming, Mathematical Programming 27 (1983).
7. James K. Ho and R. P. Sundarraj, DECOMP: An implementation of Dantzig-Wolfe decomposition for linear programming, Lecture Notes in Economics and Mathematical Systems, vol. 338, Springer-Verlag.
8. James K. Ho and R. P. Sundarraj, Distributed nested decomposition of staircase linear programs, ACM Transactions on Mathematical Software (TOMS) 23 (1997), no. 2.
9. K. L. Jones, I. J. Lustig, J. M. Farvolden, and W. B. Powell, Multicommodity network flows: The impact of formulation on decomposition, Mathematical Programming 62 (1993).
10. Erwin Kalvelagen, Benders decomposition with GAMS, com/pdf/benders.pdf, December.
11. Erwin Kalvelagen, Column generation with GAMS, colgen.pdf, December.
12. Erwin Kalvelagen, Branch-and-bound methods for an MINLP model with semi-continuous variables, April.
13. Deepankar Medhi, Decomposition of structured large-scale optimization problems and parallel optimization, Tech. Report 718, Computer Sciences Department, University of Wisconsin, September 1987.
14. Katta G. Murty, Linear Programming, Wiley.
15. George L. Nemhauser and Laurence A. Wolsey, Integer and Combinatorial Optimization, Interscience Series in Discrete Mathematics and Optimization, Wiley.
16. James Richard Tebboth, A computational study of Dantzig-Wolfe decomposition, Ph.D. thesis, University of Buckingham, December.

Amsterdam Optimization Modeling Group LLC, Washington D.C./The Hague

E-mail address: [email protected]