A Note on the Bertsimas & Sim Algorithm for Robust Combinatorial Optimization Problems



myjournal manuscript No. (will be inserted by the editor)

A Note on the Bertsimas & Sim Algorithm for Robust Combinatorial Optimization Problems

Eduardo Álvarez-Miranda · Ivana Ljubić · Paolo Toth

Received: date / Accepted: date

Abstract We improve the well-known result of Bertsimas and Sim presented in (D. Bertsimas and M. Sim, Robust discrete optimization and network flows, Mathematical Programming, B(98):49–71, 2003) regarding the computation of optimal solutions of Robust Combinatorial Optimization problems with interval uncertainty in the objective function coefficients. We also extend this improvement to a more general class of Combinatorial Optimization problems with interval uncertainty.

Keywords Robust Combinatorial Optimization Problems · Bertsimas & Sim Algorithm

E. Álvarez-Miranda, P. Toth
DEIS, Università di Bologna, Viale Risorgimento 2, 40136 Bologna, Italy
E-mail: {e.alvarez,paolo.toth}@unibo.it

I. Ljubić
Department of Statistics and Operations Research, Faculty of Business, Economics, and Statistics, University of Vienna, Brünnerstraße 72, A-1210 Vienna, Austria
Tel: +43-1-4277-38661, Fax: +43-1-4277-38699
E-mail: ivana.ljubic@univie.ac.at

1. Introduction and Motivation

We address a general class of Combinatorial Optimization problems in which both the objective function coefficients and the constraint coefficients are subject to interval uncertainty. When uncertainty has to be taken into account, Robust Optimization (RO) arises as a methodological alternative to deal with it. The Bertsimas & Sim (B&S) Robust Optimization approach, introduced in [Bertsimas and Sim(2003)], is one of the most important approaches devised to incorporate this type of uncertainty into the decision process. By means of protection functions, the obtained solutions are endowed with protection, i.e., they are robust in terms of feasibility and/or optimality, for a given level of conservatism $\Gamma_X$ defined by the decision maker. When the coefficients associated with a set of $n$ variables are subject to uncertainty, the level of conservatism is interpreted as the number of coefficients that are expected to present uncertainty, i.e., $0 < \Gamma_X \leq n$. For the case in which the uncertain coefficients appear only in the objective function, a well-known result of [Bertsimas and Sim(2003)] states that the robust counterpart of the problem can be computed by solving at most $n+1$ instances of the original deterministic problem. Thus, the robust counterpart of a polynomially solvable binary optimization problem remains polynomially solvable.

Our Contribution In this paper we propose some improvements and extensions to the algorithmic result presented in [Bertsimas and Sim(2003)]. For the case studied in their paper, we show that instead of solving $n+1$ deterministic problems, the robust counterpart can be computed by solving $n - \Gamma_X + 2$ deterministic problems (Lemma 1); this improvement is particularly interesting for cases in which a high level of conservatism, i.e., a large value of $\Gamma_X$, is suitable. Additionally, we show that if a knapsack-type constraint is part of the problem and $m$ of its coefficients are affected by uncertainty, an analogous algorithmic approach can be applied, and the robust counterpart can be computed by solving $m - \Gamma_Y + 2$ deterministic problems (Lemma 2), for $0 < \Gamma_Y \leq m$.

Likewise, we show that if the uncertain coefficients in the objective function are associated with two disjoint sets of variables, of size $n$ and $m$ respectively, the robust problem can be computed by solving $(n - \Gamma_X + 2)(m - \Gamma_Y + 2)$ deterministic problems (Lemma 3), giving the decision maker the flexibility to assign different levels of conservatism to different sets of uncertain parameters. A similar result is shown for the case in which uncertainty is present in a set of $n$ objective function coefficients and in a set of $m$ coefficients of a knapsack-type constraint (Lemma 4). Combining the previous results, we provide a more general result covering the case in which the uncertain objective function coefficients are associated with $K$ disjoint sets of variables and there are $L$ knapsack-type constraints (each involving a different set of variables) with uncertain coefficients. For this class of problems, we show that the robust counterpart can be computed by solving a strongly polynomial number of deterministic problems (Theorem 1), assuming that $K$ and $L$ are constant.

The presented results are important when solving robust counterparts of well-known combinatorial optimization problems in which different levels of conservatism are associated with disjoint subsets of binary variables. For example, in Prize-Collecting Network Design Problems (PCNDPs) (e.g., TSP, Steiner Trees), binary variables are associated with edges and nodes of a graph, and we might associate different levels of conservatism with their corresponding coefficients (costs and prizes, respectively). Besides defining the objective function as the sum of edge costs and node prizes, PCNDPs are frequently modeled using knapsack-type Budget or Quota constraints, and our results can be used in these cases as well, when the coefficients of these constraints are subject to interval uncertainty.

Similarly, in facility location problems, location and allocation decisions need to be taken. Each of these decisions involves disjoint sets of variables with possibly uncertain coefficients. Under these conditions, different levels of conservatism might be suitable for different sets of coefficients. Other prominent examples of problems that fall into this framework are generalizations of the vehicle routing problem, involving routing, assignment, location and inventory decision variables; the presented results can be used for solving the robust counterparts of these problems as well. The viability of the proposed methods strongly relies on the ability to solve the deterministic counterparts efficiently.

2. Main Results

Let us consider the following generic Combinatorial Optimization problem with linear objective function and binary variables $x \in \{0,1\}^n$:

$$OPT_{P1} = \min \Big\{ \sum_{i \in I} c_i x_i \;:\; x \in \Pi \Big\}, \tag{P1}$$

where $c \geq 0$, $I = \{1, 2, \ldots, n\}$ and $\Pi$ is a generic polyhedral region. Let us assume now that, instead of known and deterministic parameters $c_i$, $i \in I$, we are given uncertainty intervals $[c_i, c_i + d_i]$, $i \in I$. Assume that the variables are ordered so that $d_i \geq d_{i+1}$, $i \in I$, and set $d_{n+1} = 0$. For a given level of conservatism $\Gamma_X \in \{1, \ldots, n\}$, the robust formulation of (P1) is defined in [Bertsimas and Sim(2003)] as

$$ROPT_{P1}(\Gamma_X) = \min \Big\{ \sum_{i \in I} c_i x_i + \beta_X(\Gamma_X, x) \;:\; x \in \Pi \Big\}, \tag{RP1}$$

where $\beta_X(\Gamma_X, x)$ is the corresponding protection function, defined as

$$\beta_X(\Gamma_X, x) = \max \Big\{ \sum_{i \in I} d_i x_i u_i \;:\; \sum_{i \in I} u_i \leq \Gamma_X \text{ and } u_i \in [0,1],\ i \in I \Big\}. \tag{1}$$

This protection function endows the solutions with robustness, in terms of protection of optimality, in the presence of a given level of data uncertainty represented by $\Gamma_X$. In the context of RO, (P1) is referred to as the nominal problem and (RP1) as the corresponding robust counterpart. After applying strong duality to (1), problem (RP1) can be rewritten as

$$ROPT_{P1}(\Gamma_X) = \min \; \sum_{i \in I} c_i x_i + \Gamma_X \theta + \sum_{i \in I} h_i \tag{2}$$
$$\text{s.t. } h_i + \theta \geq d_i x_i, \quad i \in I \tag{3}$$
$$h_i \geq 0,\ i \in I, \text{ and } \theta \geq 0 \tag{4}$$
$$x \in \Pi. \tag{5}$$
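For a fixed $x$, the inner maximization (1) is a linear program whose optimum is attained at an integral vertex: set $u_i = 1$ for the $\Gamma_X$ largest deviations $d_i$ among the selected items. A minimal sketch of this evaluation (function name and toy data are illustrative, not from the paper):

```python
def protection(d, x, gamma_x):
    """Protection function beta_X(Gamma_X, x) of (1): an optimal u puts
    u_i = 1 on the Gamma_X largest deviations d_i among the items with
    x_i = 1 (and 0 elsewhere)."""
    chosen = sorted((di for di, xi in zip(d, x) if xi == 1), reverse=True)
    return sum(chosen[:gamma_x])

# Toy instance with d already sorted non-increasingly, as assumed above.
d = [5.0, 4.0, 2.0, 1.0]
x = [1, 0, 1, 1]
print(protection(d, x, 2))  # deviations of selected items: 5, 2, 1 -> 7.0
```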

The previous formulation of the robust counterpart of (P1) was presented in [Bertsimas and Sim(2003)], where the authors provide a combinatorial framework that computes $ROPT_{P1}(\Gamma_X)$ by solving $n+1$ nominal problems (Theorem 3, p. 56). The following lemma improves this result by reducing the number of iterations of the algorithmic procedure.

Lemma 1 Given $\Gamma_X \in \{1, \ldots, n\}$, problem (RP1), the robust counterpart of problem (P1), can be computed by solving $(n - \Gamma_X + 2)$ nominal problems according to the following scheme:

$$ROPT_{P1}(\Gamma_X) = \min_{r \in \{\Gamma_X, \ldots, n+1\}} G^r,$$

where for $r \in \{\Gamma_X, \ldots, n+1\}$:

$$G^r = \Gamma_X d_r + \min_{x \in \Pi} \Big( \sum_{i \in I} c_i x_i + \sum_{i=1}^{r} (d_i - d_r)\, x_i \Big).$$

Proof The first part of the proof consists of showing that any optimal solution $(x^*, h^*, \theta^*)$ of (RP1) satisfies $\theta^* \in [0, d_{\Gamma_X}]$. Given the structure of the constraints $h_i + \theta \geq d_i x_i$, $i \in I$, any optimal solution $(x^*, h^*, \theta^*)$ satisfies $h_i^* = \max(d_i x_i^* - \theta^*, 0)$, and since $x_i^* \in \{0,1\}$, it holds that $\max(d_i x_i^* - \theta^*, 0) = \max(d_i - \theta^*, 0)\, x_i^*$. Therefore, the objective function of (2)–(5) can be rewritten as

$$ROPT_{P1}(\Gamma_X) = \min_{x \in \Pi,\ \theta \geq 0} \; \sum_{i \in I} c_i x_i + \Gamma_X \theta + \sum_{i \in I} \max(d_i - \theta, 0)\, x_i.$$

Let $x$ be a feasible solution for a given $\Gamma_X$. Let $N_x$ be the set of indices $i \in I$ such that $x_i = 1$, and let $I(N_x, \Gamma_X)$ be a subset of $N_x$ associated with (at most) the $\Gamma_X$ largest values $d_i$. Assume first that $|N_x| \leq \Gamma_X$; then $I(N_x, \Gamma_X) = N_x$, which implies that the cost of each element corresponding to an index $i \in N_x$ will be set to its upper bound $c_i + d_i$. This means that if $x$ is optimal, the minimum value $ROPT_{P1}(\Gamma_X)$ can be calculated as $\sum_{i \in N_x} (c_i + d_i)$, which implies $\theta^* = d_{n+1} = 0$.

Let us now assume that $|N_x| \geq \Gamma_X + 1$. Then, by definition, $|I(N_x, \Gamma_X)| = \Gamma_X$. Let $r^*$ be the index of the $\Gamma_X$-th largest value $d_i$ taken into the solution, i.e., $r^* = \max\{i \mid i \in I(N_x, \Gamma_X)\}$. Then we have:

$$\sum_{i \in N_x} c_i + \sum_{i \in I(N_x, \Gamma_X)} d_i = \sum_{i \in N_x} c_i + \sum_{\{i \in N_x :\, i \leq r^*\}} (d_i - d_{r^*}) + \Gamma_X d_{r^*} = \sum_{i \in I} c_i x_i + \sum_{i=1}^{r^*} (d_i - d_{r^*})\, x_i + \Gamma_X d_{r^*}.$$

Note that $r^* \geq \Gamma_X$ since $|N_x| \geq \Gamma_X + 1$. Therefore, the minimum value $ROPT_{P1}(\Gamma_X)$ will be reached for $\theta^* = d_{r^*}$ with $r^* \geq \Gamma_X$, and hence $\theta^* \in [0, d_{\Gamma_X}]$, which completes the first part of the proof.

We now present the second part of the proof, in which the previous result is plugged into the procedure devised in [Bertsimas and Sim(2003)] and the optimal values of $\theta$ are found by an equivalent decomposition approach. We decompose the real interval $[0, d_{\Gamma_X}]$ into $[0, d_n], [d_n, d_{n-1}], \ldots, [d_{\Gamma_X+1}, d_{\Gamma_X}]$. Observe that for an arbitrary $\theta \in [d_r, d_{r-1}]$ we have

$$\sum_{i \in I} \max(d_i - \theta, 0)\, x_i = \sum_{i=1}^{r-1} (d_i - \theta)\, x_i.$$

Therefore, $ROPT_{P1}(\Gamma_X) = \min_{r \in \{\Gamma_X, \ldots, n+1\}} G^r$, where for $r \in \{\Gamma_X, \ldots, n+1\}$

$$G^r = \min \Big\{ \sum_{i \in I} c_i x_i + \Gamma_X \theta + \sum_{i=1}^{r-1} (d_i - \theta)\, x_i \;:\; \theta \in [d_r, d_{r-1}],\ x \in \Pi \Big\}.$$

Since we are optimizing a linear function of $\theta$ over the interval $[d_r, d_{r-1}]$, the optimal value of $G^r$ is attained either at $\theta = d_r$ or at $\theta = d_{r-1}$. So, for $r \in \{\Gamma_X, \ldots, n+1\}$:

$$G^r = \min \Big\{ \Gamma_X d_r + \min_{x \in \Pi} \Big( \sum_{i \in I} c_i x_i + \sum_{i=1}^{r-1} (d_i - d_r)\, x_i \Big),\; \Gamma_X d_{r-1} + \min_{x \in \Pi} \Big( \sum_{i \in I} c_i x_i + \sum_{i=1}^{r-1} (d_i - d_{r-1})\, x_i \Big) \Big\}$$
$$= \min \Big\{ \Gamma_X d_r + \min_{x \in \Pi} \Big( \sum_{i \in I} c_i x_i + \sum_{i=1}^{r} (d_i - d_r)\, x_i \Big),\; \Gamma_X d_{r-1} + \min_{x \in \Pi} \Big( \sum_{i \in I} c_i x_i + \sum_{i=1}^{r-1} (d_i - d_{r-1})\, x_i \Big) \Big\},$$

where the second equality holds because the added term with index $i = r$ vanishes. Therefore,

$$ROPT_{P1}(\Gamma_X) = \min \Big\{ \Gamma_X d_{\Gamma_X} + \min_{x \in \Pi} \Big( \sum_{i \in I} c_i x_i + \sum_{i=1}^{\Gamma_X} (d_i - d_{\Gamma_X})\, x_i \Big), \ \ldots, \ \Gamma_X d_r + \min_{x \in \Pi} \Big( \sum_{i \in I} c_i x_i + \sum_{i=1}^{r} (d_i - d_r)\, x_i \Big), \ \ldots, \ \min_{x \in \Pi} \sum_{i \in I} (c_i + d_i)\, x_i \Big\},$$

which completes the proof.

Consider now the following problem, which we will refer to as (P2):

$$OPT_{P2} = \min \Big\{ \sum_{i \in I} c_i x_i \;:\; \sum_{j \in J} b_j y_j \leq B \text{ and } (x, y) \in \Psi \Big\}, \tag{P2}$$

where $y \in \{0,1\}^m$ are decision variables, $B \in \mathbb{R}_{\geq 0}$ is a constant, $b \geq 0$, $J = \{1, 2, \ldots, m\}$, and $\Psi$ is a generic polyhedral region. Let us assume that $c$ is known with certainty but that the elements of $b$ are given as uncertainty intervals $[b_j, b_j + \delta_j]$, $j \in J$, and that the variables are ordered so that $\delta_j \geq \delta_{j+1}$, $j \in J$, with $\delta_{m+1} = 0$. Given $\Gamma_Y \in \{1, \ldots, m\}$, the robust counterpart of the nominal problem (P2), under interval uncertainty of the vector $b$, is:

$$ROPT_{P2}(\Gamma_Y) = \min \Big\{ \sum_{i \in I} c_i x_i \;:\; \sum_{j \in J} b_j y_j + \beta_Y(\Gamma_Y, y) \leq B \text{ and } (x, y) \in \Psi \Big\}. \tag{RP2}$$

In this case, $\beta_Y(\Gamma_Y, y)$ provides protection of feasibility in the presence of a level of conservatism given by $\Gamma_Y$. This problem can be rewritten as

$$ROPT_{P2}(\Gamma_Y) = \min \; \sum_{i \in I} c_i x_i \tag{6}$$
$$\text{s.t. } \sum_{j \in J} b_j y_j + \Gamma_Y \lambda + \sum_{j \in J} k_j \leq B \tag{7}$$
$$k_j + \lambda \geq \delta_j y_j, \quad j \in J \tag{8}$$
$$k_j \geq 0,\ j \in J, \text{ and } \lambda \geq 0 \tag{9}$$
$$(x, y) \in \Psi. \tag{10}$$

The following lemma extends to (RP2) the result of Theorem 3 in [Bertsimas and Sim(2003)] and adapts the result of Lemma 1.

Lemma 2 Given $\Gamma_Y \in \{1, \ldots, m\}$, problem (RP2), the robust counterpart of problem (P2), can be computed by solving $(m - \Gamma_Y + 2)$ nominal problems according to the following scheme:

$$ROPT_{P2}(\Gamma_Y) = \min_{s \in \{\Gamma_Y, \ldots, m+1\}} H^s,$$

where for $s \in \{\Gamma_Y, \ldots, m+1\}$:

$$H^s = \min \Big\{ \sum_{i \in I} c_i x_i \;:\; \sum_{j \in J} b_j y_j + \sum_{j=1}^{s} (\delta_j - \delta_s)\, y_j + \Gamma_Y \delta_s \leq B,\ (x, y) \in \Psi \Big\}.$$

Proof The core of the proof consists of showing that for any feasible solution of (6)–(10) we have $\lambda \in [0, \delta_{\Gamma_Y}]$. For any feasible solution of (6)–(10) it holds that $k_j = \max(\delta_j y_j - \lambda, 0)$; thus, constraint (7) can be written as

$$\sum_{j \in J} b_j y_j + \Gamma_Y \lambda + \sum_{j \in J} \max(\delta_j - \lambda, 0)\, y_j \leq B. \tag{11}$$

Let $(x, y)$ be a feasible solution for a given $\Gamma_Y$. Let $M_y$ be the set of indices $j \in J$ such that $y_j = 1$, and let $J(M_y, \Gamma_Y)$ be a subset of $M_y$ associated with (at most) the $\Gamma_Y$ largest values $\delta_j$. Since $(x, y)$ is a feasible solution, the following holds:

$$\sum_{j \in M_y} b_j + \sum_{j \in J(M_y, \Gamma_Y)} \delta_j \leq B.$$

Assume first that $|M_y| \leq \Gamma_Y$; then $J(M_y, \Gamma_Y) = M_y$, which implies that the coefficient of each element corresponding to an index $j \in M_y$ will be set to its upper bound $b_j + \delta_j$, and hence constraint (11) is satisfied for $\lambda = \delta_{m+1} = 0$.

Let us now assume that $|M_y| \geq \Gamma_Y + 1$. Then, by definition, $|J(M_y, \Gamma_Y)| = \Gamma_Y$. Let $s^* = \max\{j \mid j \in J(M_y, \Gamma_Y)\}$. So

$$\sum_{j \in M_y} b_j + \sum_{j \in J(M_y, \Gamma_Y)} \delta_j = \sum_{j \in M_y} b_j + \sum_{\{j \in M_y :\, j \leq s^*\}} (\delta_j - \delta_{s^*}) + \Gamma_Y \delta_{s^*} = \sum_{j \in J} b_j y_j + \sum_{j=1}^{s^*} (\delta_j - \delta_{s^*})\, y_j + \Gamma_Y \delta_{s^*} \leq B.$$

Note that $s^* \geq \Gamma_Y$ since $|M_y| \geq \Gamma_Y + 1$, and therefore constraint (7) will be satisfied for $\lambda = \delta_{s^*}$ with $s^* \geq \Gamma_Y$. Hence, for any feasible solution we have $\lambda \in [0, \delta_{\Gamma_Y}]$. Following arguments similar to those of the decomposition approach in the proof of Lemma 1, it holds that

$$ROPT_{P2}(\Gamma_Y) = \min \big\{ H^{\Gamma_Y}, \ldots, H^{s}, \ldots, H^{m+1} \big\},$$

and the proof is completed.

We now present a second extension of the algorithm proposed in [Bertsimas and Sim(2003)]. Let us consider the following nominal problem:

$$OPT_{P3} = \min \Big\{ \sum_{i \in I} c_i x_i + \sum_{j \in J} b_j y_j \;:\; (x, y) \in \Psi \Big\}. \tag{P3}$$

In case the elements of both vectors $c$ and $b$ are given in terms of closed intervals, the corresponding robust counterpart (for a pair $(\Gamma_X, \Gamma_Y)$) is given by

$$ROPT_{P3}(\Gamma_X, \Gamma_Y) = \min \; \sum_{i \in I} c_i x_i + \Gamma_X \theta + \sum_{i \in I} h_i + \sum_{j \in J} b_j y_j + \Gamma_Y \lambda + \sum_{j \in J} k_j \tag{12}$$
$$\text{s.t. (3), (4), (8), (9) and } (x, y) \in \Psi. \tag{13}$$

The following result extends Lemma 1 and provides an algorithmic procedure to solve (12)–(13).

Lemma 3 Given $\Gamma_X \in \{1, \ldots, n\}$ and $\Gamma_Y \in \{1, \ldots, m\}$, the robust counterpart of problem (P3) can be computed by solving $(n - \Gamma_X + 2)(m - \Gamma_Y + 2)$ nominal problems as follows:

$$ROPT_{P3}(\Gamma_X, \Gamma_Y) = \min_{\substack{r \in \{\Gamma_X, \ldots, n+1\} \\ s \in \{\Gamma_Y, \ldots, m+1\}}} G^{r,s},$$

where for $r \in \{\Gamma_X, \ldots, n+1\}$ and $s \in \{\Gamma_Y, \ldots, m+1\}$:

$$G^{r,s} = \Gamma_X d_r + \Gamma_Y \delta_s + \min_{(x,y) \in \Psi} \Big( \sum_{i \in I} c_i x_i + \sum_{i=1}^{r} (d_i - d_r)\, x_i + \sum_{j \in J} b_j y_j + \sum_{j=1}^{s} (\delta_j - \delta_s)\, y_j \Big).$$

Proof Using an analysis analogous to that in the proofs of Lemmas 1 and 2, any optimal solution $(x^*, y^*, \theta^*, \lambda^*)$ satisfies $\theta^* \in [0, d_{\Gamma_X}]$ and $\lambda^* \in [0, \delta_{\Gamma_Y}]$. Then, by decomposition, the optimum can be found as $ROPT_{P3}(\Gamma_X, \Gamma_Y) = \min_{r, s} G^{r,s}$, where for $r \in \{\Gamma_X, \ldots, n+1\}$ and $s \in \{\Gamma_Y, \ldots, m+1\}$

$$G^{r,s} = \min \; \sum_{i \in I} c_i x_i + \Gamma_X \theta + \sum_{i=1}^{r-1} (d_i - \theta)\, x_i + \sum_{j \in J} b_j y_j + \Gamma_Y \lambda + \sum_{j=1}^{s-1} (\delta_j - \lambda)\, y_j, \tag{14}$$

with $\theta \in [d_r, d_{r-1}]$, $\lambda \in [\delta_s, \delta_{s-1}]$ and $(x, y) \in \Psi$. Since we are optimizing a linear function of $\theta$ over the interval $[d_r, d_{r-1}]$ and likewise a linear function of $\lambda$ over the interval $[\delta_s, \delta_{s-1}]$, the optimal value of $G^{r,s}$ is attained at one of $(\theta, \lambda) \in \{(d_r, \delta_s), (d_{r-1}, \delta_s), (d_r, \delta_{s-1}), (d_{r-1}, \delta_{s-1})\}$. So, for $r \in \{\Gamma_X, \ldots, n+1\}$ and $s \in \{\Gamma_Y, \ldots, m+1\}$, noting that the added terms with indices $i = r$ and $j = s$ vanish at $\theta = d_r$ and $\lambda = \delta_s$:

$$G^{r,s} = \min \Big\{ \Gamma_X d_r + \Gamma_Y \delta_s + \min_{(x,y) \in \Psi} \Big( \sum_{i \in I} c_i x_i + \sum_{i=1}^{r} (d_i - d_r)\, x_i + \sum_{j \in J} b_j y_j + \sum_{j=1}^{s} (\delta_j - \delta_s)\, y_j \Big),$$
$$\Gamma_X d_{r-1} + \Gamma_Y \delta_s + \min_{(x,y) \in \Psi} \Big( \sum_{i \in I} c_i x_i + \sum_{i=1}^{r-1} (d_i - d_{r-1})\, x_i + \sum_{j \in J} b_j y_j + \sum_{j=1}^{s} (\delta_j - \delta_s)\, y_j \Big),$$
$$\Gamma_X d_r + \Gamma_Y \delta_{s-1} + \min_{(x,y) \in \Psi} \Big( \sum_{i \in I} c_i x_i + \sum_{i=1}^{r} (d_i - d_r)\, x_i + \sum_{j \in J} b_j y_j + \sum_{j=1}^{s-1} (\delta_j - \delta_{s-1})\, y_j \Big),$$
$$\Gamma_X d_{r-1} + \Gamma_Y \delta_{s-1} + \min_{(x,y) \in \Psi} \Big( \sum_{i \in I} c_i x_i + \sum_{i=1}^{r-1} (d_i - d_{r-1})\, x_i + \sum_{j \in J} b_j y_j + \sum_{j=1}^{s-1} (\delta_j - \delta_{s-1})\, y_j \Big) \Big\}.$$

Therefore,

$$ROPT_{P3}(\Gamma_X, \Gamma_Y) = \min \Big\{ \Gamma_X d_{\Gamma_X} + \Gamma_Y \delta_{\Gamma_Y} + \min_{(x,y) \in \Psi} \Big( \sum_{i \in I} c_i x_i + \sum_{i=1}^{\Gamma_X} (d_i - d_{\Gamma_X})\, x_i + \sum_{j \in J} b_j y_j + \sum_{j=1}^{\Gamma_Y} (\delta_j - \delta_{\Gamma_Y})\, y_j \Big), \ \ldots,$$
$$\Gamma_X d_r + \Gamma_Y \delta_s + \min_{(x,y) \in \Psi} \Big( \sum_{i \in I} c_i x_i + \sum_{i=1}^{r} (d_i - d_r)\, x_i + \sum_{j \in J} b_j y_j + \sum_{j=1}^{s} (\delta_j - \delta_s)\, y_j \Big), \ \ldots,$$
$$\min_{(x,y) \in \Psi} \Big( \sum_{i \in I} (c_i + d_i)\, x_i + \sum_{j \in J} (b_j + \delta_j)\, y_j \Big) \Big\},$$

which completes the proof.

As a complementary result, observe that if in (P2) the cost vector $c$ is also subject to interval uncertainty (along with the coefficient vector $b$), the corresponding robust counterpart is given by

$$ROPT_{P4}(\Gamma_X, \Gamma_Y) = \min \; \sum_{i \in I} c_i x_i + \Gamma_X \theta + \sum_{i \in I} h_i \tag{15}$$
$$\text{s.t. (3), (4), (7), (8), (9) and } (x, y) \in \Psi. \tag{16}$$

Combining the results of Lemmas 1 and 2, we obtain the following result.

Lemma 4 Given $\Gamma_X \in \{1, \ldots, n\}$ and $\Gamma_Y \in \{1, \ldots, m\}$, the robust problem (15)–(16) can be solved by solving $(n - \Gamma_X + 2)(m - \Gamma_Y + 2)$ nominal problems as follows:

$$ROPT_{P4}(\Gamma_X, \Gamma_Y) = \min_{\substack{r \in \{\Gamma_X, \ldots, n+1\} \\ s \in \{\Gamma_Y, \ldots, m+1\}}} H^{r,s},$$

where for $r \in \{\Gamma_X, \ldots, n+1\}$ and $s \in \{\Gamma_Y, \ldots, m+1\}$:

$$H^{r,s} = \Gamma_X d_r + \min \Big\{ \sum_{i \in I} c_i x_i + \sum_{i=1}^{r} (d_i - d_r)\, x_i \;:\; \sum_{j \in J} b_j y_j + \sum_{j=1}^{s} (\delta_j - \delta_s)\, y_j + \Gamma_Y \delta_s \leq B,\ (x, y) \in \Psi \Big\}.$$

We omit the proof of this result, as it follows from the proofs of Lemmas 2 and 3.

3. General Result

In light of Lemmas 3 and 4, we now generalize the previous results by considering a more general Combinatorial Optimization problem under interval uncertainty, and we propose a combinatorial framework to solve its robust counterpart. Let us consider the case in which the set of binary variables is partitioned into $K + L$ subsets $(x^1, \ldots, x^K, y^1, \ldots, y^L)$, associated with sets of indices $(I^1, \ldots, I^K, J^1, \ldots, J^L)$. Variables $(x^1, \ldots, x^K)$ appear in the objective function with non-negative cost vectors $(c^1, \ldots, c^K)$, and variables $(y^1, \ldots, y^L)$ appear in $L$ disjoint knapsack constraints with non-negative coefficients $(b^1, \ldots, b^L)$ and non-negative right-hand-side bounds $(B^1, \ldots, B^L)$. Let $\Psi$ be a generic polyhedron containing the feasibility conditions for $(x^1, \ldots, x^K, y^1, \ldots, y^L)$. With these elements we define the nominal problem (P5) as

$$OPT_{P5} = \min_{(x^1, \ldots, y^L) \in \Psi} \Big\{ \sum_{i \in I^1} c^1_i x^1_i + \ldots + \sum_{i \in I^K} c^K_i x^K_i \;:\; \sum_{j \in J^1} b^1_j y^1_j \leq B^1, \ \ldots, \ \sum_{j \in J^L} b^L_j y^L_j \leq B^L \Big\}. \tag{P5}$$

We assume now that all elements of the cost vectors $(c^1, \ldots, c^K)$ and all elements of the knapsack coefficients $(b^1, \ldots, b^L)$ are subject to interval uncertainty: the cost coefficient of variable $x^k_i$ is taken from $[c^k_i, c^k_i + d^k_i]$, for each $i \in I^k$ and $k \in \mathcal{K} = \{1, \ldots, K\}$, and the coefficient of variable $y^l_j$ is taken from $[b^l_j, b^l_j + \delta^l_j]$, for each $j \in J^l$ and $l \in \mathcal{L} = \{1, \ldots, L\}$. Assume that the variables $(x^1, \ldots, y^L)$ are ordered so that $d^k_i \geq d^k_{i+1}$ and $d^k_{|I^k|+1} = 0$, for all $i \in I^k$ and $k \in \mathcal{K}$, and $\delta^l_j \geq \delta^l_{j+1}$ and $\delta^l_{|J^l|+1} = 0$, for all $j \in J^l$ and $l \in \mathcal{L}$. To each set of cost coefficients we associate a level of conservatism $0 \leq \Gamma^k_X \leq |I^k|$, for all $k \in \mathcal{K}$, and to each knapsack constraint we associate a level of conservatism $0 \leq \Gamma^l_Y \leq |J^l|$, for all $l \in \mathcal{L}$. The following theorem unifies the previous results.

Theorem 1 For given $0 \leq \Gamma^k_X \leq |I^k|$, for all $k \in \mathcal{K}$, and $0 \leq \Gamma^l_Y \leq |J^l|$, for all $l \in \mathcal{L}$, the robust counterpart of (P5), $ROPT_{P5}(\Gamma^1_X, \ldots, \Gamma^K_X, \Gamma^1_Y, \ldots, \Gamma^L_Y)$, can be computed by solving

$$\prod_{k \in \mathcal{K}} \big( |I^k| - \Gamma^k_X + 2 \big) \prod_{l \in \mathcal{L}} \big( |J^l| - \Gamma^l_Y + 2 \big)$$

nominal problems, given by

$$ROPT_{P5}(\Gamma^1_X, \ldots, \Gamma^K_X, \Gamma^1_Y, \ldots, \Gamma^L_Y) = \min_{\substack{r^1 \in \{\Gamma^1_X, \ldots, |I^1|+1\},\, \ldots,\, r^K \in \{\Gamma^K_X, \ldots, |I^K|+1\} \\ s^1 \in \{\Gamma^1_Y, \ldots, |J^1|+1\},\, \ldots,\, s^L \in \{\Gamma^L_Y, \ldots, |J^L|+1\}}} F^{(r^1, \ldots, r^K, s^1, \ldots, s^L)},$$

where for $r^1 \in \{\Gamma^1_X, \ldots, |I^1|+1\}, \ldots, r^K \in \{\Gamma^K_X, \ldots, |I^K|+1\}$ and $s^1 \in \{\Gamma^1_Y, \ldots, |J^1|+1\}, \ldots, s^L \in \{\Gamma^L_Y, \ldots, |J^L|+1\}$, we have

$$F^{(r^1, \ldots, r^K, s^1, \ldots, s^L)} = \Gamma^1_X d^1_{r^1} + \ldots + \Gamma^K_X d^K_{r^K} + \min_{(x^1, \ldots, y^L) \in \Psi} \big\{ \varphi^1(r^1) + \ldots + \varphi^K(r^K) \;:\; \xi^1(s^1) \leq B^1, \ldots, \xi^L(s^L) \leq B^L \big\},$$

such that

$$\varphi^k(r^k) = \sum_{i \in I^k} c^k_i x^k_i + \sum_{i=1}^{r^k} \big( d^k_i - d^k_{r^k} \big)\, x^k_i, \quad k \in \mathcal{K},$$

and

$$\xi^l(s^l) = \sum_{j \in J^l} b^l_j y^l_j + \sum_{j=1}^{s^l} \big( \delta^l_j - \delta^l_{s^l} \big)\, y^l_j + \Gamma^l_Y \delta^l_{s^l}, \quad l \in \mathcal{L}.$$
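As a quick numerical check of the count stated in the theorem (with hypothetical block sizes, not taken from the paper), the nested minimization ranges over exactly $\prod_{k}(|I^k| - \Gamma^k_X + 2)\prod_{l}(|J^l| - \Gamma^l_Y + 2)$ index tuples:

```python
from itertools import product
from math import prod

def num_nominal_problems(I_sizes, gammas_x, J_sizes, gammas_y):
    """Number of nominal problems solved by the scheme of Theorem 1."""
    return prod(n - g + 2 for n, g in zip(I_sizes, gammas_x)) * \
           prod(m - g + 2 for m, g in zip(J_sizes, gammas_y))

# Hypothetical instance: K = 2 cost blocks, L = 1 knapsack block.
I_sizes, gammas_x = [10, 8], [4, 3]
J_sizes, gammas_y = [6], [2]
print(num_nominal_problems(I_sizes, gammas_x, J_sizes, gammas_y))  # 8*7*6 = 336

# The same count, obtained by enumerating the tuples (r^1, ..., r^K, s^1, ..., s^L),
# each index r^k ranging over {Gamma^k_X, ..., |I^k| + 1} (analogously for s^l):
ranges = [range(g, n + 2) for n, g in zip(I_sizes, gammas_x)] + \
         [range(g, m + 2) for m, g in zip(J_sizes, gammas_y)]
assert sum(1 for _ in product(*ranges)) == 336
```

For fixed $K$ and $L$ this count is a polynomial in the instance size, which is the strong polynomiality asserted in the theorem.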

Proof The robust counterpart of (P5) can be written as

$$ROPT_{P5}(\Gamma^1_X, \ldots, \Gamma^K_X, \Gamma^1_Y, \ldots, \Gamma^L_Y) = \min \; \sum_{k \in \mathcal{K}} \Big( \sum_{i \in I^k} c^k_i x^k_i + \Gamma^k_X \theta^k + \sum_{i \in I^k} h^k_i \Big) \tag{17}$$
$$\text{s.t. } \sum_{j \in J^l} b^l_j y^l_j + \Gamma^l_Y \lambda^l + \sum_{j \in J^l} k^l_j \leq B^l, \quad l \in \mathcal{L} \tag{18}$$
$$h^k_i + \theta^k \geq d^k_i x^k_i \text{ and } \theta^k \geq 0, \quad i \in I^k,\ k \in \mathcal{K} \tag{19}$$
$$k^l_j + \lambda^l \geq \delta^l_j y^l_j \text{ and } \lambda^l \geq 0, \quad j \in J^l,\ l \in \mathcal{L} \tag{20}$$
$$h^k_i \geq 0, \quad i \in I^k,\ k \in \mathcal{K} \tag{21}$$
$$k^l_j \geq 0, \quad j \in J^l,\ l \in \mathcal{L}. \tag{22}$$

From Lemmas 1 and 2, one can show by mathematical induction that any optimal solution of (17)–(22) satisfies $\theta^k \in [0, d^k_{\Gamma^k_X}]$, for each $k \in \mathcal{K}$, and $\lambda^l \in [0, \delta^l_{\Gamma^l_Y}]$, for each $l \in \mathcal{L}$. Finally, mathematical induction is applied to the previously used decomposition approach to derive the stated scheme for computing $ROPT_{P5}(\Gamma^1_X, \ldots, \Gamma^L_Y)$.

As stressed in the Introduction, several Combinatorial Optimization problems are particular cases of (P5), and when interval uncertainty in their parameters is brought into play, the algorithmic procedure described by Theorem 1 can be an alternative for solving their robust counterparts.

Acknowledgements I. Ljubić is supported by the APART Fellowship of the Austrian Academy of Sciences. This support is greatly acknowledged. E. Álvarez-Miranda thanks the Institute of Advanced Studies of the Università di Bologna, where he is a PhD Fellow.

References

[Bertsimas and Sim(2003)] D. Bertsimas and M. Sim. Robust discrete optimization and network flows. Mathematical Programming, B(98):49–71, 2003.