Separation, Propagation, Heuristics


Separation, Propagation, Heuristics: Selected tricks from an advanced MINLP solver. Ambros Gleixner, Zuse Institute Berlin. Joint work with T. Berthold, B. Müller, F. Serrano, R. Schwarz, and S. Weltge. MINO / COST Spring School on Mixed Integer Nonlinear Programming and Applications, Paris, 8 April 2016.

separation: tight outer approximation

context: Several methods exist to solve convex mixed-integer nonlinear problems: outer approximation [Duran and Grossmann, 1986, Yuan et al., 1988, Fletcher and Leyffer, 1994], extended cutting plane [Westerlund and Petterson, 1995], LP/NLP-based branch-and-bound [Quesada and Grossmann, 1992, Bonami et al., 2008, Abhishek et al., 2010, Achterberg, 2007, Berthold et al., 2009], extended supporting hyperplane [Kronqvist et al., 2015]. Crucial technique: outer-approximate the feasible region by linearization.

the issue: Example: consider the region defined by f(x) = x^2 <= 1 and the point x* = 2 that we want to separate (figure: the interval [-1, 1] and the point x* = 2). Separating hyperplane: f(x*) + ∇f(x*)(x - x*) <= 1, i.e. 4 + 4(x - 2) <= 1, i.e. x <= 5/4.
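
A minimal sketch (my own illustration, not from the slides) of how such a gradient cut is generated numerically; the function, gradient, and reference point are the ones from the example above:

```python
# Gradient ("linearization") cut for a convex constraint f(x) <= c at a
# reference point xref:  f(xref) + f'(xref) * (x - xref) <= c.
# Example from the slide: f(x) = x^2, c = 1, xref = 2.

def gradient_cut(f, fprime, xref, c):
    """Return (slope, rhs) of the cut  slope * x <= rhs."""
    slope = fprime(xref)
    # f(xref) + slope*(x - xref) <= c   <=>   slope*x <= c - f(xref) + slope*xref
    rhs = c - f(xref) + slope * xref
    return slope, rhs

slope, rhs = gradient_cut(lambda x: x * x, lambda x: 2 * x, 2.0, 1.0)
print(slope, rhs)   # 4.0, 5.0  ->  4x <= 5, i.e. x <= 5/4 as on the slide
```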

the issue: Example: f(x) = x_1^2 + x_2^2 <= 1 and the point (x_1*, x_2*) = (3/2, 3/2). Separating hyperplane: f(x*) + ∇f(x*)^T (x - x*) <= 1, i.e. x_1 + x_2 <= 11/6; again the cut does not support the feasible region.

overview: We will take a constraint-wise perspective and focus on convex quadratic functions. Three approaches: 1. pushing the hyperplane, 2. changing the reference point, 3. changing the formulation.

pushing the hyperplane

pushing the hyperplane: Region S = {x : f(x) <= c} and x̄ ∉ S, f convex. Separating hyperplane: f(x̄) + ∇f(x̄)(x - x̄) <= c. Pushing the hyperplane: max_{x ∈ R^n} ∇f(x̄)^T x s.t. f(x) <= c. Relatively expensive in general, simple for quadratics f(x) = x^T A x + b^T x. Problem: makes sense only for strictly convex quadratics, or convex quadratics such that b ∈ Range(A).
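
For a strictly convex quadratic the pushed cut can be computed in closed form by completing the square; the following sketch is my own illustration (assuming A symmetric positive definite) of maximizing ∇f(x̄)^T x over {f(x) <= c}:

```python
import numpy as np

def push_hyperplane(A, b, c, xbar):
    """Max of g^T x over {x^T A x + b^T x <= c} with g = grad f(xbar),
    assuming A is symmetric positive definite (strictly convex case)."""
    g = 2 * A @ xbar + b                       # normal of the gradient cut
    Ainv_b = np.linalg.solve(A, b)
    shift = -0.5 * Ainv_b                      # minimizer of f
    rho = c + 0.25 * b @ Ainv_b                # radius of y^T A y after the shift
    Ainv_g = np.linalg.solve(A, g)
    y = np.sqrt(rho / (g @ Ainv_g)) * Ainv_g   # maximizer of g^T y over y^T A y <= rho
    xstar = y + shift
    return g, g @ xstar                        # pushed cut: g^T x <= g^T xstar

A = np.eye(2)
b = np.zeros(2)
g, rhs = push_hyperplane(A, b, 1.0, np.array([1.5, 1.5]))
print(g, rhs)   # normal (3, 3), rhs = 3*sqrt(2): x1 + x2 <= sqrt(2), now supporting
```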

when is the separating hyperplane supporting? Given f representing S and x̄ ∉ S: when is the separating hyperplane of f at x̄ supporting? Proposition: let f : R^n → R be a convex function and S = {x ∈ R^n : f(x) <= c}. Let x̄ ∉ S and suppose f is differentiable at x̄. Then the valid inequality (for S) f(x̄) + ∇f(x̄)(x - x̄) <= c is a supporting hyperplane of S if and only if there exists x_0 ∈ S such that f(x_0 + λ(x̄ - x_0)) is affinely linear in λ. In other words, if and only if there is a segment joining x̄ to the boundary of S on which f is affinely linear.

observations: Functions with purely linear terms always satisfy this. Example: x^2 - y <= 0. Convex quadratics f(x) = x^T A x + b^T x with b ∈ Range(A) never satisfy this. Example: x^2 - 2xy + y^2 <= 1.

changing the reference point

changing the reference point i: Goal: move the point x̄ to be separated onto the boundary ∂S. Subgradient projection [Censor and Lent, 1980]: compute the separating hyperplane for x_0 = x̄; let x_1 be the projection of x_0 onto the separating hyperplane, and repeat. This defines the iteration x_{k+1} = x_k - λ_k f(x_k) ∇f(x_k) / ||∇f(x_k)||^2 (with λ_k = 1 for the plain projection). Stop after a few iterations, or as soon as the distance from x_0 to the current hyperplane decreases. Variations [Haugazeau, 1968, Bauschke and Combettes, 2001] consider projecting x_0 onto the hyperplane that separates x_k.
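
A small sketch (my own illustration, with the constraint written as f(x) - c <= 0) of the basic iteration on the earlier example f(x_1, x_2) = x_1^2 + x_2^2 <= 1; the cut would then be generated at the near-boundary point the iteration reaches:

```python
import numpy as np

def project_to_boundary(f, grad, x0, c, iters=5):
    """Iterate x_{k+1} = x_k - (f(x_k) - c) * grad(x_k) / ||grad(x_k)||^2."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        x = x - (f(x) - c) * g / (g @ g)
    return x                                  # reference point for the next gradient cut

f = lambda x: x @ x
grad = lambda x: 2 * x
xref = project_to_boundary(f, grad, [1.5, 1.5], 1.0)
print(xref, f(xref))   # close to the boundary f = 1, giving a (nearly) supporting cut
```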

changing the reference point ii: Goal: move the point x̄ to be separated onto the boundary ∂S. Line search towards an interior point; this requires computing an interior point. Compare the extended supporting hyperplane method [Kronqvist et al., 2015]: line search with one global interior point for all convex constraints.

changing the formulation

is there a good representation? Such a representation should behave like a cone: assume 0 ∈ S and consider S × {1} ⊆ R^{n+1} together with the rays from the origin through S × {1}. Gauge function of S (0 ∈ S): φ_S(x) = inf{ t : t > 0, x ∈ tS }. The gauge φ_S is convex, positively homogeneous (φ_S(λx) = λ φ_S(x) for λ > 0), and S = {x : φ_S(x) <= 1}. φ_S(x) measures the relative distance from x to ∂S in the direction of x.

how to compute the gauge? Let S = {x : f(x) <= c} and s_0 ∈ int S. The function that represents S is φ_{S,s_0}(x) = φ_{S - s_0}(x - s_0). If x̄ ∉ S, then there is σ ∈ (0, 1) such that f(s_0 + σ(x̄ - s_0)) = c and φ_{S,s_0}(x̄) = 1/σ. Such a σ can be found using Newton's method, binary search, etc.
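
A sketch of this line-search step (my own illustration): find σ with f(s_0 + σ(x̄ - s_0)) = c by bisection, which yields the gauge value 1/σ:

```python
def gauge_by_bisection(f, s0, xbar, c, tol=1e-9):
    """Assumes f(s0) < c < f(xbar): find sigma in (0, 1) with
    f(s0 + sigma*(xbar - s0)) = c and return phi = 1/sigma."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        point = [s + mid * (x - s) for s, x in zip(s0, xbar)]
        if f(point) <= c:
            lo = mid
        else:
            hi = mid
    return 1.0 / (0.5 * (lo + hi))

f = lambda p: p[0] ** 2 + p[1] ** 2            # unit disk, c = 1
print(gauge_by_bisection(f, [0.0, 0.0], [1.5, 1.5], 1.0))   # about 2.121 = ||(1.5, 1.5)||
```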

gauge function for quadratic sets: Let f(x) = x^T A x + b^T x, S = {x : f(x) <= c} and c > 0. Then φ_S(x) = (b^T x + sqrt((b^T x)^2 + 4c x^T A x)) / (2c). If s_0 ∈ int S, then after some algebra φ_{S,s_0}(x) = (b_φ^T x - f(s_0) - c_φ + sqrt((b_φ^T x - f(s_0) - c_φ)^2 + 4(c - f(s_0))(f(x) - b_φ^T x + c_φ))) / (2(c - f(s_0))), where b_φ = b + 2 A s_0 and c_φ = s_0^T A s_0. One just needs to store a vector and a couple of reals.
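
A direct transcription of these closed-form gauges into code (my own sketch, assuming A positive semidefinite and c > 0 so that 0 is an interior point, and the shifted data b_φ, c_φ as defined above):

```python
import numpy as np

def gauge_quadratic(A, b, c, x):
    """phi_S(x) for S = {x : x^T A x + b^T x <= c}, c > 0."""
    bx = b @ x
    return (bx + np.sqrt(bx ** 2 + 4.0 * c * (x @ A @ x))) / (2.0 * c)

def gauge_shifted(A, b, c, s0, x):
    """phi_{S,s0}(x) using only b_phi = b + 2*A*s0, c_phi = s0^T A s0 and f(s0)."""
    f = lambda y: y @ A @ y + b @ y
    b_phi = b + 2.0 * A @ s0
    c_phi = s0 @ A @ s0
    num = b_phi @ x - f(s0) - c_phi
    disc = num ** 2 + 4.0 * (c - f(s0)) * (f(x) - b_phi @ x + c_phi)
    return (num + np.sqrt(disc)) / (2.0 * (c - f(s0)))

A = np.eye(2); b = np.zeros(2)
print(gauge_quadratic(A, b, 1.0, np.array([1.5, 1.5])))                       # ~2.121
print(gauge_shifted(A, b, 1.0, np.array([0.2, 0.1]), np.array([1.5, 1.5])))   # ~2.426
```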

representation vs. reference point: Every optimal representation is a rule for changing points: this follows from the fact that, when x ∉ S, the point (x - s_0)/φ_{S,s_0}(x) + s_0 lies on the boundary of S. This is nice, since for practical reasons we might not want to change the constraint f(x) <= c to φ(x) <= 1 (numerical tolerances, etc.).

first computational results: Is there any potential in some of these ideas? Setup: test set of 238 instances from MINLPLib2 with at least one convex quadratic constraint after presolve; compare SCIP 3.2 with gauge (only for quadratics) vs. SCIP without gauge; 2 hours time limit. Strategy: use any interior point (via Ipopt) and treat the linear part as constant. Results: +1 instance solved; 108 instances solved by both versions; 10% fewer nodes; 13% speedup.

propagation: obbt

bound tightening everywhere: Domain reduction procedures appear in many areas: artificial intelligence, constraint programming, satisfiability testing, linear and integer programming, global optimization and mixed-integer nonlinear programming. General advantage: smaller domains give a smaller search space. Specifically for nonconvex MINLP, where we branch on continuous variables/infinite domains: tight domains give a tight relaxation.

fbbt and obbt

feasibility-based bound tightening: FBBT performs interval arithmetic on an expression graph [Messine 1997, Schichl and Neumaier 2005]. Example: two nonlinear constraints over x, y ∈ [1, 16], one with right-hand side 7 and one with range [0, 2]; bounds are first propagated forward through the expression graph and then backward from the constraint sides, tightening the intervals of the variables and of the intermediate nodes (expression-graph figure with the forward and backward intervals omitted). Even for linear constraints, iterative FBBT may stall [see Belotti, Cafieri, Lee, Liberti 2010].
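
A toy sketch of the mechanism (my own illustration on a simpler constraint than the one in the slide): forward interval evaluation of x + y^2 with x, y ∈ [1, 16], intersection with the constraint range, then backward propagation to tighten x and y:

```python
import math

def fbbt_step(x_bounds, y_bounds, rhs):
    """One forward/backward pass for the constraint x + y^2 <= rhs, assuming y >= 0."""
    (lx, ux), (ly, uy) = x_bounds, y_bounds
    sq = (ly * ly, uy * uy)                           # forward: interval of y^2
    expr = (lx + sq[0], min(ux + sq[1], rhs))         # forward: x + y^2, intersected with <= rhs
    uy_new = min(uy, math.sqrt(max(rhs - lx, 0.0)))   # backward: y^2 <= rhs - l_x
    ux_new = min(ux, rhs - sq[0])                     # backward: x <= rhs - min(y^2)
    return expr, (lx, ux_new), (ly, uy_new)

print(fbbt_step((1.0, 16.0), (1.0, 16.0), 7.0))
# expression in [2, 7], x tightened to [1, 6], y tightened to [1, sqrt(6)]
```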

optimization-based bound tightening: [l_k, u_k] ← min/max { x_k : g_1(x), ..., g_m(x) <= 0, x ∈ [l, u], (x_j ∈ Z for j ∈ I ?) }, in practice applied to a relaxation: min/max { x_k : Ax >= b, c^T x <= z̄, x ∈ [l, u] }. OBBT is a standard technique in nonconvex MINLP algorithms: the constraint c^T x <= z̄ is the objective cutoff constraint, making OBBT an optimality-based procedure; it exploits dependencies between (linear) constraints; and it is polynomial, but comparatively expensive. [Quesada and Grossmann 1993, Maranas and Floudas 1997, Smith and Pantelides 1999, Zamora and Grossmann 1999, Adjiman et al. 1998, 2000, Nowak and Vigerske 2006, Belotti et al. 2009, Caprara and Locatelli 2010, Misener and Floudas 2012, Gleixner and Weltge 2013, ...]
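
A compact sketch (my own illustration, not SCIP code) of the basic OBBT loop over a linear relaxation using scipy; here A_ub, b_ub are assumed to encode the relaxation rows plus the objective cutoff row in <=-form:

```python
import numpy as np
from scipy.optimize import linprog

def obbt(A_ub, b_ub, lb, ub):
    """Minimize and maximize each variable over {A_ub x <= b_ub, lb <= x <= ub}."""
    n = len(lb)
    lb, ub = np.array(lb, float), np.array(ub, float)
    for k in range(n):
        for sense in (+1, -1):                        # +1: tighten lower bound, -1: upper bound
            c = np.zeros(n)
            c[k] = sense
            res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                          bounds=list(zip(lb, ub)), method="highs")
            if res.success:
                if sense > 0:
                    lb[k] = max(lb[k], res.x[k])
                else:
                    ub[k] = min(ub[k], res.x[k])
    return lb, ub

# toy relaxation: x0 + x1 <= 1 and objective cutoff -x0 <= 0, boxes [-10, 10]
print(obbt(np.array([[1.0, 1.0], [-1.0, 0.0]]), np.array([1.0, 0.0]), [-10, -10], [10, 10]))
```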

bound filtering for obbt

bound filtering for obbt: Simple observation: if an LP-feasible solution x̂ ∈ {x : Ax >= b, c^T x <= z̄, x ∈ [l, u]} is tight at a bound, x̂_k ∈ {l_k, u_k}, then this bound cannot be tightened by OBBT. This saves LP solves that would bring no bound-tightening success. Simple filtering inspects available solutions: the optimum of the LP relaxation (which satisfies the objective cutoff) filters about 17% of the bounds, and the OBBT-LP solutions encountered along the way another 37%. Aggressive filtering looks for solutions that are tight at many bounds by minimizing/maximizing Σ_i x_i and filters another 24%. Overall, filtering significantly reduces the number of LP solves, by 78%.
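
The filtering idea in a few lines (my own sketch): any LP-feasible point that already attains a bound proves that OBBT cannot improve that bound, so the corresponding LP solve can be skipped:

```python
def filter_bounds(solutions, lb, ub, tol=1e-9):
    """Return the set of (index, side) bounds that can be skipped by OBBT
    because some LP-feasible solution already attains them."""
    skip = set()
    for x in solutions:              # e.g. LP-relaxation optimum, earlier OBBT-LP solutions
        for k, xk in enumerate(x):
            if xk <= lb[k] + tol:
                skip.add((k, "lower"))
            if xk >= ub[k] - tol:
                skip.add((k, "upper"))
    return skip

print(filter_bounds([[0.0, 2.5], [1.0, 5.0]], lb=[0.0, 0.0], ub=[1.0, 5.0]))
# {(0, 'lower'), (0, 'upper'), (1, 'upper')}: only the lower bound of x1 needs an OBBT LP
```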

lagrangian variable bounds

lagrangian variable bounds: A compact explanation for OBBT's bound tightenings. Consider the OBBT LP min x_k over Ax >= b with optimal dual multipliers λ >= 0. Then λ^T A x >= λ^T b, i.e. Σ_i a_i x_i >= λ^T b with a^T := λ^T A, and hence x_k >= (1 - a_k) x_k - Σ_{i ≠ k} a_i x_i + λ^T b, which we write as x_k >= Σ_i r_i x_i + λ^T b. If l_k is tightened, then a_k = 1 and r_k = 0 (complementary slackness). With multiplier μ <= 0 for the objective cutoff constraint c^T x <= z̄ this becomes the Lagrangian variable bound (LVB) x_k >= Σ_i r_i x_i + μ z̄ + λ^T b; relaxing each variable to its bound according to the sign of r_i gives x_k >= Σ_{i: r_i > 0} r_i l_i + Σ_{i: r_i < 0} r_i u_i + μ z̄ + λ^T b.
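
A sketch (my own illustration) of how an LVB learned from one OBBT LP can be stored and later evaluated against the current local bounds; r, lam_b, and mu are the reduced coefficients and dual data described above:

```python
import numpy as np

def lvb_value(r, lam_b, mu, zbar, lb, ub):
    """Evaluate the Lagrangian variable bound
       x_k >= sum_i r_i x_i + mu * zbar + lam^T b
    by relaxing each x_i to its bound according to the sign of r_i."""
    r, lb, ub = map(np.asarray, (r, lb, ub))
    rhs = lam_b + mu * zbar
    rhs += np.sum(np.where(r > 0, r * lb, r * ub))   # r_i > 0 -> use l_i, r_i < 0 -> use u_i
    return rhs                                       # candidate new lower bound for x_k

# one LVB: x_2 >= 0.5*x_0 - 0.25*x_1 - 0.1*zbar + 3.0
print(lvb_value(r=[0.5, -0.25, 0.0], lam_b=3.0, mu=-0.1, zbar=10.0,
                lb=[1.0, 0.0, 0.0], ub=[4.0, 2.0, 5.0]))   # 3.0 - 1.0 + 0.5 - 0.5 = 2.0
```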

propagating lagrangian variable bounds: The right-hand side of x_k >= r^T x + μ z̄ + λ^T b becomes tighter if some l_i increases for r_i > 0, if some u_i decreases for r_i < 0, or if a better primal solution is found and μ < 0. Learn LVBs during root OBBT and propagate them again locally at the nodes of the branch-and-bound tree and globally whenever a better primal solution is found; compare the duality-based reduction of [Tawarmalani and Sahinidis 2004]. This promises a computationally cheap approximation of OBBT in the tree.

computational experiments

computational experiments: How many nontrivial LVBs can be generated? On 211 instances from MINLPLib, full OBBT for the variables in nonlinear constraints is run once at the root node, and we count the LVBs with some r_i ≠ 0, i ≠ k, or μ ≠ 0. Histogram of the rate of generated LVBs per OBBT LP over the test set (figure omitted): always at least 15%, and at least 50% on 132 instances.

examples: "SCIP> read LiCrudeOil_ex03", "SCIP> optimize", "SCIP> display statistics" prints the propagator statistics (#Propagate, #ResProp, Cutoffs, DomReds) and propagator timings (TotalTime, SetupTime, Propagate, ResProp) for genvbounds, obbt, probing, pseudoobj, redcost, rootredcost, and vbounds; a second example does the same for kallrath_circlesc6ax (the numeric columns of both tables are not preserved in this transcription).

propagating lagrangian variable bounds: Only propagate from the right-hand side to the left-hand side variable, e.g. for the LVBs x_3 >= x_1 - 2x_2 - x_4 and x_5 <= x_4 - x_3 + 7z̄. Build a directed dependency graph on the lower bounds l_1, ..., l_n, the upper bounds u_1, ..., u_n, and the incumbent value z̄, with an arc from each bound to every bound whose LVB right-hand side it appears in. Propagate in (almost) topological order: this reaches the fixed point in one sweep if the graph is acyclic, and only connected components containing bound changes need to be propagated. Compared to plain propagation, the sorted (topological) order needs 68% fewer propagations, finds the same number of domain reductions (+0%) and 3% fewer cutoffs, and is 59% faster; it always takes at most 2% of the total running time (except for two easy instances).
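
A sketch (my own illustration) of propagating a set of such bounds in topological order; each LVB is stored as (target index, coefficient vector r, constant), and the order is assumed to come from a topological sort of the bound-dependency graph:

```python
def propagate_lvbs(lvbs, lb, ub, order):
    """lvbs: dict target_k -> (r, const) encoding x_k >= r^T x + const.
    Process targets in (topological) order, tightening lower bounds in one sweep."""
    lb, ub = list(lb), list(ub)
    for k in order:
        r, const = lvbs[k]
        new_lk = const + sum(ri * (lb[i] if ri > 0 else ub[i]) for i, ri in enumerate(r))
        lb[k] = max(lb[k], new_lk)        # the tightened l_k feeds later targets in the order
    return lb, ub

# x_1 >= x_0 + 0.5 and x_2 >= 2*x_1 - 1: processing x_1 before x_2 reaches the
# fixed point in a single sweep.
lvbs = {1: ([1.0, 0.0, 0.0], 0.5), 2: ([0.0, 2.0, 0.0], -1.0)}
print(propagate_lvbs(lvbs, lb=[1.0, 0.0, 0.0], ub=[5.0, 5.0, 5.0], order=[1, 2]))
```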

computational results: What is the performance impact of OBBT and LVBs? SCIP 3.1 without OBBT, with OBBT only, and with OBBT+LVB, on 605 mixed-integer and 347 continuous MINLPs from MINLPLib2. Solved instances: with OBBT +5 mixed-integer / +4 NLP; with OBBT+LVB +9 mixed-integer / +4 NLP. Performance of OBBT+LVB vs. default: mixed-integer 23% faster, continuous 65% faster; harder instances (at least 100 s): about 18% faster on both test sets; often a slowdown on easy instances, but a game changer for hard/unsolved instances.

primal heuristics: undercover

the motivation: Large neighborhood search heuristics: common MIP heuristics fix variables to obtain an easy subproblem and solve it; for MIP, easy means few integralities, for MINLP, easy means few nonlinearities. Observation: any MINLP can be reduced to a MIP by fixing (sufficiently many) variables. Idea: fix a minimum number of variables to obtain a sub-MIP, using the solution of the LP/NLP relaxation as fixing values.

a simple example: max x_2 + x_3 s.t. x_1 + x_2 + x_3^2 <= 4, x_1, x_2, x_3 >= 0, x_1, x_2 ∈ Z. Fixing x_3 to any value within its bounds yields a linear subproblem.

covers of an minlp: Definition: let a domain box [L, U] = Π_i [L_i, U_i], constraint functions g_j : [L, U] → R, x ↦ g_j(x), and a set C ⊆ N = {1, ..., n} of variable indices be given. We call C a cover of g_j if and only if for all x̌ ∈ [L, U] the set {(x, g_j(x)) : x ∈ [L, U], x_k = x̌_k for all k ∈ C} is an affine set intersected with [L, U] × R. We call C a cover of the MINLP if and only if C is a cover of g_1, ..., g_m.

co-occurrence graph: Definition: let P be an MINLP with g_1, ..., g_m twice continuously differentiable on the interior of [L, U]. We call G_P = (V_P, E_P) the co-occurrence graph of P, with node set V_P = {1, ..., n} and edge set E_P = {ij : i, j ∈ V_P, ∃ k ∈ {1, ..., m} : ∂²g_k(x)/∂x_i ∂x_j ≢ 0}. Example: for min ... s.t. s_1 t_i >= a_i for all i = 1, ..., S and s_j t_1 >= b_j for all j = 1, ..., T, the co-occurrence graph is a double star joining s_1 to t_1, ..., t_T and t_1 to s_1, ..., s_S. Theorem [Berthold and G. 2010, 2014]: C ⊆ {1, ..., n} is a cover of P if and only if it is a vertex cover of the co-occurrence graph G_P. Corollary: computing a minimum cover of an MINLP is NP-hard.
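
A sketch (my own illustration) of how the edge set can be collected when each constraint is given by its list of nonlinear (here: bilinear/quadratic) terms; Hessian sparsity would give the same information for general twice-differentiable constraints:

```python
from itertools import combinations

def cooccurrence_graph(nonlinear_terms):
    """nonlinear_terms: per constraint, a list of tuples of variable indices that appear
    together in a nonlinear term, e.g. (i, j) for x_i*x_j or (i, i) for x_i^2."""
    loops, edges = set(), set()
    for terms in nonlinear_terms:
        for term in terms:
            for i, j in combinations(sorted(set(term)), 2):
                edges.add((i, j))
            if len(set(term)) == 1:       # square term -> loop ii
                loops.add(term[0])
    return loops, edges

# constraints:  x0*x1 + x2^2 <= 1   and   x1*x3 <= 2
print(cooccurrence_graph([[(0, 1), (2, 2)], [(1, 3)]]))
# loops {2}, edges {(0, 1), (1, 3)}
```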

computing a minimum cover: Introduce auxiliary binary variables α_k = 1 ⇔ x_k is fixed in P. Then C(α) = {k : α_k = 1} is a cover of P if and only if α_k = 1 for all loops kk ∈ E_P (1) and α_k + α_j >= 1 for all edges kj ∈ E_P, k > j (2). Covering problem: min { Σ_{k=1}^n α_k : (1), (2), α ∈ {0, 1}^n } (3).
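
The covering problem is a standard vertex-cover IP; as a cheap stand-in (my own sketch, not the solver's method), the classic maximal-matching 2-approximation already produces a valid, usually small cover:

```python
def greedy_vertex_cover(loops, edges):
    """2-approximate vertex cover: all loop nodes plus both endpoints of a maximal matching."""
    cover = set(loops)                    # alpha_k = 1 forced by loops kk
    for i, j in edges:
        if i not in cover and j not in cover:
            cover.add(i)                  # take both endpoints of an uncovered edge
            cover.add(j)
    return cover                          # fixing these variables turns the MINLP into a MIP

print(greedy_vertex_cover(loops={2}, edges={(0, 1), (1, 3)}))   # e.g. {0, 1, 2}
```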

the undercover heuristic:
Input: MINLP P.
1. compute a solution x̄ of an approximation of P;
2. round x̄_k for all k ∈ I;
3. determine a cover C of P;
4. solve the sub-MIP of P given by fixing x_k = x̄_k for all k ∈ C.
Remark: MIP heuristics face a trade-off between fixing many vs. few variables; here the goal is to eliminate the nonlinearities by fixing as few variables as possible, hence the minimum cover.

optimization matters: The co-occurrence graph of the bilinear program min ... s.t. s_1 t_i >= a_i for all i = 1, ..., S, s_j t_1 >= b_j for all j = 1, ..., T is a double star centered at s_1 and t_1. The cover {s_1, ..., s_S} of complicating variables may be arbitrarily large compared to the minimum cover {s_1, t_1}.

making it fly: Fix-and-propagate [compare Fischetti and Salvagnin 2009]: fix the variables sequentially and tighten bounds after each fixing; project fixing values onto the tightened domains. Backtrack: try alternative fixing values if infeasible. Analyze infeasibility: learn conflict/nogood constraints. NLP postprocessing: fix the integer variables to the sub-MIP solution and solve the resulting sub-NLP to local optimality.
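
A schematic fix-and-propagate loop (my own sketch; the propagate() callback stands in for domain propagation such as FBBT and is assumed to return the tightened domains or None on infeasibility):

```python
def fix_and_propagate(cover, values, domains, propagate):
    """Fix the cover variables one by one, propagating after each fixing and
    backtracking to an alternative value (here: a domain endpoint) if infeasible."""
    for k in cover:
        lo, hi = domains[k]
        candidates = [min(max(values[k], lo), hi), lo, hi]   # projected fixing value first
        for v in candidates:
            trial = dict(domains)
            trial[k] = (v, v)
            tightened = propagate(trial)
            if tightened is not None:                        # feasible so far
                domains = tightened
                break
        else:
            return None                                      # all candidate fixings failed
    return domains                                           # remaining problem is a MIP

identity = lambda d: d                                       # trivial stand-in propagator
print(fix_and_propagate([2], {2: 0.7}, {0: (0, 1), 1: (0, 1), 2: (0, 1)}, identity))
```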

computational results: Test set of 149 MIQCPs from the GloMIQO test set. Comparison to other heuristics: Undercover finds a solution for 76 instances (typically in less than 0.1 sec); the root heuristics find solutions on 65 instances for Baron, 55 for Couenne, and 98 for SCIP; the success rate is lower on general MINLPs. (Running time distribution figure omitted.)

summary

summary: Separation: several methods to separate by tight (supporting) hyperplanes; reformulation vs. changing the separated point. Propagation: optimization-based bound tightening; propagate valid inequalities derived from dual information. Heuristics: Undercover solves the largest sub-MIP of an MINLP, with automatic structure detection via the co-occurrence graph.

the scip optimization suite: A toolbox for generating and solving constraint integer programs, free for academic use and available in source code. ZIMPL: model and generate LPs, MIPs, and MINLPs. SCIP: MIP, MINLP and CIP solver, branch-cut-and-price framework. SoPlex: revised primal and dual simplex algorithm. GCG: generic branch-cut-and-price solver. UG: framework for parallelization of MIP and MINLP solvers.

the scip community: 27 active developers: 4 running Bachelor and Master projects, 15 running PhD projects, 8 postdocs and professors. 4 development centers in Germany: Aachen (GCG), Berlin (SCIP, SoPlex, UG, ZIMPL), Darmstadt (SCIP and SCIP-SDP), Erlangen-Nürnberg (SCIP); many international contributors and users; more than ... downloads per year from over 100 countries. Careers: 10 awards for Master's and PhD theses (MOS, EURO, GOR, DMV); 7 former developers are now building commercial optimization software at CPLEX, FICO Xpress, Gurobi, MOSEK, and GAMS.


More information

Adaptive Linear Programming Decoding

Adaptive Linear Programming Decoding Adaptive Linear Programming Decoding Mohammad H. Taghavi and Paul H. Siegel ECE Department, University of California, San Diego Email: (mtaghavi, psiegel)@ucsd.edu ISIT 2006, Seattle, USA, July 9 14, 2006

More information

Scheduling Shop Scheduling. Tim Nieberg

Scheduling Shop Scheduling. Tim Nieberg Scheduling Shop Scheduling Tim Nieberg Shop models: General Introduction Remark: Consider non preemptive problems with regular objectives Notation Shop Problems: m machines, n jobs 1,..., n operations

More information

An interval linear programming contractor

An interval linear programming contractor An interval linear programming contractor Introduction Milan Hladík Abstract. We consider linear programming with interval data. One of the most challenging problems in this topic is to determine or tight

More information

2007/26. A tighter continuous time formulation for the cyclic scheduling of a mixed plant

2007/26. A tighter continuous time formulation for the cyclic scheduling of a mixed plant CORE DISCUSSION PAPER 2007/26 A tighter continuous time formulation for the cyclic scheduling of a mixed plant Yves Pochet 1, François Warichet 2 March 2007 Abstract In this paper, based on the cyclic

More information

Design, synthesis and scheduling of multipurpose batch plants via an effective continuous-time formulation

Design, synthesis and scheduling of multipurpose batch plants via an effective continuous-time formulation Computers and Chemical Engineering 25 (2001) 665 674 www.elsevier.com/locate/compchemeng Design, synthesis and scheduling of multipurpose batch plants via an effective continuous-time formulation X. Lin,

More information

Lecture 11: 0-1 Quadratic Program and Lower Bounds

Lecture 11: 0-1 Quadratic Program and Lower Bounds Lecture : - Quadratic Program and Lower Bounds (3 units) Outline Problem formulations Reformulation: Linearization & continuous relaxation Branch & Bound Method framework Simple bounds, LP bound and semidefinite

More information

11. APPROXIMATION ALGORITHMS

11. APPROXIMATION ALGORITHMS 11. APPROXIMATION ALGORITHMS load balancing center selection pricing method: vertex cover LP rounding: vertex cover generalized load balancing knapsack problem Lecture slides by Kevin Wayne Copyright 2005

More information

Summer course on Convex Optimization. Fifth Lecture Interior-Point Methods (1) Michel Baes, K.U.Leuven Bharath Rangarajan, U.

Summer course on Convex Optimization. Fifth Lecture Interior-Point Methods (1) Michel Baes, K.U.Leuven Bharath Rangarajan, U. Summer course on Convex Optimization Fifth Lecture Interior-Point Methods (1) Michel Baes, K.U.Leuven Bharath Rangarajan, U.Minnesota Interior-Point Methods: the rebirth of an old idea Suppose that f is

More information

Arrangements And Duality

Arrangements And Duality Arrangements And Duality 3.1 Introduction 3 Point configurations are tbe most basic structure we study in computational geometry. But what about configurations of more complicated shapes? For example,

More information

. P. 4.3 Basic feasible solutions and vertices of polyhedra. x 1. x 2

. P. 4.3 Basic feasible solutions and vertices of polyhedra. x 1. x 2 4. Basic feasible solutions and vertices of polyhedra Due to the fundamental theorem of Linear Programming, to solve any LP it suffices to consider the vertices (finitely many) of the polyhedron P of the

More information

Algorithm Design and Analysis

Algorithm Design and Analysis Algorithm Design and Analysis LECTURE 27 Approximation Algorithms Load Balancing Weighted Vertex Cover Reminder: Fill out SRTEs online Don t forget to click submit Sofya Raskhodnikova 12/6/2011 S. Raskhodnikova;

More information

Lecture 2: August 29. Linear Programming (part I)

Lecture 2: August 29. Linear Programming (part I) 10-725: Convex Optimization Fall 2013 Lecture 2: August 29 Lecturer: Barnabás Póczos Scribes: Samrachana Adhikari, Mattia Ciollaro, Fabrizio Lecci Note: LaTeX template courtesy of UC Berkeley EECS dept.

More information

GAMS, Condor and the Grid: Solving Hard Optimization Models in Parallel. Michael C. Ferris University of Wisconsin

GAMS, Condor and the Grid: Solving Hard Optimization Models in Parallel. Michael C. Ferris University of Wisconsin GAMS, Condor and the Grid: Solving Hard Optimization Models in Parallel Michael C. Ferris University of Wisconsin Parallel Optimization Aid search for global solutions (typically in non-convex or discrete)

More information

Linear Programming. Solving LP Models Using MS Excel, 18

Linear Programming. Solving LP Models Using MS Excel, 18 SUPPLEMENT TO CHAPTER SIX Linear Programming SUPPLEMENT OUTLINE Introduction, 2 Linear Programming Models, 2 Model Formulation, 4 Graphical Linear Programming, 5 Outline of Graphical Procedure, 5 Plotting

More information

Big Data - Lecture 1 Optimization reminders

Big Data - Lecture 1 Optimization reminders Big Data - Lecture 1 Optimization reminders S. Gadat Toulouse, Octobre 2014 Big Data - Lecture 1 Optimization reminders S. Gadat Toulouse, Octobre 2014 Schedule Introduction Major issues Examples Mathematics

More information

3.1 Solving Systems Using Tables and Graphs

3.1 Solving Systems Using Tables and Graphs Algebra 2 Chapter 3 3.1 Solve Systems Using Tables & Graphs 3.1 Solving Systems Using Tables and Graphs A solution to a system of linear equations is an that makes all of the equations. To solve a system

More information

Several Views of Support Vector Machines

Several Views of Support Vector Machines Several Views of Support Vector Machines Ryan M. Rifkin Honda Research Institute USA, Inc. Human Intention Understanding Group 2007 Tikhonov Regularization We are considering algorithms of the form min

More information

In this paper we present a branch-and-cut algorithm for

In this paper we present a branch-and-cut algorithm for SOLVING A TRUCK DISPATCHING SCHEDULING PROBLEM USING BRANCH-AND-CUT ROBERT E. BIXBY Rice University, Houston, Texas EVA K. LEE Georgia Institute of Technology, Atlanta, Georgia (Received September 1994;

More information

Optimization of Supply Chain Networks

Optimization of Supply Chain Networks Optimization of Supply Chain Networks M. Herty TU Kaiserslautern September 2006 (2006) 1 / 41 Contents 1 Supply Chain Modeling 2 Networks 3 Optimization Continuous optimal control problem Discrete optimal

More information

5.1 Bipartite Matching

5.1 Bipartite Matching CS787: Advanced Algorithms Lecture 5: Applications of Network Flow In the last lecture, we looked at the problem of finding the maximum flow in a graph, and how it can be efficiently solved using the Ford-Fulkerson

More information

An Introduction on SemiDefinite Program

An Introduction on SemiDefinite Program An Introduction on SemiDefinite Program from the viewpoint of computation Hayato Waki Institute of Mathematics for Industry, Kyushu University 2015-10-08 Combinatorial Optimization at Work, Berlin, 2015

More information

Solutions Of Some Non-Linear Programming Problems BIJAN KUMAR PATEL. Master of Science in Mathematics. Prof. ANIL KUMAR

Solutions Of Some Non-Linear Programming Problems BIJAN KUMAR PATEL. Master of Science in Mathematics. Prof. ANIL KUMAR Solutions Of Some Non-Linear Programming Problems A PROJECT REPORT submitted by BIJAN KUMAR PATEL for the partial fulfilment for the award of the degree of Master of Science in Mathematics under the supervision

More information

High-performance local search for planning maintenance of EDF nuclear park

High-performance local search for planning maintenance of EDF nuclear park High-performance local search for planning maintenance of EDF nuclear park Frédéric Gardi Karim Nouioua Bouygues e-lab, Paris fgardi@bouygues.com Laboratoire d'informatique Fondamentale - CNRS UMR 6166,

More information

Some representability and duality results for convex mixed-integer programs.

Some representability and duality results for convex mixed-integer programs. Some representability and duality results for convex mixed-integer programs. Santanu S. Dey Joint work with Diego Morán and Juan Pablo Vielma December 17, 2012. Introduction About Motivation Mixed integer

More information

International Doctoral School Algorithmic Decision Theory: MCDA and MOO

International Doctoral School Algorithmic Decision Theory: MCDA and MOO International Doctoral School Algorithmic Decision Theory: MCDA and MOO Lecture 2: Multiobjective Linear Programming Department of Engineering Science, The University of Auckland, New Zealand Laboratoire

More information

Mathematical finance and linear programming (optimization)

Mathematical finance and linear programming (optimization) Mathematical finance and linear programming (optimization) Geir Dahl September 15, 2009 1 Introduction The purpose of this short note is to explain how linear programming (LP) (=linear optimization) may

More information

Introduction to Algebraic Geometry. Bézout s Theorem and Inflection Points

Introduction to Algebraic Geometry. Bézout s Theorem and Inflection Points Introduction to Algebraic Geometry Bézout s Theorem and Inflection Points 1. The resultant. Let K be a field. Then the polynomial ring K[x] is a unique factorisation domain (UFD). Another example of a

More information

Chapter 3 INTEGER PROGRAMMING 3.1 INTRODUCTION. Robert Bosch. Michael Trick

Chapter 3 INTEGER PROGRAMMING 3.1 INTRODUCTION. Robert Bosch. Michael Trick Chapter 3 INTEGER PROGRAMMING Robert Bosch Oberlin College Oberlin OH, USA Michael Trick Carnegie Mellon University Pittsburgh PA, USA 3.1 INTRODUCTION Over the last 20 years, the combination of faster

More information

Multi-layer MPLS Network Design: the Impact of Statistical Multiplexing

Multi-layer MPLS Network Design: the Impact of Statistical Multiplexing Multi-layer MPLS Network Design: the Impact of Statistical Multiplexing Pietro Belotti, Antonio Capone, Giuliana Carello, Federico Malucelli Tepper School of Business, Carnegie Mellon University, Pittsburgh

More information

Guessing Game: NP-Complete?

Guessing Game: NP-Complete? Guessing Game: NP-Complete? 1. LONGEST-PATH: Given a graph G = (V, E), does there exists a simple path of length at least k edges? YES 2. SHORTEST-PATH: Given a graph G = (V, E), does there exists a simple

More information

Stefan Vigerske Curriculum Vitae

Stefan Vigerske Curriculum Vitae Stefan Vigerske Curriculum Vitae E-Mail: stefan@math.hu-berlin.de Academic Degrees 2005 Diploma in Mathematics, Humboldt-University Berlin 2005 Diploma in Computer Science, Humboldt-University Berlin Awards

More information

Introduction to Linear Programming (LP) Mathematical Programming (MP) Concept

Introduction to Linear Programming (LP) Mathematical Programming (MP) Concept Introduction to Linear Programming (LP) Mathematical Programming Concept LP Concept Standard Form Assumptions Consequences of Assumptions Solution Approach Solution Methods Typical Formulations Massachusetts

More information

Tutorial: Operations Research in Constraint Programming

Tutorial: Operations Research in Constraint Programming Tutorial: Operations Research in Constraint Programming John Hooker Carnegie Mellon University May 2009 Revised June 2009 May 2009 Slide 1 Motivation Benders decomposition allows us to apply CP and OR

More information

Numerisches Rechnen. (für Informatiker) M. Grepl J. Berger & J.T. Frings. Institut für Geometrie und Praktische Mathematik RWTH Aachen

Numerisches Rechnen. (für Informatiker) M. Grepl J. Berger & J.T. Frings. Institut für Geometrie und Praktische Mathematik RWTH Aachen (für Informatiker) M. Grepl J. Berger & J.T. Frings Institut für Geometrie und Praktische Mathematik RWTH Aachen Wintersemester 2010/11 Problem Statement Unconstrained Optimality Conditions Constrained

More information

A new Branch-and-Price Algorithm for the Traveling Tournament Problem (TTP) Column Generation 2008, Aussois, France

A new Branch-and-Price Algorithm for the Traveling Tournament Problem (TTP) Column Generation 2008, Aussois, France A new Branch-and-Price Algorithm for the Traveling Tournament Problem (TTP) Column Generation 2008, Aussois, France Stefan Irnich 1 sirnich@or.rwth-aachen.de RWTH Aachen University Deutsche Post Endowed

More information

On Minimal Valid Inequalities for Mixed Integer Conic Programs

On Minimal Valid Inequalities for Mixed Integer Conic Programs On Minimal Valid Inequalities for Mixed Integer Conic Programs Fatma Kılınç Karzan June 27, 2013 Abstract We study mixed integer conic sets involving a general regular (closed, convex, full dimensional,

More information

Transportation Polytopes: a Twenty year Update

Transportation Polytopes: a Twenty year Update Transportation Polytopes: a Twenty year Update Jesús Antonio De Loera University of California, Davis Based on various papers joint with R. Hemmecke, E.Kim, F. Liu, U. Rothblum, F. Santos, S. Onn, R. Yoshida,

More information

OPRE 6201 : 2. Simplex Method

OPRE 6201 : 2. Simplex Method OPRE 6201 : 2. Simplex Method 1 The Graphical Method: An Example Consider the following linear program: Max 4x 1 +3x 2 Subject to: 2x 1 +3x 2 6 (1) 3x 1 +2x 2 3 (2) 2x 2 5 (3) 2x 1 +x 2 4 (4) x 1, x 2

More information

Linear Programming in Matrix Form

Linear Programming in Matrix Form Linear Programming in Matrix Form Appendix B We first introduce matrix concepts in linear programming by developing a variation of the simplex method called the revised simplex method. This algorithm,

More information

Dynamic programming. Doctoral course Optimization on graphs - Lecture 4.1. Giovanni Righini. January 17 th, 2013

Dynamic programming. Doctoral course Optimization on graphs - Lecture 4.1. Giovanni Righini. January 17 th, 2013 Dynamic programming Doctoral course Optimization on graphs - Lecture.1 Giovanni Righini January 1 th, 201 Implicit enumeration Combinatorial optimization problems are in general NP-hard and we usually

More information

Computing a Nearest Correlation Matrix with Factor Structure

Computing a Nearest Correlation Matrix with Factor Structure Computing a Nearest Correlation Matrix with Factor Structure Nick Higham School of Mathematics The University of Manchester higham@ma.man.ac.uk http://www.ma.man.ac.uk/~higham/ Joint work with Rüdiger

More information

Linear Programming: Theory and Applications

Linear Programming: Theory and Applications Linear Programming: Theory and Applications Catherine Lewis May 11, 2008 1 Contents 1 Introduction to Linear Programming 3 1.1 What is a linear program?...................... 3 1.2 Assumptions.............................

More information

Column Generation in GAMS Extending the GAMS Branch-and-Cut-and-Heuristic (BCH) Facility

Column Generation in GAMS Extending the GAMS Branch-and-Cut-and-Heuristic (BCH) Facility Column Generation in GAMS Extending the GAMS Branch-and-Cut-and-Heuristic (BCH) Facility Michael R. Bussieck MBussieck@gams.com GAMS Software GmbH GAMS Development Corp 83rd Working Group Meeting Real

More information