Chapter 2 Integer Programming. Paragraph 2 Branch and Bound


What we did so far: We studied linear programming and saw that it is solvable in polynomial time. We gave a sufficient condition (total unimodularity) under which the simplex method returns an integer solution, which covers Shortest Path, Minimum Spanning Tree, Maximum Flow, and Min-Cost Flow. How can we cope with general integer programs? CS 149 - Intro to CO 2

Tree Search: As an example, assume we have to solve the Knapsack Problem. Recall that there are 2^n possible combinations of knapsack items. The brute-force approach is to enumerate all combinations, check which ones are feasible, and determine which of those achieves maximum profit. A systematic way of enumerating all solutions is via backtracking.
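The brute-force approach can be sketched in a few lines of Python. This is a minimal illustration (function and variable names are mine, not from the slides), run here on the 4-item instance that appears later in these slides:

```python
from itertools import product

def knapsack_brute_force(profits, weights, capacity):
    """Enumerate all 2^n 0/1 combinations; return (best_value, best_x)."""
    best_value, best_x = 0, tuple(0 for _ in profits)
    for x in product((0, 1), repeat=len(profits)):
        weight = sum(w * xi for w, xi in zip(weights, x))
        if weight <= capacity:  # feasibility check
            value = sum(p * xi for p, xi in zip(profits, x))
            if value > best_value:
                best_value, best_x = value, x
    return best_value, best_x

# Instance from the example slide: profits 9,3,5,3; weights 5,2,5,4; capacity 10.
print(knapsack_brute_force([9, 3, 5, 3], [5, 2, 5, 4], 10))
```

The loop runs 2^n times, which is exactly the combinatorial explosion discussed below.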

Tree Search: Assume we order the variables x_1, ..., x_n. A recursive way of enumerating all solutions is to set x_1 to 0 first and recursively enumerate all solutions for KP(x_2, ..., x_n, p, w, C). Then we set x_1 to 1 and enumerate all solutions for KP(x_2, ..., x_n, p, w, C - w_1). This procedure yields a search tree!
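The recursive scheme (set x_1 = 0, recurse, then set x_1 = 1, recurse) can be sketched as follows; this is my own illustration of the idea, not the course's code:

```python
def knapsack_backtrack(profits, weights, capacity):
    """Enumerate all 0/1 assignments by setting x_i to 0, then to 1, recursively."""
    n = len(profits)
    best = [0, None]  # best value found and the corresponding assignment

    def recurse(i, remaining, value, prefix):
        if remaining < 0:          # current prefix already exceeds the capacity
            return
        if i == n:                 # leaf: a complete, feasible assignment
            if value > best[0]:
                best[0], best[1] = value, tuple(prefix)
            return
        for xi in (0, 1):          # x_i = 0 branch first, then x_i = 1 branch
            prefix.append(xi)
            recurse(i + 1, remaining - xi * weights[i],
                    value + xi * profits[i], prefix)
            prefix.pop()

    recurse(0, capacity, 0, [])
    return best[0], best[1]
```

Each call fixes one more variable, so the call structure is exactly the binary search tree of the slides.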

Tree Search: [Figure: binary search tree branching on x_1, x_2, x_3, with edges labeled 0 and 1; the highlighted leaf corresponds to the assignment x = (1,0,0)^T.]

Combinatorial Explosions: Enumerating all possible solutions is of course not feasible when there are too many items. What is too many? 500? 200? 100? 50? 10? Take a guess! Assume we can investigate 1 solution per CPU cycle at a rate of 10 GHz (that's 10 billion per second). Then enumerating all Knapsacks with 60 items takes more than 3.5 years, and with 65 items more than a century! This effect is called a combinatorial explosion. If NP ≠ P, it cannot be avoided. However, we can aim at pushing the intractable instance sizes as far as possible, far enough to solve real-world instances. This is what combinatorial optimization is all about!
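The back-of-the-envelope arithmetic can be checked directly; the 10 GHz enumeration rate is the slide's assumption, the helper name is mine:

```python
SECONDS_PER_YEAR = 365 * 24 * 3600

def enumeration_years(n_items, solutions_per_second=10**10):
    """Years needed to enumerate all 2^n knapsack solutions at the given rate."""
    return 2**n_items / solutions_per_second / SECONDS_PER_YEAR

print(enumeration_years(60))  # a few years even at 10 GHz
print(enumeration_years(65))  # each extra item doubles the time
```

Every additional item doubles the running time, which is why shaving constants off the enumeration rate cannot rescue brute force.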

Implicit Enumeration: We cannot afford to enumerate all combinations explicitly. We must try to enumerate the overwhelming part of all combinations implicitly! The only way to do this is by intelligent inference. It is usually easy to find a first solution. The core question to ask for an optimization problem is: can we achieve a better solution? Answering this question exactly is itself NP-complete. Consequently, we have to estimate intelligently.

Relaxations: We can obtain an upper bound on a maximization problem like Knapsack by computing an optimal solution over a larger set of feasible solutions. We can allow more solutions by dropping some constraints, hopefully in such a way that the relaxed problem is easier to solve. This approach is generally called a relaxation. The milder the effect of a relaxation on the objective value, the better our estimate!

Linear Relaxation: The most commonly used relaxation consists in dropping the constraint that variables be integer. In Knapsack, for instance, we replace x_i ∈ {0,1} by 0 ≤ x_i ≤ 1. Then optimizing the relaxed problem calls for solving a linear program, and we know how to optimize LPs quickly!
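For Knapsack's single constraint, the LP relaxation can even be solved without the simplex method: sort the items by profit density p_i/w_i and fill greedily, taking a fractional piece of the first item that no longer fits (Dantzig's rule). A minimal sketch, with names of my own choosing:

```python
def knapsack_lp_bound(profits, weights, capacity):
    """Optimal value of the Knapsack LP relaxation (0 <= x_i <= 1)."""
    # Sort items by profit density; the LP optimum takes them greedily.
    items = sorted(zip(profits, weights),
                   key=lambda pw: pw[0] / pw[1], reverse=True)
    value, remaining = 0.0, capacity
    for p, w in items:
        if w <= remaining:               # item fits entirely: x_i = 1
            value, remaining = value + p, remaining - w
        else:                            # fractional piece of one item
            value += p * remaining / w
            break
    return value
```

At most one variable is fractional in this solution, a fact the branching discussion below relies on.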

Relaxations: What does a relaxation give us? Dominance: if the relaxation value is less than or equal to the best known solution value (for minimization: greater than or equal), then all solutions with the current prefix are sub-optimal and need not be looked at at all! Optimality: if the relaxation returns a feasible solution for our original problem, this solution dominates all other feasible solutions with the current prefix; they need not be looked at at all! Infeasibility: if the relaxation is infeasible, there exists no feasible solution with the current prefix, so all such combinations need not be looked at at all! In all these cases, we are not going to expand the search tree below the current node any further: we prune the search!

Example: Knapsack Instance: Maximize 9x_1 + 3x_2 + 5x_3 + 3x_4 such that 5x_1 + 2x_2 + 5x_3 + 4x_4 ≤ 10, x_1, x_2, x_3, x_4 ∈ {0,1}. LP Relaxation: Maximize 9x_1 + 3x_2 + 5x_3 + 3x_4 such that 5x_1 + 2x_2 + 5x_3 + 4x_4 ≤ 10, 0 ≤ x_1, x_2, x_3, x_4 ≤ 1.
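Putting the pieces together, a depth-first branch and bound with the LP-relaxation bound reproduces this instance's integer optimum of 14 (x_1 = x_3 = 1). This is a sketch under my own naming, not the course's reference code:

```python
def branch_and_bound(profits, weights, capacity):
    """0/1 Knapsack by depth-first branch and bound with an LP-relaxation bound."""
    n = len(profits)
    best_value = [0]  # incumbent (all-zero solution is always feasible)

    def lp_bound(i, remaining):
        """Fractional-knapsack bound over the still-free items i..n-1."""
        free = sorted(((profits[j], weights[j]) for j in range(i, n)),
                      key=lambda pw: pw[0] / pw[1], reverse=True)
        bound = 0.0
        for p, w in free:
            if w <= remaining:
                bound, remaining = bound + p, remaining - w
            else:
                bound += p * remaining / w
                break
        return bound

    def search(i, remaining, value):
        if remaining < 0:                      # prune: infeasible prefix
            return
        if value > best_value[0]:              # prefix with free vars at 0 is
            best_value[0] = value              # itself a feasible solution
        if i == n:
            return
        if value + lp_bound(i, remaining) <= best_value[0]:
            return                             # prune: bound cannot beat incumbent
        for xi in (1, 0):                      # dive into the x_i = 1 branch first
            search(i + 1, remaining - xi * weights[i], value + xi * profits[i])

    search(0, capacity, 0)
    return best_value[0]

print(branch_and_bound([9, 3, 5, 3], [5, 2, 5, 4], 10))
```

Diving into the x_i = 1 branch first tends to find a good incumbent early, which in turn makes the bound test prune more of the tree.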

Example: [Figure: branch-and-bound tree for the instance, branching on x_1 through x_4; the root relaxation value is 15, interior nodes carry bounds such as 10.25, 14, and 14.25, and pruned leaves, marked X, carry values 6, 8, and 12.]

Branching Direction Selection: In our general Branch-and-Bound scheme, we have some liberty: Which node shall we look at next? Which variable should we branch on? We would like to dive into the search tree in order to find a feasible solution (a lower bound) quickly. When diving, the question of which node to pick next comes down to: which of the two child nodes shall we follow first?

Example: [Figure: the same tree traversed by diving, following the x_i = 1 branches first; node relaxation values include 15, 10.25, 14, and 14.25, with pruned leaves marked X.]

Branching Variable Selection: In our general Branch-and-Bound scheme, we have some liberty: Which node shall we look at next? Which variable should we branch on? In order to have a chance of improving our upper bound, we need to branch on a fractional variable. In the Knapsack LP relaxation, there is exactly one.

Example: [Figure: branching on the fractional variables x_3 and x_4 first; the tree shows bounds 14.25, 15, and 14, a leaf value of 12, and pruned leaves marked X.]

Liberties in B&B: So far, we took the liberty of selecting our own branching values and variables. Value selection is a special case of node selection in depth-first search. The way we traverse the search tree is generally determined by our search strategy. Variable selection is a special case of branching constraint selection. Many different ways to partition the search space are possible.

Search Strategies: When choosing the next node, we would like: to find a near-optimal solution quickly (lower bound improvement in maximization); not to jump around too much, so that we can make use of incremental data structures; and to keep the memory requirements within limits.

Search Strategies: Depth-First Search: Finds feasible solutions quickly. Is very memory efficient. Can easily get stuck in sub-optimal parts of the search space. Best-First Search: Look at the node with the best relaxation value next. Is provably optimal in the sense that it never visits a node that could otherwise be pruned. A lot of jumping is necessary, and memory requirements can be prohibitively large (often the search degenerates to breadth-first search).
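Best-first node selection is typically implemented with a priority queue keyed on the relaxation value. A sketch using Python's heapq (a min-heap, so bounds are negated for maximization; the node representation is just an illustration):

```python
import heapq

def best_first_order(open_nodes):
    """Yield open nodes in order of best (largest) relaxation value.

    open_nodes: list of (relaxation_value, node_id) pairs."""
    heap = [(-bound, node) for bound, node in open_nodes]
    heapq.heapify(heap)                      # O(n) heap construction
    while heap:
        neg_bound, node = heapq.heappop(heap)
        yield node, -neg_bound               # best remaining bound first
```

In a real solver, newly created child nodes would be pushed onto the same heap; the heap growing with every expansion is precisely the memory problem the slide mentions.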

Search Strategies: Depth-First Search with Best Backtracking: Is a mix of depth-first and best-first search: perform depth-first search until a leaf is found, then backtrack to the node with the best relaxation value, and so on. Much less jumping than best-first search. Is more memory efficient than best-first search, but less so than DFS; it can still be very memory intensive. Limited Discrepancy Search: Follow DFS with a heuristic branching direction selection. Investigate leaves in order of increasing discrepancy with respect to that heuristic. Memory requirements are within limits. Often finds good solutions early in the search.

Branching Constraint Selection: When partitioning the search space, we would like: to reduce the relaxation value as quickly as possible (upper bound improvement in maximization), and to avoid doubling our workload, which can happen, for example, when choosing the wrong branching variable. The easiest way to partition the search is by branching on one variable.

Branching Constraint Selection: Unary Branching Constraints: Choose the variable whose fractional part is closest to 1/2. Try to estimate a lower bound on how much enforcing the integrality of a variable will cost (degradation method). Follow user-defined priorities. Choose a random variable and combine with restarts. Empirically, we prefer balanced search trees over degenerate branches.
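The first rule above, picking the most fractional variable, is easy to state in code; the integrality tolerance 1e-9 is my assumption, not something fixed by the slides:

```python
def most_fractional(x):
    """Index of the variable whose fractional part is closest to 1/2.

    Returns None if the LP solution x (nonnegative values) is already integral."""
    best_i, best_gap = None, 0.5
    for i, xi in enumerate(x):
        frac = xi - int(xi)                  # fractional part, in [0, 1)
        if 1e-9 < frac < 1 - 1e-9:           # skip (numerically) integral values
            gap = abs(frac - 0.5)
            if gap < best_gap:
                best_i, best_gap = i, gap
    return best_i
```

A solver would branch on the returned index, adding x_i ≤ floor(x_i) in one child and x_i ≥ ceil(x_i) in the other.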

Branching Constraint Selection: In some cases, unary branching constraints cannot achieve balance: with Σ x_i = 1, setting x_i = 1 has a big effect, while x_i = 0 has almost none! Special Ordered Sets (SOS-Branching) Idea: branch on Σ_{i ∈ I_1} x_i = 1 or Σ_{i ∈ I_2} x_i = 1, where I_1 and I_2 partition the index set. SOS type 1: an ordered set of variables where at most one variable may take on a nonzero value. SOS type 2: an ordered set of variables where at most two variables may take on nonzero values, and if two variables are nonzero, they must be adjacent in the set. SOS type 3: a set of 0-1 variables, only one of which may be selected to have the value 1, the other variables in the set having the value 0.

Thank you!