Lecture 11: 0-1 Quadratic Program and Lower Bounds




Lecture 11: 0-1 Quadratic Program and Lower Bounds (3 units)

Outline:
- Problem formulations
- Reformulation: linearization & continuous relaxation
- Branch & Bound method framework
- Simple bounds, LP bound and semidefinite relaxation
- Variable fixation

Problem formulation

Standard form with 0-1 variables:

(0-1 QP)   min_{x ∈ {0,1}^n} f(x) = (1/2) x^T Q x + c^T x,

where Q is an n × n symmetric matrix and c ∈ R^n.

Homogeneous form:

(0-1 QP_h)   min_{x ∈ {0,1}^n} x^T Q x.

Binary (±1) variables:

(BQP)   min_{x ∈ {−1,1}^n} x^T Q x + c^T x.

Transformation: x_i = (1/2)(y_i + 1).

Homogeneous form with binary variables:

(BQP_h)   min_{x ∈ {−1,1}^n} x^T Q x.

(BQP) is equivalent to (BQP_h) with

Q := [ 0, (1/2)c^T ; (1/2)c, Q ],   x := (±1, x^T)^T ∈ {−1,1}^{n+1}.
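The change of variables x_i = (1/2)(y_i + 1) between the 0-1 and ±1 formulations can be checked by brute force on a toy instance; the matrix Q and vector c below are hypothetical example data, not from the lecture.

```python
from itertools import product

# Hypothetical 2x2 instance for illustration.
Q = [[2.0, -3.0], [-3.0, 2.0]]
c = [1.0, -1.0]
n = 2

def f(x):
    """f(x) = (1/2) x^T Q x + c^T x."""
    quad = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
    return 0.5 * quad + sum(c[i] * x[i] for i in range(n))

# Enumerate f over {0,1}^n directly, and again via y in {-1,1}^n
# with x_i = (y_i + 1) / 2; the two value sets must coincide.
vals_01 = sorted(f(x) for x in product((0, 1), repeat=n))
vals_pm = sorted(f(tuple((y + 1) / 2 for y in ys))
                 for ys in product((-1, 1), repeat=n))
assert vals_01 == vals_pm
print(min(vals_01))  # prints -1.0 for this instance
```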

Max-Cut problem

Consider a graph G = (V, E) with vertex set V = {1, ..., n} and edge set E ⊆ {ij | 1 ≤ i < j ≤ n}. For every edge ij ∈ E, there is an associated weight w_ij.

Cut: for a given set S ⊆ V, the cut δ(S) is the set of all edges with one endpoint in S and the other in V \ S, and the weight of the cut δ(S) is Σ_{ij ∈ δ(S)} w_ij.

Max-Cut: find a cut δ(S) with maximum weight.

Binary quadratic formulation:

(Max-Cut)   max (1/2) Σ_{1≤i<j≤n} w_ij (1 − x_i x_j)
            s.t. x ∈ {−1,1}^n,

where S = {i ∈ V | x_i = 1} and V \ S = {i ∈ V | x_i = −1}.
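The correspondence between cuts and ±1 labelings can be verified by brute force on a small weighted cycle; the graph and weights below are a made-up example.

```python
from itertools import product

# Hypothetical 4-cycle 0-1-2-3-0 with edge weights.
n = 4
w = {(0, 1): 1.0, (1, 2): 2.0, (2, 3): 1.0, (0, 3): 3.0}

def cut_weight(x):
    """(1/2) * sum over edges of w_ij (1 - x_i x_j), x in {-1,+1}^n."""
    return 0.5 * sum(wij * (1 - x[i] * x[j]) for (i, j), wij in w.items())

# Brute-force Max-Cut; the bipartition {0,2} vs {1,3} cuts every edge.
best = max(product((-1, 1), repeat=n), key=cut_weight)
S = sorted(i for i in range(n) if best[i] == 1)
print(S, cut_weight(best))  # the optimal cut has weight 7.0
```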

Linearization Method

0-1 quadratic problem:

(P)   min_{x ∈ {0,1}^n} Q(x) = Σ_{i=1}^n c_i x_i + Σ_{1≤i<j≤n} q_ij x_i x_j.

For x_i, x_j ∈ {0,1}, y_ij = x_i x_j iff

y_ij = max{x_i + x_j − 1, 0},  y_ij ∈ {0,1},

or

y_ij = min{x_i, x_j},  y_ij ∈ {0,1}.
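The two characterizations of y_ij = x_i x_j over binary variables are easy to verify exhaustively:

```python
from itertools import product

# For binary x_i, x_j the product equals both max{x_i + x_j - 1, 0}
# and min{x_i, x_j}; this is what makes the linearization exact.
for xi, xj in product((0, 1), repeat=2):
    assert xi * xj == max(xi + xj - 1, 0) == min(xi, xj)
print("ok")
```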

(P) is equivalent to the following 0-1 linear integer program:

min_{x,y}  Σ_{i=1}^n c_i x_i + Σ_{(i,j) ∈ I^+} q_ij y_ij + Σ_{(i,j) ∈ I^−} q_ij y_ij
s.t.  y_ij ≤ x_i,  y_ij ≤ x_j,   (i, j) ∈ I^−  (q_ij < 0),
      y_ij ≥ x_i + x_j − 1,      (i, j) ∈ I^+  (q_ij ≥ 0),
      x_i ∈ {0,1},  i = 1, ..., n,
      y_ij ∈ {0,1},  1 ≤ i < j ≤ n.

A polynomially solvable case, q_ij ≤ 0 for all i < j:

min_{x,y}  Σ_{i=1}^n c_i x_i + Σ_{1≤i<j≤n} q_ij y_ij
s.t.  y_ij ≤ x_i,  1 ≤ i < j ≤ n,
      y_ij ≤ x_j,  1 ≤ i < j ≤ n,
      x_i, x_j, y_ij ∈ {0,1},  1 ≤ i < j ≤ n.

The constraint matrix is totally unimodular!
Can we use the transformation z_i = 1 − x_i for q_ij > 0?

Continuous relaxation

Consider the continuous relaxation of (P):

(P̄)   min_{x ∈ [0,1]^n} Q(x) = Σ_{i=1}^n c_i x_i + Σ_{1≤i<j≤n} q_ij x_i x_j.

Then at least one of the optimal solutions of (P̄) is located at an extreme point of [0,1]^n. Therefore v(P) = v(P̄). Unfortunately, the objective function of (P̄) is in general neither convex nor concave.

Define

Q_p(x) = Σ_{i=1}^n c_i x_i + x^T Q x − p x^T x + p e^T x.

Since x_i^2 = x_i on {0,1}^n, we have Q_p(x) = Q(x) for x ∈ {0,1}^n. For large p, Q_p(x) is a concave function. Thus (P) is equivalent to the concave minimization problem:

(P_c)   min_{x ∈ [0,1]^n} Q_p(x)
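Both claims admit a quick numerical check: Q_p agrees with Q on binary points for any p, and Q_p becomes concave once p reaches the largest eigenvalue of Q (the Hessian of the perturbed quadratic part is 2(Q − pI)). The 2×2 data below are a hypothetical example.

```python
from itertools import product
import math

# Hypothetical symmetric 2x2 data.
Q = [[1.0, -3.0], [-3.0, 1.0]]
c = [1.0, 0.0]
n = 2

def Qp(x, p=0.0):
    """Q_p(x) = c^T x + x^T Q x - p x^T x + p e^T x."""
    quad = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
    lin = sum(c[i] * x[i] for i in range(n))
    sq = sum(xi * xi for xi in x)
    return lin + quad - p * sq + p * sum(x)

# (1) Q_p == Q on {0,1}^n for any p, since x_i^2 = x_i there.
for x in product((0, 1), repeat=n):
    assert math.isclose(Qp(x, p=7.3), Qp(x, p=0.0))

# (2) The Hessian of x^T Q x - p x^T x is 2(Q - pI), which is negative
#     semidefinite once p >= lambda_max(Q).  For this 2x2 Q:
lam_max = (Q[0][0] + Q[1][1]) / 2 + math.hypot((Q[0][0] - Q[1][1]) / 2, Q[0][1])
print(lam_max)  # prints 4.0
```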

Branch and Bound Framework

- Computing lower bounds;
- Branching on x_i = 0 or x_i = 1;
- Fixing variables by certain optimality conditions.

Basic lower bounding methods

- Simple lower bounds
- Continuous relaxation
- LP relaxation
- Lagrangian relaxation & SDP relaxation

Simple bounds

Lower bound 1. Let Q(x) = Σ_{i=1}^n c_i x_i + (1/2) x^T Q x. An obvious lower bound of Q(x) over {0,1}^n is:

LB_s = (1/2) Σ_{i=1}^n Σ_{j≠i} min(q_ij, 0) + Σ_{i=1}^n min(c_i + (1/2) q_ii, 0),

where the diagonal term is absorbed into the linear part via x_i^2 = x_i.

Lower bound 2. An improved simple lower bound is derived by noting that, since x ≥ 0, if Qx ≥ a, then (1/2) x^T Q x ≥ (1/2) a^T x. Let Q_i denote the i-th row of Q. Then

a_i = min_{x ∈ {0,1}^n} Σ_{j≠i} q_ij x_j = Σ_{j≠i} min(q_ij, 0).

So

v(P) = min_{x ∈ {0,1}^n} (1/2) x^T Q x + c^T x
     ≥ min_{x ∈ {0,1}^n} Σ_{i=1}^n (c_i + (1/2) q_ii + (1/2) a_i) x_i
     ≥ Σ_{i=1}^n min{ c_i + (1/2) q_ii + (1/2) Σ_{j≠i} min(q_ij, 0), 0 }
     =: LB_s².

It is easy to show that LB_s² is at least as tight as LB_s, i.e., LB_s² ≥ LB_s.
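Both simple bounds are cheap to compute. The sketch below evaluates them on a hypothetical 2×2 instance and confirms LB_s ≤ LB_s² ≤ v(P) by enumeration.

```python
from itertools import product

# Hypothetical instance.
Q = [[0.0, -3.0], [-3.0, 0.0]]
c = [1.0, 1.0]
n = 2

def f(x):
    """f(x) = (1/2) x^T Q x + c^T x."""
    quad = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
    return 0.5 * quad + sum(c[i] * x[i] for i in range(n))

# a_i = sum of the negative off-diagonal entries of row i.
a = [sum(min(Q[i][j], 0.0) for j in range(n) if j != i) for i in range(n)]

# Simple bound 1: bound the quadratic and linear terms separately.
LB1 = 0.5 * sum(a) + sum(min(c[i] + 0.5 * Q[i][i], 0.0) for i in range(n))

# Simple bound 2: bound the surrogate linear function term by term.
LB2 = sum(min(c[i] + 0.5 * Q[i][i] + 0.5 * a[i], 0.0) for i in range(n))

v = min(f(x) for x in product((0, 1), repeat=n))
print(LB1, LB2, v)  # prints -3.0 -1.0 -1.0
```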

Continuous relaxation

Since Q̃(x) = (1/2) x^T (Q + diag(u)) x + (c − (1/2) u)^T x takes the same values on {0,1}^n as Q(x) (again because x_i^2 = x_i), it is natural to compute a lower bound via solving the continuous relaxation:

(P̄_u)   β(u) = min_{x ∈ [0,1]^n} (1/2) x^T (Q + diag(u)) x + (c − (1/2) u)^T x.

Some observations:
- If the u_i are large enough, then Q̃ = Q + diag(u) will be diagonally dominant (thus positive definite).
- u = −λ_min e is an obvious choice to make Q̃ positive semidefinite (but not necessarily the best one).
- The optimal solution of (P̄_u) tends to (1/2, 1/2, ..., 1/2)^T as the u_i are increased.

Consider a small example of (P) where

Q = [ 1, −3 ; −3, 1 ],   c = (1, 0)^T.

For this example, we have x* = (1, 1)^T with Q(x*) = −1. The two simple bounds for this problem are LB_s = −3 and LB_s² = −1. The eigenvalues of Q are (−2, 4).

Figure: the graph of Q(x) over [0,1]^2, which is nonconvex.

Figure: u = −λ_min e, x_u = (0.864, 1)^T, β(u) = −1.46.

Figure: u = 20e, x_u = (0.577, 0.5538)^T, β(u) = −4.7769.

Figure: u = 100e, x_u = (0.503, 0.5)^T, β(u) = −24.755.

Another way of choosing u is to find a u* such that

β(u*) = max{ β(u) : Q + diag(u) ⪰ 0, u ∈ R^n }.

The above problem is equivalent to a semidefinite quadratic program, which can be solved efficiently (in polynomial time).

LP relaxation

The continuous relaxation of the 0-1 linearized problem is a linear program (with y_ij standing for x_i x_j):

min_{x,y}  Σ_{1≤i<j≤n} q_ij y_ij + Σ_{i=1}^n (c_i + (1/2) q_ii) x_i
s.t.  y_ij ≤ x_i,  y_ij ≤ x_j,   1 ≤ i < j ≤ n,  q_ij < 0,
      x_i + x_j − 1 ≤ y_ij,      1 ≤ i < j ≤ n,  q_ij > 0,
      0 ≤ x_i ≤ 1,   i = 1, ..., n,
      0 ≤ y_ij ≤ 1,  1 ≤ i < j ≤ n.

SDP relaxation

Consider Q(x) = (1/2) x^T Q x + c^T x over x ∈ {−1,1}^n. Dualizing the constraints x_i^2 = 1 gives the Lagrangian dual:

v(D) = max_{λ ∈ R^n} d(λ)
     = max_{λ ∈ R^n} min_{x ∈ R^n} { (1/2) x^T [Q + 2 diag(λ)] x + c^T x − e^T λ }
     = max τ
       s.t. (1/2) x^T [Q + 2 diag(λ)] x + c^T x − e^T λ ≥ τ,  ∀ x ∈ R^n,  (λ, τ) ∈ R^{n+1}.

The constraint

(1/2) x^T [Q + 2 diag(λ)] x + c^T x − e^T λ − τ ≥ 0,  ∀ x ∈ R^n,

holds if and only if, after homogenizing with a scalar t,

(x^T, t) [ Q + 2 diag(λ), c ; c^T, −2τ − 2 e^T λ ] (x; t) ≥ 0,  ∀ (x, t) ∈ R^{n+1},

i.e.,

[ Q + 2 diag(λ), c ; c^T, −2τ − 2 e^T λ ] ⪰ 0.

Variable fixation

Let x* denote an optimal solution of (P). Let

l_i = c_i + (1/2) q_ii + Σ_{j≠i} min(0, q_ij),
u_i = c_i + (1/2) q_ii + Σ_{j≠i} max(0, q_ij).

Then l_i and u_i bound the change in the objective caused by switching x_i from 0 to 1, whatever the other variables are, so:

(i) if l_i > 0, then x*_i = 0;
(ii) if u_i < 0, then x*_i = 1.
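A minimal sketch of the fixation test on a hypothetical 3-variable instance; brute-force enumeration confirms that every fixed variable agrees with the optimal solution.

```python
from itertools import product

# Hypothetical instance.
Q = [[0.0, 2.0, -1.0],
     [2.0, 0.0, 3.0],
     [-1.0, 3.0, -8.0]]
c = [4.0, 1.0, 0.0]
n = 3

def f(x):
    """f(x) = (1/2) x^T Q x + c^T x."""
    quad = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
    return 0.5 * quad + sum(c[i] * x[i] for i in range(n))

fixed = {}
for i in range(n):
    off = [Q[i][j] for j in range(n) if j != i]
    l = c[i] + 0.5 * Q[i][i] + sum(min(0.0, q) for q in off)
    u = c[i] + 0.5 * Q[i][i] + sum(max(0.0, q) for q in off)
    if l > 0:        # switching x_i to 1 can only increase f
        fixed[i] = 0
    elif u < 0:      # switching x_i to 1 can only decrease f
        fixed[i] = 1

xstar = min(product((0, 1), repeat=n), key=f)
assert all(xstar[i] == v for i, v in fixed.items())
print(fixed)  # prints {0: 0, 1: 0, 2: 1}
```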

Fixing variables by an outer box

Equivalent perturbed problem:

(P_µ)   min_{x ∈ {0,1}^n} f_µ(x) = (1/2) x^T (Q + µI) x + (c − (1/2) µ e)^T x,

where µ ≥ 0, I is the n × n identity matrix and e = (1, ..., 1)^T.

- The shape of the contours changes as µ changes.
- The properties of (P_µ), e.g., the condition number, change as µ changes.

Figure: contours of f_µ over the unit box for several values of µ, with the minimizer x(µ) marked.

Ellipsoid contour: let x* ∈ {0,1}^n and consider the level set E_µ = {x : f_µ(x) = f(x*)}. Let the smallest box that contains the ellipsoid E_µ be B = [a, b], where a = x̄ − s and b = x̄ + s for the center x̄ of E_µ. Let x* be an optimal solution to (P). Then:

(i) if b_i < 1 for some i, then x*_i = 0;
(ii) if a_i > 0 for some i, then x*_i = 1.

Figure: the ellipsoid E_µ and its bounding box B_µ drawn over the unit box, together with x(µ).