Optimization in R^n: Introduction




Optimization in R^n: Introduction

Rudi Pendavingh
Eindhoven University of Technology (TU/e)

Optimization in R^n, lecture

Some optimization problems

- designing a cylindrical can containing 1 liter, with minimum surface area
- designing a closed spatial object of volume 1, with minimum surface area
- finding the shortest path between two cities in a road network
- finding the shortest path between two cities in a landscape
- finding a maximum number of cards without a set in the SET game
- max{n ∈ Z : x^n + y^n = z^n for some x, y, z ∈ Z \ {0}}

Optimization

An optimization problem is of the form

    min{f(x) : x ∈ S}

where S is a set and f : S → R is a function.

- S is the feasible set
- f : S → R is the objective function
- if S = ∅, then the problem is infeasible
- if inf{f(x) : x ∈ S} = −∞, the problem is unbounded
- if y ∈ S is such that f(y) = inf{f(x) : x ∈ S}, then y is an optimal solution, f(y) is the optimum, and the optimum is attained by y

Solving the optimization problem means finding an optimal solution y, or deciding that the problem is infeasible, unbounded, or that the optimum is not attained.
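For a finite feasible set these definitions translate directly into code. The sketch below (plain Python; the function name and the tiny instance are my own illustration, not from the lecture) either returns an optimal solution or detects infeasibility. Note that for a nonempty finite S the optimum is always attained, so the "unbounded" and "not attained" outcomes cannot occur here.

```python
def solve_finite(f, S):
    """Minimize f over a finite feasible set S.

    Returns (status, y, value) where status is "optimal" or "infeasible".
    For a nonempty finite S the infimum is always attained by some y in S.
    """
    if not S:                       # S = the empty set: infeasible
        return ("infeasible", None, None)
    y = min(S, key=f)               # an optimal solution
    return ("optimal", y, f(y))     # f(y) is the optimum

# minimize f(x) = (x - 3)^2 over S = {0, 1, ..., 5}
status, y, val = solve_finite(lambda x: (x - 3) ** 2, list(range(6)))
print(status, y, val)   # optimal 3 0
```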

Constrained optimization in R^n

These problems are of the form

    min{f(x) : x ∈ R^n satisfies constraints}

so here the feasible set is

    S := {x ∈ R^n : x satisfies constraints}.

Typical constraints are:

- inequality constraints: g(x) ≤ 0 for some g : R^n → R; in particular, linear inequalities like 3x_1 + 5x_2 + 2x_n ≤ 7
- equality constraints: h(x) = 0 for some h : R^n → R; in particular, linear equations like 3x_1 + 5x_2 + 2x_n = 7
- integrality constraints: x ∈ Z^n

The entries x_1, ..., x_n ∈ R of x are called decision variables.

Linear optimization

A linear optimization problem has a linear objective and linear inequality constraints, e.g.

    max{cx : Ax ≤ b, x ∈ R^n}

where A is a matrix, b a column vector, and c a row vector.

[figure: a polygonal feasible region bounded by constraints a_1, ..., a_5, with objective direction c and level line cx = d]

The Diet problem

given: n food items and m nutrients, with the j-th food item containing a_ij units of nutrient i per unit; required units of nutrient i: b_i; cost per unit of food j: c_j.

find: amounts x_j of each food j so that the intake of each nutrient is sufficient and the total cost is minimal.

This is a linear optimization problem:

    min{ Σ_j c_j x_j : Σ_j a_ij x_j ≥ b_i for all i, x ≥ 0 }

or

    min{cx : Ax ≥ b, x ≥ 0}.
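In practice one would hand min{cx : Ax ≥ b, x ≥ 0} to an LP solver; as an illustration only, the sketch below solves a made-up 2-food, 2-nutrient instance (all numbers are my own, not from the lecture) by brute-force vertex enumeration, using the fact that an optimum of an LP, when attained, occurs at a vertex of the feasible region.

```python
import itertools
import numpy as np

# Hypothetical diet instance: a[i][j] = units of nutrient i per unit of food j.
A = np.array([[2.0, 1.0],    # nutrient 1: need at least 8 units
              [1.0, 3.0]])   # nutrient 2: need at least 9 units
b = np.array([8.0, 9.0])
c = np.array([3.0, 2.0])     # cost per unit of each food

# With 2 variables, every vertex is the intersection of two of the lines
# A x = b, x1 = 0, x2 = 0; enumerate all pairs and keep the cheapest
# feasible intersection point.
lines = [(A[0], b[0]), (A[1], b[1]),
         (np.array([1.0, 0.0]), 0.0), (np.array([0.0, 1.0]), 0.0)]

best_x, best_cost = None, np.inf
for (a1, r1), (a2, r2) in itertools.combinations(lines, 2):
    M = np.array([a1, a2])
    if abs(np.linalg.det(M)) < 1e-12:
        continue                                  # parallel: no vertex
    x = np.linalg.solve(M, np.array([r1, r2]))
    feasible = np.all(A @ x >= b - 1e-9) and np.all(x >= -1e-9)
    if feasible and c @ x < best_cost:
        best_x, best_cost = x, c @ x

print(best_x, best_cost)   # expect x = (3, 2) with cost 13
```

Real solvers (the simplex method, interior-point methods) do far better than enumerating all vertex pairs, but the geometry is the same.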

Integer linear optimization

Integer linear optimization is like linear optimization, but with integrality constraints:

    max{cx : Ax ≤ b, x ∈ Z^n}

[figure: the polygonal feasible region with constraints a_1, ..., a_5, objective direction c and level line cx = d, now restricted to the integer points inside]

The Knapsack problem

given: n items; weight of the i-th item: a_i ∈ R; maximum weight b ∈ R; value of the i-th item: c_i ∈ R.

find: a set of items X ⊆ {1, ..., n} of total weight at most b, maximizing the total value.

This is an integer linear optimization problem:

    max{ Σ_i c_i x_i : Σ_i a_i x_i ≤ b, 0 ≤ x_i ≤ 1 for each i, x ∈ Z^n }

or

    max{cx : ax ≤ b, 0 ≤ x ≤ 1, x ∈ Z^n}.

Note: X = {i : x_i = 1}.
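Since each x_i is 0 or 1, a knapsack instance with few items can be solved by trying every subset X. The sketch below uses a small made-up instance (weights, values and capacity are my own illustration); real solvers use branch and bound or dynamic programming instead of full enumeration.

```python
from itertools import combinations

# A small hypothetical instance: weights a_i, values c_i, capacity b.
a = [3, 4, 5]     # weights
c = [4, 5, 6]     # values
b = 7             # maximum total weight

n = len(a)
best_value, best_X = 0, set()
# Enumerate all subsets X of {0, ..., n-1}: 2^n candidates,
# feasible iff the total weight is at most b.
for k in range(n + 1):
    for X in combinations(range(n), k):
        if sum(a[i] for i in X) <= b:
            value = sum(c[i] for i in X)
            if value > best_value:
                best_value, best_X = value, set(X)

print(best_X, best_value)   # items {0, 1}: weight 3 + 4 = 7, value 9
```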

Convex optimization

A convex optimization problem is a problem of the form

    min{f(x) : g_1(x) ≤ 0, ..., g_n(x) ≤ 0, x ∈ R^n}

where f, g_i : R^n → R are convex functions.

[figure: a convex feasible region S with a level curve f(x) = d]
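What makes convexity useful algorithmically is that every local minimum is a global one, so simple local descent suffices. As a sketch (gradient descent is not on this slide; the function, step size and iteration count are my own illustration), fixed-step gradient descent on an unconstrained convex function:

```python
# Gradient descent on the convex function f(x) = (x1 - 1)^2 + (x2 + 2)^2,
# whose unique (hence global) minimizer is (1, -2).

def grad(x):
    # gradient of f at x
    return [2 * (x[0] - 1), 2 * (x[1] + 2)]

x = [0.0, 0.0]
step = 0.1
for _ in range(200):
    g = grad(x)
    x = [x[0] - step * g[0], x[1] - step * g[1]]

print(x)   # close to [1.0, -2.0]
```

Each step multiplies the error in each coordinate by 0.8, so after 200 steps the iterate is numerically at the optimum.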

Certifying optimality

Consider the linear optimization problem max{cx : Ax ≤ b, x ∈ R^n}. If x̂ ∈ R^n is a feasible solution, then

    x̂ is optimal
    ⟺ max{cx : Ax ≤ b, x ∈ R^n} ≤ cx̂
    ⟺ there is no x ∈ R^n such that Ax ≤ b and cx > cx̂.

How to show that a system of linear inequalities has no solution?
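One concrete way to certify optimality (anticipating duality, which is not stated on this slide) is a multiplier vector y: if y ≥ 0 and yA = c, then cx = y(Ax) ≤ yb for every feasible x, so a feasible x̂ with cx̂ = yb must be optimal. The instance below is my own illustration:

```python
import numpy as np

# max{cx : Ax <= b} on a made-up instance whose optimum is x = (1, 1).
A = np.array([[1.0, 0.0],
              [0.0, 1.0]])
b = np.array([1.0, 1.0])
c = np.array([1.0, 1.0])

x_hat = np.array([1.0, 1.0])   # candidate optimal solution
y = np.array([1.0, 1.0])       # candidate certificate (row vector)

# y >= 0 and yA = c imply cx = y(Ax) <= yb for every feasible x;
# combined with feasibility of x_hat and c x_hat = yb, x_hat is optimal.
feasible = np.all(A @ x_hat <= b + 1e-9)
certificate = np.all(y >= 0) and np.allclose(y @ A, c)
optimal = feasible and certificate and np.isclose(c @ x_hat, y @ b)
print(optimal)   # True
```

Checking such a certificate only needs matrix arithmetic, regardless of how hard the original maximization was to carry out.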

Fredholm's Alternative

Theorem. Let A be an m × n matrix, and let b ∈ R^m. Either:
(i) there exists a column vector x ∈ R^n such that Ax = b; or
(ii) there exists a row vector y ∈ R^m such that yA = 0 and yb ≠ 0.

In other words, there are x_1, ..., x_n ∈ R such that

    a_11 x_1 + ... + a_1n x_n = b_1
    ...
    a_m1 x_1 + ... + a_mn x_n = b_m,

or there exist y_1, ..., y_m ∈ R such that the combination

    y_1 · (a_11 x_1 + ... + a_1n x_n = b_1)
    ...
    + y_m · (a_m1 x_1 + ... + a_mn x_n = b_m)

yields the contradictory equation 0·x_1 + ... + 0·x_n = yb with yb ≠ 0.
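Both sides of the alternative can be checked numerically. In the sketch below (instance and certificate chosen by hand for illustration), Ax = b is unsolvable because the rows of A are dependent while the entries of b are not in the same proportion, and y exhibits exactly the contradictory combination of the two equations:

```python
import numpy as np

# Ax = b has no solution: row 2 of A is twice row 1, but b2 != 2 * b1.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b = np.array([1.0, 1.0])

# Least squares finds the best possible Ax; here it cannot reach b.
x, _, _, _ = np.linalg.lstsq(A, b, rcond=None)
solvable = np.allclose(A @ x, b)
print(solvable)   # False, so alternative (ii) promises a certificate y

# Certificate: y with yA = 0 and yb != 0
# (2 times the first equation minus the second gives 0 = 1).
y = np.array([2.0, -1.0])
print(y @ A, y @ b)   # yA = [0. 0.], yb = 1.0
```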

Farkas' Lemma

Theorem. Let A be an m × n matrix, and let b ∈ R^m. Either:
(i) there exists a column vector x ∈ R^n such that Ax ≤ b; or
(ii) there exists a row vector y ∈ R^m such that y ≥ 0, yA = 0 and yb < 0.

A variant of this theorem is:

Theorem. Let A be an m × n matrix, and let b ∈ R^m. Either:
(i) there exists a column vector x ∈ R^n such that Ax = b and x ≥ 0; or
(ii) there exists a row vector y ∈ R^m such that yA ≥ 0 and yb < 0.

Farkas' Lemma is the key to understanding linear optimization problems.
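A Farkas certificate is easy to verify once found (finding one is itself a linear programming task). The toy instance below is my own: the system demands x ≤ 1 and x ≥ 2 simultaneously, and y combines the two inequalities into the contradiction 0 ≤ −1.

```python
import numpy as np

# The system x <= 1 and -x <= -2 (i.e. x >= 2) is clearly infeasible.
A = np.array([[1.0],
              [-1.0]])
b = np.array([1.0, -2.0])

# Farkas certificate of infeasibility: y >= 0 with yA = 0 and yb < 0.
# If some x satisfied Ax <= b, then 0 = (yA)x = y(Ax) <= yb < 0,
# a contradiction; hence no such x exists.
y = np.array([1.0, 1.0])
assert np.all(y >= 0)
assert np.allclose(y @ A, 0)
print(y @ b)   # -1.0 < 0: certifies that Ax <= b has no solution
```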

Kronecker's Approximation Theorem

The following theorem deals with Diophantine linear equations.

Theorem. Let A be an m × n rational matrix, and let b ∈ Q^m. Then either:
(i) there exists a column vector x ∈ Z^n such that Ax = b; or
(ii) there exists a row vector y ∈ Q^m such that yA ∈ Z^n and yb ∉ Z.

To handle integer linear optimization problems, we would need a similar theorem for Diophantine linear inequalities. Such a theorem is known only for very special matrices A.
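The smallest possible illustration of alternative (ii) (a 1 × 1 instance of my own): 2x = 1 has the rational solution x = 1/2 but no integer solution, and y = 1/2 certifies this, since yA is an integer while yb is not.

```python
from fractions import Fraction

# 2x = 1: solvable over Q (x = 1/2) but not over Z.
a = 2                  # the 1x1 matrix A
b = 1
y = Fraction(1, 2)     # candidate certificate

# yA integral but yb not integral: if x in Z solved ax = b, then
# y*a*x = y*b would make y*b an integer, which it is not.
assert (y * a).denominator == 1   # yA = 1 is an integer
print(y * b)                      # 1/2 is not an integer: no solution in Z
```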

Homework, AIMMS

Weekly schedule and homework: http://www.win.tue.nl/~rudi/2wo.html

To download AIMMS:
login: aimms
password: AIMMS for TU/e