Reasoning Under Uncertainty: Variable Elimination Example




Reasoning Under Uncertainty: Variable Elimination Example
CPSC 322 Lecture 29, March 26, 2007. Textbook §9.5.

Lecture Overview
1. Recap
2. More Variable Elimination
3. Variable Elimination Example

Summing out variables

Our second operation: we can sum out a variable, say X_1 with domain {v_1, ..., v_k}, from factor f(X_1, ..., X_j), resulting in a factor on X_2, ..., X_j defined by:

(Σ_{X_1} f)(X_2, ..., X_j) = f(X_1 = v_1, ..., X_j) + ··· + f(X_1 = v_k, ..., X_j)
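As a sketch of this operation in code: the (vars, table) representation of a factor and the function name below are illustrative choices of mine, not from the lecture.

```python
def sum_out(var, factor):
    """Sum `var` out of `factor`, leaving a factor on the remaining variables.

    A factor is a (vars, table) pair: vars is a tuple of variable names,
    and table maps tuples of values (one per variable) to numbers.
    """
    vars_, table = factor
    i = vars_.index(var)
    out = {}
    for assignment, value in table.items():
        key = assignment[:i] + assignment[i + 1:]  # drop var's slot
        out[key] = out.get(key, 0.0) + value       # add over var's values
    return (vars_[:i] + vars_[i + 1:], out)

# Example: f(X1, X2) with both variables binary.
f = (("X1", "X2"), {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4})
g = sum_out("X1", f)
# g is a factor on X2 alone: g(0) = 0.1 + 0.3, g(1) = 0.2 + 0.4
```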

Multiplying factors

Our third operation: factors can be multiplied together. The product of factor f_1(X, Y) and f_2(Y, Z), where Y are the variables in common, is the factor (f_1 × f_2)(X, Y, Z) defined by:

(f_1 × f_2)(X, Y, Z) = f_1(X, Y) f_2(Y, Z)
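A sketch of the product operation under the same assumptions: factors are (vars, table) pairs, where vars is a tuple of variable names and table maps value tuples to numbers (this representation is mine, not the lecture's).

```python
def multiply(f1, f2):
    """Pointwise product of two factors that may share variables."""
    v1, t1 = f1
    v2, t2 = f2
    extra = tuple(v for v in v2 if v not in v1)  # the Z variables, only in f2
    table = {}
    for a1, x1 in t1.items():
        env = dict(zip(v1, a1))
        for a2, x2 in t2.items():
            # Multiply only entries that agree on the shared variables Y.
            if all(env.get(v, a2[i]) == a2[i] for i, v in enumerate(v2)):
                key = a1 + tuple(a2[v2.index(v)] for v in extra)
                table[key] = x1 * x2
    return (v1 + extra, table)

# Example: f1(X, Y) * f2(Y, Z) gives a factor on (X, Y, Z).
f1 = (("X", "Y"), {(0, 0): 0.5, (0, 1): 0.5, (1, 0): 0.2, (1, 1): 0.8})
f2 = (("Y", "Z"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.3, (1, 1): 0.7})
f3 = multiply(f1, f2)
# e.g. the entry of f3 at (X=0, Y=0, Z=0) is f1(0, 0) * f2(0, 0)
```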

Probability of a conjunction

Suppose the variables of the belief network are X_1, ..., X_n, with {X_1, ..., X_n} = {Q} ∪ {Y_1, ..., Y_j} ∪ {Z_1, ..., Z_k}, where:
- Q is the query variable;
- {Y_1, ..., Y_j} is the set of observed variables;
- {Z_1, ..., Z_k} is the rest of the variables, neither queried nor observed.

What we want to compute: P(Q, Y_1 = v_1, ..., Y_j = v_j). We can compute this by summing out each variable Z_1, ..., Z_k. We sum out these variables one at a time; the order in which we do this is called our elimination ordering.

P(Q, Y_1 = v_1, ..., Y_j = v_j) = Σ_{Z_k} ··· Σ_{Z_1} P(X_1, ..., X_n)|_{Y_1 = v_1, ..., Y_j = v_j}

Probability of a conjunction

What we know: the factors P(X_i | pX_i). Using the chain rule and the definition of a belief network, we can write P(X_1, ..., X_n) as ∏_{i=1}^{n} P(X_i | pX_i). Thus:

P(Q, Y_1 = v_1, ..., Y_j = v_j)
  = Σ_{Z_k} ··· Σ_{Z_1} P(X_1, ..., X_n)|_{Y_1 = v_1, ..., Y_j = v_j}
  = Σ_{Z_k} ··· Σ_{Z_1} ∏_{i=1}^{n} P(X_i | pX_i)|_{Y_1 = v_1, ..., Y_j = v_j}

Computing sums of products

Computation in belief networks thus reduces to computing sums of products.
- It takes 14 multiplications or additions to evaluate the expression ab + ac + ad + aeh + afh + agh.
- How can this expression be evaluated more efficiently? Factor out the a and then the h, giving a(b + c + d + h(e + f + g)). This takes only 7 multiplications or additions.

How can we compute Σ_{Z_1} ∏_{i=1}^{n} P(X_i | pX_i) efficiently? Factor out those terms that don't involve Z_1:

( ∏_{i : Z_1 ∉ {X_i} ∪ pX_i} P(X_i | pX_i) ) × ( Σ_{Z_1} ∏_{i : Z_1 ∈ {X_i} ∪ pX_i} P(X_i | pX_i) )
  (terms that do not involve Z_1)               (terms that involve Z_1)
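The savings can be checked numerically. A quick sketch with arbitrary test values for the symbols in the expression above:

```python
# Arbitrary test values for the symbols a through h.
a, b, c, d, e, f, g, h = 2.0, 3.0, 5.0, 7.0, 11.0, 13.0, 17.0, 19.0

# Expanded form: 9 multiplications + 5 additions = 14 operations.
expanded = a*b + a*c + a*d + a*e*h + a*f*h + a*g*h

# Factored form: 2 multiplications + 5 additions = 7 operations.
factored = a * (b + c + d + h * (e + f + g))

assert expanded == factored  # same value, half the work
```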


Summing out a variable efficiently

To sum out a variable Z_j from a product f_1, ..., f_k of factors:
- Partition the factors into those that don't contain Z_j, say f_1, ..., f_i, and those that do contain Z_j, say f_{i+1}, ..., f_k.
- We know: Σ_{Z_j} f_1 × ··· × f_k = f_1 × ··· × f_i × ( Σ_{Z_j} f_{i+1} × ··· × f_k ).
- Σ_{Z_j} f_{i+1} × ··· × f_k is a new factor; let's call it f′. Now we have: Σ_{Z_j} f_1 × ··· × f_k = f_1 × ··· × f_i × f′.
- Store f′ explicitly, and discard f_{i+1}, ..., f_k. Now we've summed out Z_j.

Variable elimination algorithm

To compute P(Q | Y_1 = v_1 ∧ ... ∧ Y_j = v_j):
1. Construct a factor for each conditional probability.
2. Set the observed variables to their observed values.
3. For each of the other variables Z_i ∈ {Z_1, ..., Z_k}, sum out Z_i.
4. Multiply the remaining factors.
5. Normalize by dividing the resulting factor f(Q) by Σ_Q f(Q).
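The whole algorithm can be sketched end to end. This is a minimal, self-contained illustration, not the textbook's code: the (vars, table) factor representation, all function names, and the tiny two-node Fire/Alarm network with made-up numbers at the bottom are my own assumptions.

```python
def multiply(f1, f2):
    """Pointwise product of two factors."""
    v1, t1 = f1
    v2, t2 = f2
    extra = tuple(v for v in v2 if v not in v1)
    table = {}
    for a1, x1 in t1.items():
        env = dict(zip(v1, a1))
        for a2, x2 in t2.items():
            # Combine only assignments that agree on the shared variables.
            if all(env.get(v, a2[i]) == a2[i] for i, v in enumerate(v2)):
                key = a1 + tuple(a2[v2.index(v)] for v in extra)
                table[key] = x1 * x2
    return (v1 + extra, table)

def sum_out(var, factor):
    """Sum a variable out of a factor."""
    vars_, t = factor
    i = vars_.index(var)
    out = {}
    for a, x in t.items():
        out[a[:i] + a[i + 1:]] = out.get(a[:i] + a[i + 1:], 0.0) + x
    return (vars_[:i] + vars_[i + 1:], out)

def observe(factor, var, value):
    """Restrict a factor to var = value (no-op if var is not in its scope)."""
    vars_, t = factor
    if var not in vars_:
        return factor
    i = vars_.index(var)
    out = {a[:i] + a[i + 1:]: x for a, x in t.items() if a[i] == value}
    return (vars_[:i] + vars_[i + 1:], out)

def variable_elimination(factors, evidence, elim_order):
    """Compute the posterior over whatever variable is neither observed
    nor eliminated (the query variable Q is implicit)."""
    # Step 2: set the observed variables to their observed values.
    for var, val in evidence.items():
        factors = [observe(f, var, val) for f in factors]
    # Step 3: sum out each remaining non-query variable in turn.
    # Partition the factors, multiply only those whose scope contains
    # the variable, and replace them with the summed-out product.
    for z in elim_order:
        with_z = [f for f in factors if z in f[0]]
        if not with_z:
            continue
        prod = with_z[0]
        for f in with_z[1:]:
            prod = multiply(prod, f)
        factors = [f for f in factors if z not in f[0]] + [sum_out(z, prod)]
    # Step 4: multiply the remaining factors.
    result = factors[0]
    for f in factors[1:]:
        result = multiply(result, f)
    # Step 5: normalize.
    total = sum(result[1].values())
    return (result[0], {a: x / total for a, x in result[1].items()})

# Tiny example with made-up numbers: a two-node network Fire -> Alarm.
p_fire = (("Fire",), {(0,): 0.99, (1,): 0.01})
p_alarm = (("Fire", "Alarm"),
           {(0, 0): 0.95, (0, 1): 0.05, (1, 0): 0.1, (1, 1): 0.9})
# Query P(Fire | Alarm = 1); here nothing is left to sum out.
posterior = variable_elimination([p_fire, p_alarm], {"Alarm": 1},
                                 elim_order=[])
```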


Variable elimination example

Compute P(G | H = h_1). Elimination order: A, C, E, H, I, B, D, F.

[Belief network figure: nodes A, B, C, D, E, F, G, H, I with arcs A→B, B→D, C→D, C→E, D→F, E→G, F→G, G→H, G→I.]

P(G, H) = Σ_{A,B,C,D,E,F,I} P(A, B, C, D, E, F, G, H, I)
        = Σ_{A,B,C,D,E,F,I} P(A) P(B|A) P(C) P(D|B,C) P(E|C) P(F|D) P(G|F,E) P(H|G) P(I|G)

Eliminate A:
P(G, H) = Σ_{B,C,D,E,F,I} f_1(B) P(C) P(D|B,C) P(E|C) P(F|D) P(G|F,E) P(H|G) P(I|G)
where f_1(B) := Σ_{a ∈ dom(A)} P(A = a) P(B | A = a)

Eliminate C:
P(G, H) = Σ_{B,D,E,F,I} f_1(B) f_2(B, D, E) P(F|D) P(G|F,E) P(H|G) P(I|G)
where f_2(B, D, E) := Σ_{c ∈ dom(C)} P(C = c) P(D | B, C = c) P(E | C = c)

Eliminate E:
P(G, H) = Σ_{B,D,F,I} f_1(B) f_3(B, D, F, G) P(F|D) P(H|G) P(I|G)
where f_3(B, D, F, G) := Σ_{e ∈ dom(E)} f_2(B, D, E = e) P(G | F, E = e)

Observe H = h_1:
P(G, H = h_1) = Σ_{B,D,F,I} f_1(B) f_3(B, D, F, G) P(F|D) f_4(G) P(I|G)
where f_4(G) := P(H = h_1 | G)

Eliminate I:
P(G, H = h_1) = Σ_{B,D,F} f_1(B) f_3(B, D, F, G) P(F|D) f_4(G) f_5(G)
where f_5(G) := Σ_{i ∈ dom(I)} P(I = i | G)

Eliminate B:
P(G, H = h_1) = Σ_{D,F} f_6(D, F, G) P(F|D) f_4(G) f_5(G)
where f_6(D, F, G) := Σ_{b ∈ dom(B)} f_1(B = b) f_3(B = b, D, F, G)

Eliminate D:
P(G, H = h_1) = Σ_F f_7(F, G) f_4(G) f_5(G)
where f_7(F, G) := Σ_{d ∈ dom(D)} f_6(D = d, F, G) P(F | D = d)

Eliminate F:
P(G, H = h_1) = f_8(G) f_4(G) f_5(G)
where f_8(G) := Σ_{f ∈ dom(F)} f_7(F = f, G)

Normalize:
P(G | H = h_1) = P(G, H = h_1) / Σ_{g ∈ dom(G)} P(G = g, H = h_1)

What good was Conditional Independence?

That's great... but it looks incredibly painful for large graphs. And... why did we bother learning conditional independence? Does it help us at all?
- Yes: we use the chain rule decomposition right at the beginning.

Can we use our knowledge of conditional independence to make this calculation even simpler?
- Yes: there are some variables that we don't have to sum out. Intuitively, they're the ones that are pre-summed-out in our tables. Example: summing out I in the elimination example above.

One Last Trick

One last trick to simplify calculations: we can repeatedly eliminate all leaf nodes that are neither observed nor queried, until we reach a fixed point.

[Figure: a three-node network over Fire, Alarm, and Smoke.]

Can we justify that on a three-node graph over Fire, Alarm, and Smoke when we ask for:
- P(Fire)?
- P(Fire | Alarm)?
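A sketch of the justification for the P(Fire) case, assuming the arcs run Fire → Alarm and Fire → Smoke (the transcript does not preserve the arc directions): an unobserved, unqueried leaf contributes a factor that sums to 1, so it can simply be dropped.

```latex
P(\mathit{Fire})
  = \sum_{\mathit{Alarm}} \sum_{\mathit{Smoke}}
      P(\mathit{Fire})\, P(\mathit{Alarm} \mid \mathit{Fire})\,
      P(\mathit{Smoke} \mid \mathit{Fire})
  = P(\mathit{Fire})
      \underbrace{\Big(\sum_{\mathit{Alarm}} P(\mathit{Alarm} \mid \mathit{Fire})\Big)}_{=\,1}
      \underbrace{\Big(\sum_{\mathit{Smoke}} P(\mathit{Smoke} \mid \mathit{Fire})\Big)}_{=\,1}
  = P(\mathit{Fire})
```

For P(Fire | Alarm) the same argument still prunes Smoke, but Alarm is observed and must stay.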