# Bayesian Networks (Chapter 14, Sections 1, 2, 4)



A Bayesian network is a simple, graphical notation for conditional independence assertions, and hence for compact specification of full joint distributions.

Syntax:

- a set of nodes, one per variable
- a directed, acyclic graph (a link means "directly influences"); if there is a link from X to Y, X is said to be a parent of Y
- a conditional distribution for each node given its parents: P(Xi | Parents(Xi))

In the simplest case, the conditional distribution is represented as a conditional probability table (CPT) giving the distribution over Xi for each combination of parent values. The topology of the network encodes conditional independence assertions.

## Example

I'm at work, neighbor John calls to say my alarm is ringing, but neighbor Mary doesn't call. Sometimes the alarm is set off by minor earthquakes. Is there a burglar?

Variables: Burglary, Earthquake, Alarm, JohnCalls, MaryCalls.

Weather is independent of the other variables; Toothache and Catch are conditionally independent given Cavity.

Network topology reflects "causal" knowledge:

- A burglar can set the alarm off
- An earthquake can set the alarm off
- The alarm can cause Mary to call
- The alarm can cause John to call

## Compactness

A CPT for a Boolean variable Xi with k Boolean parents has 2^k rows, one for each combination of parent values. Each row requires one number p for Xi = true (the number for Xi = false is just 1 - p). If each variable has no more than k parents, the complete network requires O(n · 2^k) numbers, i.e., it grows linearly with n, vs. O(2^n) for the full joint distribution. For the burglary net, 1 + 1 + 4 + 2 + 2 = 10 numbers (vs. 2^5 - 1 = 31).
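The parameter count above can be checked with a short sketch. The dictionary encoding and the name `burglary_net` are my own illustrative choices; the CPT values are the standard alarm-network numbers.

```python
# A minimal dict-based encoding of the burglary network.
# Each node maps a tuple of parent values to P(node = True | parents).
burglary_net = {
    "Burglary":   {(): 0.001},
    "Earthquake": {(): 0.002},
    "Alarm": {(True, True): 0.95, (True, False): 0.94,
              (False, True): 0.29, (False, False): 0.001},
    "JohnCalls":  {(True,): 0.90, (False,): 0.05},
    "MaryCalls":  {(True,): 0.70, (False,): 0.01},
}

# Independent probabilities in the BN vs. in the full joint over 5 Booleans:
bn_params = sum(len(cpt) for cpt in burglary_net.values())  # 1+1+4+2+2
joint_params = 2 ** 5 - 1
print(bn_params, joint_params)  # 10 31
```

Each CPT row stores only P(true), since P(false) is determined; that is why five Boolean variables need 10 numbers here rather than 31.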

## Semantics

The full joint distribution is defined as the product of the local conditional distributions:

P(X1, ..., Xn) = Π i=1..n P(Xi | Parents(Xi))

Thus each entry in the joint distribution is represented by the product of the appropriate elements of the conditional probability tables in the Bayesian network. E.g.,

P(j ∧ m ∧ a ∧ ¬b ∧ ¬e) = P(j | a) P(m | a) P(a | ¬b, ¬e) P(¬b) P(¬e) = 0.90 × 0.70 × 0.001 × 0.999 × 0.998 ≈ 0.00063

## Back to the dentist example...

We now represent the world of the dentist D using three propositions: Cavity, Toothache, and PCatch. D's belief state consists of 2^3 = 8 states, each with some probability: {Cavity ∧ Toothache ∧ PCatch, Cavity ∧ Toothache ∧ ¬PCatch, ...}. The belief state is defined by the full joint probability of the propositions.

## Probabilistic Inference

P(Cavity ∨ Toothache) = 0.108 + 0.012 + 0.072 + 0.008 + 0.016 + 0.064 = 0.28

P(Toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2

Marginalization: P(c) = Σt Σpc P(c ∧ t ∧ pc), using the conventions that c = Cavity or ¬Cavity and that Σt is the sum over t ∈ {Toothache, ¬Toothache}.
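The product-of-CPT-entries calculation can be reproduced numerically. A minimal sketch; the variable names are mine, and the factor values are the standard alarm-network CPT entries.

```python
# Joint entry P(j, m, a, ¬b, ¬e) as a product of local CPT entries.
p_j_given_a = 0.90            # P(JohnCalls | Alarm)
p_m_given_a = 0.70            # P(MaryCalls | Alarm)
p_a_given_not_b_not_e = 0.001 # P(Alarm | ¬Burglary, ¬Earthquake)
p_not_b = 1 - 0.001           # P(¬Burglary)
p_not_e = 1 - 0.002           # P(¬Earthquake)

p = (p_j_given_a * p_m_given_a * p_a_given_not_b_not_e
     * p_not_b * p_not_e)
print(round(p, 6))  # 0.000628
```

Note how small the entry is: each factor is read off a CPT, so no 2^5-entry joint table is ever materialized.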

## Conditional Probability

P(A ∧ B) = P(A | B) P(B) = P(B | A) P(A)

P(A | B) is the posterior probability of A given B.

P(Cavity | Toothache) = P(Cavity ∧ Toothache) / P(Toothache) = (0.108 + 0.012) / (0.108 + 0.012 + 0.016 + 0.064) = 0.6

Interpretation: after observing Toothache, the patient is no longer an average one, and the prior probability of Cavity is no longer valid. P(Cavity | Toothache) is calculated by keeping the ratios of the probabilities of the four cases unchanged and normalizing their sum to 1:

P(¬Cavity | Toothache) = P(¬Cavity ∧ Toothache) / P(Toothache) = (0.016 + 0.064) / (0.108 + 0.012 + 0.016 + 0.064) = 0.4

With α the normalization constant:

P(C | toothache) = α P(C ∧ toothache) = α Σpc P(C ∧ toothache ∧ pc) = α [(0.108, 0.016) + (0.012, 0.064)] = α (0.12, 0.08) = (0.6, 0.4)

More generally:

P(A ∧ B ∧ C) = P(A | B, C) P(B ∧ C) = P(A | B, C) P(B | C) P(C)

P(Cavity) = Σt Σpc P(Cavity ∧ t ∧ pc) = Σt Σpc P(Cavity | t, pc) P(t ∧ pc)

P(c) = Σt Σpc P(c ∧ t ∧ pc) = Σt Σpc P(c | t, pc) P(t ∧ pc)

## Independence

Two random variables A and B are independent if P(A ∧ B) = P(A) P(B), hence if P(A | B) = P(A). Two random variables A and B are independent given C if P(A ∧ B | C) = P(A | C) P(B | C), hence if P(A | B, C) = P(A | C).

## Issues

If a state is described by n propositions, then a belief state contains 2^n states (possibly, some have probability 0). Modeling difficulty: many numbers must be entered in the first place. Computational issue: memory size and time.
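The normalization trick can be made concrete. A sketch assuming the standard full-joint values of the dentist example; the helper name `prob_cavity_given_toothache` is mine.

```python
# P(Cavity | toothache) by summing out the hidden variable PCatch
# and normalizing, from the full joint distribution.
joint = {  # (cavity, toothache, pcatch) -> probability
    (True, True, True): 0.108,   (True, True, False): 0.012,
    (True, False, True): 0.072,  (True, False, False): 0.008,
    (False, True, True): 0.016,  (False, True, False): 0.064,
    (False, False, True): 0.144, (False, False, False): 0.576,
}

def prob_cavity_given_toothache(joint):
    # Marginalize out PCatch for each value of Cavity, given toothache.
    p_c = sum(p for (c, t, pc), p in joint.items() if c and t)       # 0.12
    p_nc = sum(p for (c, t, pc), p in joint.items() if not c and t)  # 0.08
    alpha = 1.0 / (p_c + p_nc)  # normalization constant
    return alpha * p_c, alpha * p_nc

pc, pnc = prob_cavity_given_toothache(joint)
print(round(pc, 3), round(pnc, 3))  # 0.6 0.4
```

The ratios 0.12 : 0.08 are left untouched; α merely rescales them so they sum to 1, exactly as the slide describes.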

## Bayesian Network (dentist example)

Toothache and PCatch are independent given Cavity (or ¬Cavity), but this relation is hidden in the numbers! [Verify this.] Bayesian networks explicitly represent independence among propositions to reduce the number of probabilities defining a belief state.

Notice that Cavity is the cause of both Toothache and PCatch; the network represents these causality links explicitly:

- give the prior probability distribution of Cavity
- give the conditional probability tables of Toothache and PCatch: P(t | c) and P(pc | c)

This takes 5 probabilities, instead of 7.

## A More Complex BN

Intuitive meaning of an arc from x to y: x has direct influence on y. The network is a directed acyclic graph, with causes pointing to effects. Size of the CPT for a node with k parents: 2^k. For the burglary network, 10 probabilities, instead of 31.

## What does the BN encode?

Each of the beliefs JohnCalls and MaryCalls is independent of Burglary and Earthquake given Alarm or ¬Alarm. For example, John does not observe any burglaries directly. The beliefs JohnCalls and MaryCalls are independent given Alarm or ¬Alarm. For instance, the reasons why John and Mary may not call if there is an alarm are unrelated. In general, a node is independent of its non-descendants given its parents.
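The bracketed "[Verify this]" can be done mechanically. A sketch over the same full joint as before; the `marginal` helper is my own.

```python
# Verify that Toothache and PCatch are conditionally independent given
# Cavity, directly from the full joint of the dentist example.
joint = {  # (cavity, toothache, pcatch) -> probability
    (True, True, True): 0.108,   (True, True, False): 0.012,
    (True, False, True): 0.072,  (True, False, False): 0.008,
    (False, True, True): 0.016,  (False, True, False): 0.064,
    (False, False, True): 0.144, (False, False, False): 0.576,
}

def marginal(pred):
    """Sum of joint entries whose (c, t, pc) assignment satisfies pred."""
    return sum(p for k, p in joint.items() if pred(*k))

for cv in (True, False):  # condition on Cavity and on ¬Cavity
    p_c = marginal(lambda c, t, pc: c == cv)
    p_t_pc = marginal(lambda c, t, pc: c == cv and t and pc) / p_c
    p_t = marginal(lambda c, t, pc: c == cv and t) / p_c
    p_pc = marginal(lambda c, t, pc: c == cv and pc) / p_c
    # P(t ∧ pc | c) should equal P(t | c) * P(pc | c)
    assert abs(p_t_pc - p_t * p_pc) < 1e-9
```

Both conditional cases check out (e.g. 0.108/0.2 = 0.54 = 0.6 × 0.9), even though no entry of the joint table states this independence directly.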

## Locally Structured World

A world is locally structured (or sparse) if each of its components interacts directly with relatively few other components. In a sparse world, the CPTs are small and the BN contains many fewer probabilities than the full joint distribution. If the number of entries in each CPT is bounded, i.e., O(1), then the number of probabilities in a BN is linear in n, the number of propositions, instead of 2^n for the joint distribution.

But does a BN represent a belief state? In other words, can we compute the full joint distribution of the propositions from it?

## Calculation of Joint Probability

P(J ∧ M ∧ A ∧ ¬B ∧ ¬E) = ?

P(J ∧ M ∧ A ∧ ¬B ∧ ¬E)
= P(J ∧ M | A, ¬B, ¬E) × P(A ∧ ¬B ∧ ¬E)
= P(J | A, ¬B, ¬E) × P(M | A, ¬B, ¬E) × P(A ∧ ¬B ∧ ¬E)   (J and M are independent given A)

P(J | A, ¬B, ¬E) = P(J | A)   (J and ¬B ∧ ¬E are independent given A)
P(M | A, ¬B, ¬E) = P(M | A)

P(A ∧ ¬B ∧ ¬E) = P(A | ¬B, ¬E) × P(¬B | ¬E) × P(¬E) = P(A | ¬B, ¬E) × P(¬B) × P(¬E)   (¬B and ¬E are independent)

Therefore:

P(J ∧ M ∧ A ∧ ¬B ∧ ¬E) = P(J | A) P(M | A) P(A | ¬B, ¬E) P(¬B) P(¬E) = 0.9 × 0.7 × 0.001 × 0.999 × 0.998 ≈ 0.00063

In general, the BN plays the role of the full joint distribution table:

P(x1 ∧ x2 ∧ ... ∧ xn) = Π i=1..n P(xi | parents(Xi))

## Querying the BN

Since a BN defines the full joint distribution of a set of propositions, it represents a belief state. Consider the two-node network Cavity → Toothache, with prior P(C) = 0.1 and CPT P(t | c). The BN gives P(t | c); what about P(c | t)?

P(Cavity | t) = P(Cavity ∧ t) / P(t) = P(t | Cavity) P(Cavity) / P(t)   [Bayes' rule]

so P(c | t) = α P(t | c) P(c). Querying a BN is just applying the trivial Bayes' rule on a larger scale.

## Exact Inference in Bayesian Networks

Let's generalize that last example a little: suppose we are given that JohnCalls and MaryCalls are both true; what is the probability distribution for Burglary, P(Burglary | JohnCalls = true, MaryCalls = true)? Look back at using the full joint distribution for this purpose: we sum over the hidden variables.

Inference by enumeration (example in the textbook, Figure 14.8):

P(X | e) = α P(X, e) = α Σy P(X, e, y)

P(B | j, m) = α P(B, j, m) = α Σe Σa P(B, e, a, j, m)
P(b | j, m) = α Σe Σa P(b) P(e) P(a | b, e) P(j | a) P(m | a)
P(b | j, m) = α P(b) Σe P(e) Σa P(a | b, e) P(j | a) P(m | a)

P(B | j, m) = α ⟨0.00059224, 0.0014919⟩ ≈ ⟨0.284, 0.716⟩

Another way of looking at it: expand the sums over the hidden variables explicitly,

P(B | j, m) = α [P(B, e, a, j, m) + P(B, e, ¬a, j, m) + P(B, ¬e, a, j, m) + P(B, ¬e, ¬a, j, m)]

which gives the same result, P(B | j, m) ≈ ⟨0.284, 0.716⟩. (See the handout for another example.)
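Inference by enumeration can be sketched end to end. This assumes the standard alarm-network CPT values; the function names `p`, `enumerate_all`, and `query` are illustrative, loosely following the textbook's ENUMERATION-ASK.

```python
def p(var, value, ev):
    """P(var = value | parent values taken from ev)."""
    if var == "B":
        pt = 0.001
    elif var == "E":
        pt = 0.002
    elif var == "A":
        pt = {(True, True): 0.95, (True, False): 0.94,
              (False, True): 0.29, (False, False): 0.001}[(ev["B"], ev["E"])]
    elif var == "J":
        pt = 0.90 if ev["A"] else 0.05
    else:  # "M"
        pt = 0.70 if ev["A"] else 0.01
    return pt if value else 1.0 - pt

ORDER = ["B", "E", "A", "J", "M"]  # parents always precede children

def enumerate_all(variables, ev):
    """Sum the product of CPT entries over all hidden-variable values."""
    if not variables:
        return 1.0
    first, rest = variables[0], variables[1:]
    if first in ev:  # evidence or query variable: single term
        return p(first, ev[first], ev) * enumerate_all(rest, ev)
    return sum(p(first, v, ev) * enumerate_all(rest, {**ev, first: v})
               for v in (True, False))

def query(var, evidence):
    """Posterior distribution of var given evidence, by enumeration."""
    dist = {v: enumerate_all(ORDER, {**evidence, var: v})
            for v in (True, False)}
    alpha = 1.0 / sum(dist.values())
    return {v: alpha * q for v, q in dist.items()}

result = query("B", {"J": True, "M": True})
print(round(result[True], 3), round(result[False], 3))  # 0.284 0.716
```

The recursion mirrors the nested sums Σe Σa in the derivation; the topological ordering of `ORDER` guarantees a node's parents are assigned before its CPT is consulted.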

## Singly vs. Multiply Connected BNs

A BN is singly connected if there is at most one undirected path between any two nodes; it is multiply connected if there is more than one undirected path between some pair of nodes. Querying a multiply connected BN takes time exponential in the total number of CPT entries in the worst case.

## Constructing Bayesian networks

1. Choose an ordering of variables X1, ..., Xn such that root causes come first in the order, then the variables that they influence, and so forth.
2. For i = 1 to n: add Xi to the network, and select parents from X1, ..., Xi-1 such that P(Xi | Parents(Xi)) = P(Xi | X1, ..., Xi-1).

Note: the parents of a node are all of the nodes that influence it. In this way, each node is conditionally independent of its predecessors in the order, given its parents. This choice of parents guarantees:

P(X1, ..., Xn) = Π i=1..n P(Xi | X1, ..., Xi-1)   (chain rule)
             = Π i=1..n P(Xi | Parents(Xi))   (by construction)

How important is the ordering? Suppose we choose the ordering M, J, A, B, E:

- P(J | M) = P(J)? No
- P(A | J, M) = P(A | J)? P(A | J, M) = P(A)?

Working through the ordering M, J, A, B, E:

- P(J | M) = P(J)? No
- P(A | J, M) = P(A | J)? No. P(A | J, M) = P(A)? No
- P(B | A, J, M) = P(B | A)? Yes. P(B | A, J, M) = P(B)? No
- P(E | B, A, J, M) = P(E | A)? No. P(E | B, A, J, M) = P(E | A, B)? Yes

Deciding conditional independence is hard in noncausal directions. (Causal models and conditional independence seem hardwired for humans!) The resulting network is also less compact: 1 + 2 + 4 + 2 + 4 = 13 numbers are needed.

## Summary

Bayesian networks provide a natural representation for (causally induced) conditional independence. Topology + CPTs = a compact representation of the joint distribution. They are generally easy for domain experts to construct.
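The compactness penalty of a bad ordering can be checked by counting CPT rows. A sketch; the `bn_size` helper and the parent dictionaries are my own encoding of the two orderings discussed above.

```python
# Parameter counts for two variable orderings of the burglary domain.
# For Boolean nodes, a node with k parents needs 2**k numbers.
def bn_size(parents):
    """Total CPT rows for a BN given its node -> parent-list mapping."""
    return sum(2 ** len(ps) for ps in parents.values())

# Causal ordering B, E, A, J, M:
causal = {"B": [], "E": [], "A": ["B", "E"], "J": ["A"], "M": ["A"]}
# Noncausal ordering M, J, A, B, E:
noncausal = {"M": [], "J": ["M"], "A": ["J", "M"],
             "B": ["A"], "E": ["A", "B"]}

print(bn_size(causal), bn_size(noncausal))  # 10 13
```

Same domain, same joint distribution, but the noncausal ordering forces extra parents and hence 13 numbers instead of 10.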
