STAT 535 Lecture 4: Graphical representations of conditional independence, Part II: Bayesian Networks. © Marina Meilă


1 A limitation of Markov networks

Between two variables or subsets A, B of V, there can be four combinations of independence statements.

Conditional independence    Marginal independence    Example MRF structure
A ⊥ B | C                   A ⊥ B                    A and B in different connected components
A ⊥̸ B | C                   A ⊥̸ B                    edge A - B
A ⊥ B | C                   A ⊥̸ B                    chain A - C - B
A ⊥̸ B | C                   A ⊥ B                    not representable by an MRF

The table above gives an example of MRF structure for each of the three cases that are representable by MRFs. For the fourth case, an I-map is possible, and this is the graph that contains an edge A - B. Note that this I-map is trivial, since it does not represent the independence A ⊥ B. This is a limitation of MRFs. We will show how serious this limitation is by an example.

Suppose that a variable called burglar alarm (A) becoming true can have two causes: a burglary (B) or an earthquake (E). A reasonable assumption is that burglaries and earthquakes occur independently (B ⊥ E). But, given that the alarm sounds (A = 1), and hearing that an earthquake has taken place
(E = 1), most people will believe that a burglary is less likely to have taken place than if there had been no earthquake (E = 0). Hence E carries information about B when A = 1, and therefore B and E are dependent given A. In other words, it is reasonable to assume that

B ⊥ E,          (1)
B ⊥̸ E | A.      (2)

Markov nets cannot represent this situation.

Exercise. Find other examples of the kind (i) X ⊥ Y and X ⊥̸ Y | Z, i.e. two independent causes which can produce the same effect. Find examples that fit the other three possible combinations of marginal and conditional (in)dependence, i.e. (ii) X ⊥̸ Y and X ⊥ Y | Z, (iii) X ⊥ Y and X ⊥ Y | Z, (iv) X ⊥̸ Y and X ⊥̸ Y | Z. Can you describe them in words?

The case illustrated above is important enough to warrant the introduction of another formalism for encoding independencies in graphs, called d-separation and based on directed graphs.

2 Directed Acyclic Graphs (DAGs)

A Directed Acyclic Graph (DAG) is a directed graph G = (V, E) which contains no directed cycles.

[Figure: two small directed graphs, one labeled "this is a DAG" and one labeled "this is not a DAG".]

A simple acyclicity check is sketched below.
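Because acyclicity is the defining property of a DAG, it is easy to test programmatically. The following Python sketch is an illustration only (the dictionary-of-parent-lists encoding is an assumption, not notation from the notes); it uses Kahn's algorithm, which repeatedly removes nodes all of whose parents have already been removed.

```python
from collections import deque

def is_dag(parents):
    """parents: dict mapping every node to the list of its parents.
    Returns True iff the directed graph (edges parent -> child) has no directed cycle."""
    nodes = set(parents)
    indeg = {v: len(parents[v]) for v in nodes}        # in-degree = number of parents
    children = {v: [] for v in nodes}                  # children lists, derived from parents
    for v in nodes:
        for p in parents[v]:
            children[p].append(v)
    queue = deque(v for v in nodes if indeg[v] == 0)   # nodes with no parents
    removed = 0
    while queue:
        v = queue.popleft()
        removed += 1
        for c in children[v]:
            indeg[c] -= 1
            if indeg[c] == 0:
                queue.append(c)
    return removed == len(nodes)                       # all removed <=> no directed cycle

# B -> A <- E is a DAG; adding the edge A -> B creates a directed cycle.
print(is_dag({"B": [], "E": [], "A": ["B", "E"]}))     # True
print(is_dag({"B": ["A"], "E": [], "A": ["B", "E"]}))  # False
```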
Below is a somewhat more complicated textbook example (the chest clinic example), shown in Figure 1. In this example, the arrows coincide with causal links between variables (e.g. Smoking causes Lung cancer).

[Figure 1: The chest clinic DAG. Nodes: Asia, Smoker, Tuberculosis, Lung cancer, Bronchitis, X-ray, Dyspnoea. Edges: Asia -> Tuberculosis; Smoker -> Lung cancer; Smoker -> Bronchitis; Tuberculosis -> X-ray; Lung cancer -> X-ray; Tuberculosis -> Dyspnoea; Lung cancer -> Dyspnoea; Bronchitis -> Dyspnoea.]

This is not completely accidental: Bayesian networks are particularly fit for representing domains where there are causal relationships. It is therefore useful to think of causal relationships when we try to build a Bayes net that represents a problem. But the formalism we study here is not necessarily tied to causality. One does not have to interpret the arrows as cause-effect relations, and we will not do so.

Terminology:

parent: Asia is a parent of Tuberculosis.
pa(variable): the set of parents of a variable; pa(X-ray) = {Lung cancer, Tuberculosis}.
child: Lung cancer is a child of Smoker.
ancestor: Smoker is an ancestor of Dyspnoea.
descendent: Dyspnoea is a descendent of Smoker.
family: a node and its parents; {Dyspnoea, Tuberculosis, Lung cancer, Bronchitis} are a family.

But perhaps the most important concept in DAGs is the V-structure, which denotes a variable having two parents which are not connected by an edge. The Burglar-Earthquake-Alarm example of the previous section is a V-structure. The code sketch below makes this terminology concrete.
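The following Python sketch (not part of the original notes) encodes the chest clinic DAG as a dictionary of parent sets and derives children, ancestors and V-structures from it. The single-letter variable names follow the abbreviations used later in the notes (Figure 2 and the factorization in Section 4); the encoding itself is an assumption made for the sketch, and later sketches reuse it.

```python
from itertools import combinations

# Chest clinic DAG: node -> set of its parents
pa = {
    "A": set(), "S": set(),          # Asia, Smoker
    "T": {"A"},                      # Tuberculosis
    "L": {"S"}, "B": {"S"},          # Lung cancer, Bronchitis
    "X": {"T", "L"},                 # X-ray
    "D": {"T", "L", "B"},            # Dyspnoea
}

def children(v, pa):
    """Children of v: all nodes that list v among their parents."""
    return {u for u in pa if v in pa[u]}

def ancestors(v, pa):
    """Ancestors of v, including v itself (the set an(v) used in Section 3.3)."""
    result, stack = set(), [v]
    while stack:
        u = stack.pop()
        if u not in result:
            result.add(u)
            stack.extend(pa[u])
    return result

def v_structures(pa):
    """Triples (p1, c, p2): c has two parents p1, p2 not connected by an edge."""
    out = []
    for c in pa:
        for p1, p2 in combinations(sorted(pa[c]), 2):
            if p1 not in pa[p2] and p2 not in pa[p1]:
                out.append((p1, c, p2))
    return out

print(children("S", pa))     # children of Smoker: L and B
print(ancestors("D", pa))    # ancestors of Dyspnoea: D, T, L, B, A, S
print(v_structures(pa))      # (L, X, T) and the three V-structures at D
```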
[Figure: on the left, Burglar (B) -> Alarm (A) <- Earthquake (E), labeled "a V-structure"; on the right, Landslide (L) -> Alarm (A) <- Earthquake (E), with Landslide and Earthquake also connected by an edge, labeled "not a V-structure".]

In Figure 1, (T, X, L), (T, D, L), (T, D, B) are V-structures.

3 d-separation

In a DAG, independence is encoded by the relation of d-separation, defined below.

A ⊥ B | C   if and only if   A is d-separated from B by C.

d-separation: A is d-separated from B by C if all the paths between the sets A and B are blocked by elements of C.

The three cases of d-separation, for a node Z in the interior of a path:

X -> Z -> Y : if Z ∈ C the path is blocked, otherwise open.
X <- Z -> Y : if Z ∈ C the path is blocked, otherwise open.
X -> Z <- Y : if Z or one of its descendents is in C the path is open, otherwise blocked.

The directed Markov property: X ⊥ (non-descendents of X) | pa(X).
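The three blocking cases above can be written directly as a small predicate on a consecutive triple x - z - y of a path. The Python sketch below is an illustration only (it reuses the dictionary-of-parents encoding assumed earlier); a complete d-separation test additionally has to consider all paths between A and B, which the undirected construction of Section 3.3 avoids.

```python
def descendants(v, pa):
    """All descendants of v (not including v itself), from the parent dictionary pa."""
    out = set()
    frontier = {u for u in pa if v in pa[u]}                 # children of v
    while frontier:
        out |= frontier
        frontier = {u for u in pa if pa[u] & frontier} - out
    return out

def triple_blocked(x, z, y, C, pa):
    """Is the path segment x - z - y blocked by the conditioning set C?"""
    if x in pa[z] and y in pa[z]:                            # collider: x -> z <- y
        # open iff z or one of its descendants is observed
        return z not in C and not (descendants(z, pa) & C)
    # chain (x -> z -> y or x <- z <- y) or fork (x <- z -> y)
    return z in C
```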
Theorem. For any DAG G = (V, E) and any probability distribution P_V over V, if P_V satisfies the Directed Markov Property w.r.t. the DAG G, then any d-separation relationship that is true in G corresponds to a true independence relationship in P_V. (In other words, G is an I-map of P_V.)

3.1 The Markov blanket

In a DAG, the Markov blanket of a node X is the union of all parents, children, and other parents of X's children:

Markov blanket(X) = pa(X) ∪ ch(X) ∪ ( ⋃_{Y ∈ ch(X)} pa(Y) \ {X} )

For example, in the DAG in Figure 1, the Markov blanket of L is the set {S (parent), X, D (children), T, B (parents of the children)}, and the Markov blanket of A is {T}. A small code sketch of this computation follows.
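A minimal Python sketch of the Markov blanket formula (an illustration, reusing the parent-dictionary encoding of the chest clinic DAG assumed in Section 2):

```python
def markov_blanket(x, pa):
    """Parents, children, and the children's other parents of node x."""
    ch = {u for u in pa if x in pa[u]}                      # children of x
    co_parents = set().union(*(pa[c] for c in ch)) - {x}    # other parents of x's children
    return pa[x] | ch | co_parents

print(markov_blanket("L", pa))   # S, X, D, T, B (parent, children, parents of the children)
print(markov_blanket("A", pa))   # {'T'}
```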
3.2 Equivalent DAGs

Two DAGs are said to be (likelihood) equivalent if they encode the same set of independencies. For example, the two graphs below are equivalent (encoding no independencies).

[Figure: the two-node graphs A -> B and A <- B.]

If the arrows represented causal links, then the two graphs above would not be equivalent! Another example of equivalence between DAGs was seen in Lecture 3: a Markov chain or an HMM can be represented by a directed graph with forward arrows, as well as by one with backward arrows. (Note, in passing, that an HMM can also be represented as an undirected graph, demonstrating equivalence between a DAG and an MRF.)

Yet another example is below. On the left, the chest clinic DAG. In the middle, an equivalent DAG. On the right, another DAG which is not equivalent to the chest clinic example. [Exercise: verify that the graphs are/are not equivalent by looking at what independence relationships hold in the three graphs.]

[Figure: three DAGs labeled G1, G2, G3.]

Theorem 1 (Chickering). Two DAGs are equivalent iff they have the same undirected skeleton and the same V-structures.

Consequently, we can invert an arrow in a DAG and preserve the same independencies only if that arrow is not part of a V-structure. In other words, the arrow A -> B can be reversed iff for any arc C -> A pointing into A there is an arc C -> B pointing into B, and vice versa; that is, pa(B) = {A} ∪ pa(A). It can be proved that one can traverse the class of all equivalent DAGs by successive arrow reversals. Theorem 1 also suggests a direct equivalence check, sketched below.
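A minimal Python sketch of Chickering's criterion (an illustration, not code from the notes), reusing the parent-dictionary encoding and the v_structures helper from the sketch in Section 2:

```python
def skeleton(pa):
    """The undirected skeleton: all edges with their directions dropped."""
    return {frozenset((u, v)) for v in pa for u in pa[v]}

def equivalent(pa1, pa2):
    """Chickering's criterion: same skeleton and same V-structures."""
    vs1 = {(c, frozenset((p, q))) for p, c, q in v_structures(pa1)}
    vs2 = {(c, frozenset((p, q))) for p, c, q in v_structures(pa2)}
    return skeleton(pa1) == skeleton(pa2) and vs1 == vs2

# A -> B and A <- B: same skeleton, no V-structures, hence equivalent.
print(equivalent({"A": set(), "B": {"A"}}, {"A": {"B"}, "B": set()}))   # True
```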
3.3 d-separation as separation in an undirected graph

Here we show that d-separation in a DAG is equivalent to separation in an undirected graph obtained from the DAG and the variables we are interested in. First, two definitions, whose meaning will become clear shortly.

Moralization is the graph operation of connecting the parents of a V-structure. A DAG is moralized if all nodes that share a child have been connected. After a graph is moralized, all edges, be they original edges or new edges added by moralization, are considered as undirected. If G is a DAG, the graph obtained by moralizing G is denoted by G^m and is called the moral graph of G.

[Figure: moralization in two steps, "marrying the parents" and "dropping the directions".]

For any variable X, the set an(X) denotes the ancestors of X (including X itself). Similarly, if A is a set of nodes, an(A) denotes the set of all ancestors of variables in A:

an(A) = ⋃_{X ∈ A} an(X)

The ancestral graph of a set of nodes A ⊆ V is the graph G_an(A) = (an(A), E') obtained from G by removing all nodes not in an(A).

Now we can state the main result.

Theorem 2. Let A, B, S ⊂ V be three disjoint sets of nodes in a DAG G. Then A and B are d-separated by S in G iff they are separated by S in the moral ancestral graph of A ∪ B ∪ S:

A ⊥ B | S in G   iff   A ⊥ B | S in (G_an(A ∪ B ∪ S))^m.

The intuition is that observing/conditioning on a variable creates a dependence between its parents (if it has any). Moralization represents this link. Now why the ancestral graph? Note that an unobserved descendent cannot produce dependencies between its ancestors (i.e. it cannot open a path in a directed graph). So we can safely remove all descendents of A, B that are not
in S. The descendents of S itself that are not in A, B, and all the nodes that are not ancestors of A, B, S, can be removed by a similar reasoning. Hence, first the graph is pruned, then dependencies between parents are added by moralization. Now directions on edges can be removed, because DAGs are just like undirected graphs if it weren't for the V-structures, and we have already dealt with those.

The Theorem immediately suggests an algorithm for testing d-separation using undirected graph separation.

1. Remove all nodes not in an(A ∪ B ∪ S) to get G_an(A ∪ B ∪ S).
2. Moralize the remaining graph to get (G_an(A ∪ B ∪ S))^m.
3. Remove all nodes in S from (G_an(A ∪ B ∪ S))^m to get G'.
4. Test if there is a path between A and B in G'.

These four steps are sketched in code below.
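A minimal Python sketch of the four steps (an illustration under the same parent-dictionary encoding as before; it reuses the ancestors helper and the chest clinic dictionary pa from the sketch in Section 2, and represents the undirected graphs by adjacency sets):

```python
def d_separated(A, B, S, pa):
    """Test whether A is d-separated from B by S in the DAG given by parent sets pa."""
    A, B, S = set(A), set(B), set(S)

    # 1. ancestral graph: keep only the ancestors of A ∪ B ∪ S
    keep = set()
    for v in A | B | S:
        keep |= ancestors(v, pa)                 # ancestors() includes v itself

    # 2. moralize: undirected edges = original edges + links between co-parents
    adj = {v: set() for v in keep}
    for v in keep:
        parents = pa[v] & keep
        for p in parents:
            adj[v].add(p)
            adj[p].add(v)
        for p in parents:
            for q in parents:
                if p != q:
                    adj[p].add(q)

    # 3. remove the conditioning nodes S
    keep -= S
    adj = {v: adj[v] - S for v in keep}

    # 4. look for a path from A to B in the remaining undirected graph
    frontier, visited = set(A), set()
    while frontier:
        v = frontier.pop()
        visited.add(v)
        if v in B:
            return False                         # path found: not d-separated
        frontier |= adj[v] - visited
    return True                                  # no path: A and B are d-separated by S

# Queries on the chest clinic DAG:
print(d_separated({"A"}, {"S"}, set(), pa))      # True:  Asia and Smoker are independent
print(d_separated({"A"}, {"S"}, {"D"}, pa))      # False: conditioning on Dyspnoea opens a path
```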
For example, these steps can be carried out on a d-separation query in the chest clinic DAG; the figure below illustrates them.

[Figure: four panels illustrating the algorithm on the chest clinic DAG: show nodes of interest; ancestral graph; moralize; eliminate conditioning nodes.]

4 Factorization

Now we construct joint probability distributions that have the independencies specified by a given DAG. Assume the set of discrete variables is V = {X_1, X_2, ..., X_n} and that we are given a DAG G = (V, E). The goal is to construct the family of distributions that are represented by the graph. This family is given by

P(X_1, X_2, ..., X_n) = ∏_{i=1}^{n} P(X_i | pa(X_i))        (3)

In the above, P(X_i | pa(X_i)) represents the conditional distribution of variable X_i given its parents. A sketch of evaluating (3) directly is given below.
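As an illustration only (not from the notes), equation (3) can be evaluated once each conditional distribution is stored as a table indexed by the value of the variable and the values of its parents; the dictionary encoding and the numbers in the example below are assumptions made for the sketch.

```python
def joint_prob(assignment, parents, cpt):
    """P(x_1, ..., x_n) = product over i of P(x_i | pa(x_i)), equation (3).

    assignment: dict variable -> value
    parents:    dict variable -> tuple of parent variables (in a fixed order)
    cpt:        dict variable -> {(value, parent values...): probability}
    """
    p = 1.0
    for v, pars in parents.items():
        key = (assignment[v],) + tuple(assignment[u] for u in pars)
        p *= cpt[v][key]
    return p

# Tiny example: B -> A <- E with binary variables (probabilities are made up).
parents_ex = {"B": (), "E": (), "A": ("B", "E")}
cpt_ex = {
    "B": {(1,): 0.01, (0,): 0.99},
    "E": {(1,): 0.02, (0,): 0.98},
    "A": {(1, 1, 1): 0.95, (0, 1, 1): 0.05,
          (1, 1, 0): 0.90, (0, 1, 0): 0.10,
          (1, 0, 1): 0.30, (0, 0, 1): 0.70,
          (1, 0, 0): 0.01, (0, 0, 0): 0.99},
}
print(joint_prob({"B": 0, "E": 0, "A": 0}, parents_ex, cpt_ex))   # 0.99 * 0.98 * 0.99
```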
Because the factors P(X_i | pa(X_i)) involve a variable and its parents, that is, nodes closely connected in the graph structure, we often call them local probability tables (or local distributions). Note that the parameters of each local table are (functionally) independent of the parameters in the other tables. We can choose them separately, and the set of all parameters of all the conditional probability distributions parametrizes the family of distributions for which the DAG G is an I-map.

If a distribution can be written in the form (3) we say that the distribution factors according to the graph G. A joint distribution that factors according to some DAG G is called a Bayes net. Note that any distribution is a Bayes net in a trivial way: by taking G to be the complete graph, with no missing edges. In general, we want a Bayes net to be as sparse as possible, because representing independences explicitly has many computational advantages.

[Figure 2: The chest clinic DAG, with shorter variable names: A (Asia), S (Smoker), T (Tuberculosis), L (Lung cancer), B (Bronchitis), X (X-ray), D (Dyspnoea).]

The Bayes net described by this graph is

P(A, S, T, L, B, X, D) = P(A) P(S) P(T | A) P(L | S) P(B | S) P(X | T, L) P(D | T, L, B).

A way of obtaining this decomposition starting from the graph is:

1. Construct a topological ordering of the variables.
A topological ordering is an ordering of the variables in which the parents of each variable always come before the variable itself. A, S, T, L, B, X, D is a topological ordering for the graph above. This ordering is not unique; for example, A, S, B, T, L, X, D is another valid topological ordering. In general, there can be exponentially many topological orderings for a given DAG.

2. Apply the chain rule following the topological ordering.

P(A, S, T, L, B, X, D) = P(A) P(S | A) P(T | A, S) P(L | A, S, T) P(B | A, S, T, L) P(X | A, S, T, L, B) P(D | A, S, T, L, B, X)

3. Use the directed Markov property to simplify the factors:

P(S | A) = P(S),  P(T | A, S) = P(T | A),  P(L | A, S, T) = P(L | S),  P(B | A, S, T, L) = P(B | S),  etc.

Let us now look at the number of parameters in such a model. Assume that in the example above all variables are binary. Then the number of unconstrained parameters in the model is

1 + 1 + 2 + 2 + 2 + 4 + 8 = 20.

The number of parameters in a 7-way contingency table is 2^7 - 1 = 127, so we are saving 107 parameters. As we shall see, there are also other computational advantages to joint distribution representations of this form. A short check of this count is sketched below.
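A minimal sketch of the parameter count (an illustration; it assumes all variables are binary and reuses the chest clinic parent dictionary pa from the sketch in Section 2):

```python
def num_params_binary(pa):
    """Unconstrained parameters of a Bayes net over binary variables:
    each node contributes one free parameter per configuration of its parents."""
    return sum(2 ** len(parents) for parents in pa.values())

n = len(pa)                                      # 7 variables
print(num_params_binary(pa))                     # 1 + 1 + 2 + 2 + 2 + 4 + 8 = 20
print(2 ** n - 1)                                # full joint table: 2^7 - 1 = 127
print(2 ** n - 1 - num_params_binary(pa))        # parameters saved: 107
```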