An algorithm to find a perfect map for graphoid structures

Marco Baioletti 1, Giuseppe Busanello 2, Barbara Vantaggi 2

1 Dip. Matematica e Informatica, Università di Perugia, Italy, e-mail: baioletti@dipmat.unipg.it
2 Dip. Metodi e Modelli Matematici, Università La Sapienza, Roma, Italy, e-mail: {busanello, vantaggi}@dmmm.uniroma1.it

Abstract. We provide a necessary and sufficient condition for the existence of a perfect map representing an independence model, and we give an algorithm for checking this condition and drawing a perfect map, when it exists.

Key words: Conditional independence models, Inferential rules, Acyclic directed graphs, Perfect map.

1 Introduction

Graphical models [11, 12, 14-16, 20] play a fundamental role in probability and multivariate statistics, and they have been deeply developed as a tool for representing conditional independence models. The usefulness of graphical models is not limited to the probabilistic setting; in fact, they have been extended to other frameworks (see, e.g., [6-8, 13, 17]). Among graphical structures, we consider graphoids, which are induced, for example, by a strictly positive probability under the classical notion of independence [9].

A relevant problem is to represent a set J of conditional independence relations, provided by an expert, by a directed acyclic graph (DAG), where independencies are encoded by d-separation. Such a graph is called a perfect map for J (see [14]). A DAG gives a very compact and human-readable representation; unfortunately, it is known that there exist sets of independencies which admit no perfect map. The problem of the existence of a perfect map has been studied by many authors (see for instance [14]), who provide only partial answers in terms of necessary or sufficient conditions.

In [2] we have introduced a sufficient condition for the existence of a perfect map in terms of the existence of a certain ordering among the random variables, and we have described the procedure BN-draw, which builds the corresponding independence map given an ordering. The sufficient condition, as well as BN-draw, uses the fast closure J* of J [1]. From J* it is possible to solve the implication problem for J and to extract independence maps with fast algorithms. The set J* can be computed in a reasonable amount of time, as shown in [1, 3], and it is considerably smaller than the complete closure J̄ of J with respect to the graphoid properties, even though it carries the same information as J̄.

A similar construction has been given in [15], essentially for semi-graphoids, and used in [10] to describe a necessary condition for the existence of a perfect map for semi-graphoid structures.

In this paper we provide a necessary and sufficient condition for the existence of a perfect map for graphoid structures. This condition relies on some constraints among the triples of the set J* and their components. Moreover, we give an algorithm to check the existence of a perfect map based on the provided condition (an implementation of the proposed algorithms is available at http://www.dmi.unipg.it/baioletti/graphoids). In the positive case, the algorithm returns a relevant perfect map.

2 Graphoid

Let S̃ = {Y_1, ..., Y_n} be a finite non-empty set of variables and S = {1, ..., n} the set of indices associated with S̃. Furthermore, S^(3) is the set of all (ordered) triples (A, B, C) of disjoint subsets of S, such that A and B are not empty. A conditional independence model I is a suitable subset of S^(3). We refer to a graphoid structure (S, I), with I a ternary relation on the set S satisfying the following properties (where A, B, C, D are pairwise disjoint subsets of S):

G1 if (A, B, C) ∈ I, then (B, A, C) ∈ I (Symmetry);
G2 if (A, B ∪ C, D) ∈ I, then (A, B, D) ∈ I (Decomposition);
G3 if (A, B ∪ C, D) ∈ I, then (A, B, C ∪ D) ∈ I (Weak Union);
G4 if (A, B, C ∪ D) ∈ I and (A, C, D) ∈ I, then (A, B ∪ C, D) ∈ I (Contraction);
G5 if (A, B, C ∪ D) ∈ I and (A, C, B ∪ D) ∈ I, then (A, B ∪ C, D) ∈ I (Intersection).

Given a triple θ = (A, B, C), we denote by θ^T = (B, A, C) the transpose triple obtained by applying G1 to θ.

Given a set J of conditional independence statements, a relevant problem about graphoids is to find efficiently the closure of J with respect to G1-G5,

  J̄ = {θ ∈ S^(3) : θ is obtained from J by G1-G5}.

A related problem, called implication, concerns establishing whether a triple θ ∈ S^(3) can be derived from J. The implication problem can be easily solved once the closure has been computed, but the computation of the closure is infeasible because its size is exponentially larger than the size of J. In [1] we have described how it is possible to compute a smaller set of triples having the same information as the closure. Now we recall some definitions and properties introduced and studied in [1], which are used in the rest of the paper.

Given a pair of triples θ_1, θ_2 ∈ S^(3), we say that θ_1 is generalized-included in θ_2 (briefly, g-included), in symbols θ_1 ⊑ θ_2, if θ_1 can be obtained from θ_2 by a finite number of applications of G1, G2 and G3.
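To make the closure J̄ and the inferential rules concrete, the following minimal Python sketch applies G1-G5 naively until a fixpoint is reached. It assumes that triples are encoded as tuples (A, B, C) of frozensets of indices (an encoding chosen here only for illustration, not taken from [1]); it is exponential on all but tiny inputs, which is precisely why the fast closure J* is used instead.

    from itertools import chain, combinations

    def subsets(s):
        """All subsets of a frozenset, as frozensets."""
        s = list(s)
        return [frozenset(c) for c in
                chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

    def g1(t):
        """Symmetry."""
        A, B, C = t
        return {(B, A, C)}

    def g2_g3(t):
        """Decomposition and weak union: every triple derivable from t by G2/G3."""
        A, M, D = t
        out = set()
        for B in subsets(M):
            if not B:
                continue
            C = M - B
            out.add((A, B, D))        # G2: drop C
            out.add((A, B, D | C))    # G3: move C into the conditioning set
        return out

    def g4_g5(t1, t2):
        """Contraction and intersection applied to an ordered pair of triples."""
        out = set()
        A1, B1, CD = t1
        A2, C2, D2 = t2
        if A1 != A2 or (B1 & C2):
            return out
        # G4: (A, B, C∪D) and (A, C, D) give (A, B∪C, D)
        if CD == C2 | D2:
            out.add((A1, B1 | C2, D2))
        # G5: (A, B, C∪D) and (A, C, B∪D) give (A, B∪C, D)
        if C2 <= CD and D2 == B1 | (CD - C2):
            out.add((A1, B1 | C2, CD - C2))
        return out

    def closure(J):
        """Naive closure of J under G1-G5 (for tiny examples only)."""
        closed = set(J)
        while True:
            new = set()
            for t in closed:
                new |= g1(t) | g2_g3(t)
            for t1 in closed:
                for t2 in closed:
                    new |= g4_g5(t1, t2)
            if new <= closed:
                return closed
            closed |= new

For instance, starting from J = {({1}, {2}, {3}), ({1}, {3}, ∅)}, the computed fixpoint also contains the contracted triple ({1}, {2, 3}, ∅).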

Proposition 1. Given θ_1 = (A_1, B_1, C_1) and θ_2 = (A_2, B_2, C_2), then θ_1 ⊑ θ_2 if and only if the following conditions hold:

(i) C_2 ⊆ C_1 ⊆ A_2 ∪ B_2 ∪ C_2;
(ii) either A_1 ⊆ A_2 and B_1 ⊆ B_2, or A_1 ⊆ B_2 and B_1 ⊆ A_2.

Generalized inclusion is strictly related to the concept of dominance [15]. In [1] we introduce a particular subset J* of J̄ (called fast closure), which can be obtained from J̄ by discarding the non-maximal triples τ ∈ J̄, i.e. those g-included in some other triple of J̄. Moreover, in [1] we describe and compare two different algorithms to compute J*, called FC2 and FC1. In particular, FC2 iteratively uses two inferential rules G4* and G5*, related to G4 and G5 and also introduced in [1], and discards non-maximal triples, until the set of independence relations is closed. FC1 has a similar structure, but uses a single inference rule U, which corresponds to computing at once the fast closure of a pair of triples. On the basis of some considerations and experimental results (see also [3]), FC1 appears to be faster than FC2.

3 Graphs

In the following, we refer to the usual graph definitions (see [14]): we denote by G = (U, E) a graph with set U of nodes and oriented arcs E (ordered pairs of nodes). In particular, we consider directed graphs having no cycles, i.e. acyclic directed graphs (DAGs). As usual, we denote by pa(u), for any u ∈ U, the parent set of u.

Definition 1. If A, B and C are three disjoint subsets of nodes in a DAG G, then C is said to d-separate A from B, denoted (A, B, C)_G, if for each non-directed path between a node in A and a node in B there exists a node x in the path which satisfies one of the following two conditions:

1. x is a collider (i.e. both edges of the path incident to x point to x), x ∉ C and no descendant of x is in C;
2. x is not a collider and belongs to C.

In order to study the representation of a conditional independence model, we need to distinguish between dependence maps and independence maps, since there are conditional independence models that cannot be completely represented by a DAG (see e.g. [12, 14]).

In the following we denote by J (and analogously by J̄ and J*) both a set of triples and a set of conditional independence relations; obviously, the triples are defined on the set S and the independence relations on S̃.

Definition 2. Let J be a set of conditional independence relations on S̃. A DAG G = (S, E) is a dependence map (briefly a D-map) if, for all triples (A, B, C) ∈ S^(3),

  (A, B, C) ∈ J ⇒ (A, B, C)_G.

Moreover, G = (S, E) is an independence map (briefly an I-map) if, for all triples (A, B, C) ∈ S^(3),

  (A, B, C)_G ⇒ (A, B, C) ∈ J.

G is a minimal I-map of J if, deleting any arc, G is no longer an I-map. G is said to be a perfect map (briefly a p-map) if it is both an I-map and a D-map.
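Proposition 1 reduces g-inclusion to a handful of set comparisons, and the d-separation of Definition 1 can equivalently be tested on the moralized ancestral graph. The sketch below (Python, reusing the frozenset-encoded triples of the previous sketch; the representation of a DAG as a dictionary mapping each node to its set of parents is an assumption made here for illustration) shows both tests.

    def g_included(t1, t2):
        """Proposition 1: test θ1 ⊑ θ2 without applying any inference rule."""
        A1, B1, C1 = t1
        A2, B2, C2 = t2
        if not (C2 <= C1 <= (A2 | B2 | C2)):
            return False
        return (A1 <= A2 and B1 <= B2) or (A1 <= B2 and B1 <= A2)

    def d_separates(dag, A, B, C):
        """(A, B, C)_G for a DAG given as node -> set of parents, using the
        ancestral moral graph criterion (equivalent to Definition 1)."""
        A, B, C = set(A), set(B), set(C)
        # 1. keep only A ∪ B ∪ C and their ancestors
        relevant, stack = set(), list(A | B | C)
        while stack:
            v = stack.pop()
            if v not in relevant:
                relevant.add(v)
                stack.extend(dag.get(v, ()))
        # 2. moralize: link each node to its parents and marry co-parents
        adj = {v: set() for v in relevant}
        for v in relevant:
            ps = list(dag.get(v, ()))
            for p in ps:
                adj[v].add(p)
                adj[p].add(v)
            for i in range(len(ps)):
                for j in range(i + 1, len(ps)):
                    adj[ps[i]].add(ps[j])
                    adj[ps[j]].add(ps[i])
        # 3. C d-separates A from B iff removing C disconnects A from B
        seen, stack = set(), list(A)
        while stack:
            v = stack.pop()
            if v in seen or v in C:
                continue
            if v in B:
                return False
            seen.add(v)
            stack.extend(adj[v] - seen - C)
        return True

For the collider a → c ← b, for example, d_separates({'a': set(), 'b': set(), 'c': {'a', 'b'}}, {'a'}, {'b'}, set()) holds, while conditioning on {'c'} destroys the separation.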

The next definition and theorem [14] provide a tool to build a DAG given an independence model J.

Definition 3. Let J be an independence model defined on S and let π = ⟨π_1, ..., π_n⟩ be an ordering of the elements of S. The boundary strata of J, relative to π, is an ordered set of subsets ⟨B^(1), B^(2), ..., B^(m)⟩ of S (with m ≤ n), such that each B^(i) is a minimal set satisfying B^(i) ⊆ S^(i) = {π_1, ..., π_(i-1)} and γ_i = ({π_i}, S^(i) \ B^(i), B^(i)) ∈ J. The DAG obtained by setting each B^(i) as the parent set of the node π_i is called the boundary DAG of J, relative to π. The triple γ_i introduced above is known as a basic triple.

The next theorem is an extension of Verma's theorem [18], stated for conditional independence relations (see [14]).

Theorem 1. Let J be an independence model closed with respect to the semi-graphoid properties. If G is a boundary DAG of J, relative to any ordering π, then G is a minimal I-map of J.

Theorem 1 helps to build a DAG for an independence model J (induced by a probability P), given an ordering π on S. It is well known (see [14]) that the boundary DAG of J relative to π is a minimal I-map. In the following, given an ordering π on S, G_π denotes the corresponding I-map of J.
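To make Definition 3 concrete, the following Python sketch builds a boundary DAG by brute force, assuming the model is given as an already closed set of triples in the encoding used above and that the indices are sortable (e.g. integers). The exponential search over subsets of the predecessors is exactly what the fast-closure machinery recalled in the next section avoids.

    from itertools import combinations

    def boundary_dag(J_closed, order):
        """Definition 3: for each pi_i choose a minimal B ⊆ S^(i) such that
        ({pi_i}, S^(i) \ B, B) belongs to the closed model J_closed.
        Returns the DAG as a node -> parent-set dictionary."""
        parents = {}
        for i, x in enumerate(order):
            pred = frozenset(order[:i])      # S^(i) = {pi_1, ..., pi_{i-1}}
            chosen, found = pred, False      # fallback: all predecessors
            for r in range(len(pred)):       # proper subsets, smallest first
                for B in map(frozenset, combinations(sorted(pred), r)):
                    gamma = (frozenset({x}), pred - B, B)
                    if gamma in J_closed:
                        chosen, found = B, True
                        break
                if found:
                    break
            parents[x] = set(chosen)
        return parents

A minimum-cardinality B passing the membership test is in particular a minimal one, which is all Definition 3 asks for; by Theorem 1 the resulting DAG is then a minimal I-map of the closed model.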

4 Perfect map

In [2] we have introduced some sufficient conditions for the existence of a perfect map, given the fast closure J*, and we have described the algorithm Backtrack, which checks these conditions and, in the affirmative case, builds a perfect map. Since these conditions are only sufficient, this algorithm can fail even in cases where a perfect map exists. In [4] we have improved the previous result by introducing conditions which, under a suitable hypothesis, are necessary and sufficient for the existence of a perfect map. This partial characterization relies on some constraints among the triples of the set J* and their components.

In this paper we provide a necessary and sufficient condition which is valid also in the case where the previously cited hypothesis fails (for the proof see [5]). This condition fully characterizes the orderings from which a perfect map can be built. An algorithm able to check this condition and, in the positive case, to find a perfect map is described in the next section.

In the following, we review the procedure BN-draw introduced in [2], which builds the minimal I-map G_π of J (see Definition 2) given the fast closure J* of J and an ordering π on S. This procedure is used by the algorithms described in [2] and in this paper. Note that, given the fast closure J*, it is not possible to apply the standard procedure (see [11, 14]), described in Definition 3, to draw an I-map. In fact, the basic triples related to an arbitrary ordering π might not be elements of J*, but might only be g-included in some triples of J* (see the Example in [2]). However, in [2] we have shown that it is easy to find the basic triples in the fast closure by using the following result, where, as in the rest of the paper, S^(x) denotes the set of elements of S preceding x ∈ S with respect to a given ordering π.

Proposition 2. Let J be a set of independence relations on S, J* its fast closure and π an ordering on S. For each x ∈ S, the set

  B_x = {({x}, B, C) ∈ S^(3) : B ∪ C = S^(x), ∃ θ ∈ J* with ({x}, B, C) ⊑ θ}

is non-empty if and only if the basic triple γ_x = ({x}, S^(x) \ B^(x), B^(x)) exists; in this case γ_x coincides with the unique maximal triple of B_x.

In this paper we describe a new version of BN-draw which uses the following operation. For each θ = (A, B, C) ∈ S^(3), let X = A ∪ B ∪ C; for any x ∈ S and P ⊆ S define

  Π(θ, P, x) =  P ∩ (A ∪ C)   if C ⊆ P ⊆ X and x ∈ A,
                P ∩ (B ∪ C)   if C ⊆ P ⊆ X and x ∈ B,
                P             otherwise.

Algorithm 1 The set of parents of x
  function PARENTS(x, P, K)
    pa ← P
    for all θ ∈ K do
      p ← Π(θ, P, x)
      if p ⊂ pa then pa ← p
    end for
    return pa

The procedure BN-draw calls, for each π_i, the function PARENTS and uses its result as the parent set of π_i. Given π, BN-draw builds the minimal I-map G_π in linear time with respect to the cardinality m of J* and the number n of variables.
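The operator Π and Algorithm 1 translate almost literally into Python. A possible sketch, again over frozenset-encoded triples and with the resulting I-map returned as a node -> parent-set dictionary (a representation chosen here, not prescribed by the paper), is the following; it also anticipates the BN-draw procedure given below as Algorithm 2.

    def project(theta, P, x):
        """The operator Π(θ, P, x) defined above."""
        A, B, C = theta
        X = A | B | C
        if C <= P <= X:
            if x in A:
                return P & (A | C)
            if x in B:
                return P & (B | C)
        return P

    def parents_of(x, P, K):
        """Algorithm 1: candidate parent set of x given its predecessors P
        and the fast closure K (a collection of triples)."""
        pa = frozenset(P)
        for theta in K:
            p = project(theta, frozenset(P), x)
            if p < pa:               # strict inclusion: keep the smallest set
                pa = p
        return pa

    def bn_draw(order, K):
        """Algorithm 2: the minimal I-map G_pi as a node -> parent-set map."""
        pa = {order[0]: frozenset()}
        P = frozenset()
        for i in range(1, len(order)):
            P = P | {order[i - 1]}
            pa[order[i]] = parents_of(order[i], P, K)
        return pa

The dictionary returned by bn_draw can be fed directly to the d_separates sketch of Section 3 to inspect, on small examples, which independencies the map encodes.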

Algorithm 2 DAG from J* given an ordering π of S
  function BN-draw(n, π, J*)
    P ← ∅
    G ← a graph with S as vertex set and no edges
    for i ← 2 to n do
      P ← P ∪ {π_(i-1)}
      pa ← PARENTS(π_i, P, J*)
      draw an arc in G from each index in pa to π_i
    end for
    return G

In fact, BN-draw is based on the function PARENTS, which computes the set of parents of a given variable in O(m) steps. In each step some set operations must be executed; this can be performed efficiently by using a compact representation for sets (e.g., bit vectors). The memory needed by BN-draw is almost exclusively used to store the fast closure (see [4]).

The introduction of the function Π is important also for the definition of the necessary and sufficient condition for the existence of a p-map.

Theorem 2. A set J is representable by a p-map if and only if there exists an ordering π such that, for each θ = (A, B, C) ∈ J* with X = A ∪ B ∪ C, the following conditions hold:

C1 for each c ∈ C such that S^(c) ∩ A ≠ ∅ and S^(c) ∩ B ≠ ∅, there exists a triple θ_c ∈ J* such that Π(θ_c, S^(c), c) ∩ A = ∅ or Π(θ_c, S^(c), c) ∩ B = ∅;

C2 for each a ∈ A such that S^(a) ∩ B ≠ ∅ or S^(a) ∩ (S \ X) ≠ ∅, there exists a triple θ_a ∈ J* such that Π(θ_a, S^(a), a) ∩ [B ∪ (S \ X)] = ∅;

C3 for each b ∈ B such that S^(b) ∩ A ≠ ∅ or S^(b) ∩ (S \ X) ≠ ∅, there exists a triple θ_b ∈ J* such that Π(θ_b, S^(b), b) ∩ [A ∪ (S \ X)] = ∅;

C4 for each c ∈ C such that S^(c) ∩ (S \ X) ≠ ∅, there exists a triple θ_c ∈ J* such that Π(θ_c, S^(c), c) ∩ (S \ X) = ∅.

Proof. We give a sketch of the proof; for the complete proof see [5].

(⇒) Suppose that G_π is a p-map for J; we prove that π satisfies condition C1 (the other conditions follow similarly). Let θ = (A, B, C) be in J*. If C1 were not satisfied, there would exist an element c ∈ C such that S^(c) ∩ A ≠ ∅ and S^(c) ∩ B ≠ ∅ and, for any θ' ∈ J*, Π(θ', S^(c), c) ∩ A ≠ ∅ and Π(θ', S^(c), c) ∩ B ≠ ∅. Hence, there exist α ∈ pa(c) ∩ A and β ∈ pa(c) ∩ B, so the path α → c ← β would not be blocked by C. This is absurd, since A is d-separated from B by C.

(⇐) Conditions C1-C4 imply that, for each x ∈ X, pa(x) ⊆ X. Let ρ = ⟨u_1, ..., u_t⟩ be a path between a node in A and a node in B, and consider j = max{i : u_i ∈ A} and l = min{i : u_i ∈ B}. Then j + 1 ≤ l - 1, otherwise there would be an element of A having parents in B or vice versa. If u_(j+1) ∈ pa(u_j), then u_(j+1) ∈ C and, since it is not a collider, it blocks ρ. Similarly if u_(l-1) ∈ pa(u_l). Now suppose that u_(j+1) ∈ ch(u_j) and u_(l-1) ∈ ch(u_l), and let r ∈ {j, ..., l} be such that every other u_i (i = j, ..., l) precedes u_r according to π.
u r. Thus, u r is a collider. If u r C, then j + 1 = r = l 1 cannot be, otherwise u r would have parents both in A and in B. So, u r 1 or u r+1 is a parent of u r, belongs to C and blocks ρ. Otherwise, no descendent of u r belongs to C, so u r blocks ρ. The conditions C1 C4 are not so easy to check from the computational point of view, because they require for each triple in J and for x X to verify some constraints and, when some of them do not hold, a suitable triple in J needs to be found. In the worst case, this process requires O(m 2 ) steps, for each possible ordering. In the next section we will describe a more efficient way of achieving the same result. 5 The algorithm In this section we show how to use Theorem 2 to check whether J is representable by a graph, and in the affirmative case to find a perfect map. The main procedure is REPRESENT where [ ] denotes an empty sequence Algorithm 3 Main function for representability function REPRESENT(J ) PREPROCESS(J ) return SEARCH([ ], 1, S, J ) of integers. The function PREPROCESS will be described in the following. The recursive function SEARCH incrementally tries to build an ordering π satisfying conditions C1 C4 of Theorem 2. It returns the element if it fails into finding such an ordering. At the i th recursive call it attempts to fix the i th element in π, by selecting each of the remaining variables. For each possible variable x, the procedure CHECK CONDS checks whether the conditions C1 C4 are not violated by setting π i as x. In the positive case, it calls itself until a complete ordering is obtained. If no variable can be set at the i th place of π, then the recursive call fails and a revision of the previously chosen variables is performed (backtracking). To check whether the choice of π i as x is correct we must verify whether the conditions C1 C4 are satisfied for all the triples in which x appears. Note that we know all the variables preceding x, in fact S (x) is exactly the set {π 1, π 2,..., π i 1 }. Hence, it is possible to compute the set of parents Q of x in the graph candidate to be a perfect map. Let θ = (A, B, C) be a triple containing x. If x appears in C, then only conditions C1 and C4 must be checked. Let us see how to handle condition C1. It basically requires that if P intersects both A and B, there must exist a triple τ K such that Π(τ, S (x), x) does not intersect both A and B. 7

Algorithm 4 Backtracking procedure
  function SEARCH(π, i, V, K)
    if V = ∅ then
      return BN-draw(π, K)
    else
      for all x ∈ V do
        π_i ← x
        if CHECK-CONDS(π, i, K) then
          G ← SEARCH(π, i + 1, V \ {x}, K)
          if G ≠ ⊥ then return G
        end if
      end for
      return ⊥
    end if

But, since we know that the set Q of parents of x is the smallest set among the sets Π(τ, S^(x), x), for τ ∈ K, it is sufficient to check that Q does not intersect A and B at the same time.

For conditions C2, C3 and C4 the situation is much easier. In fact, before starting the search process, we can compute, by means of the function PREPROCESS, for each x ∈ S the set NP(x) of non-parents, i.e. those elements of S which cannot be parents of x without violating one of the conditions C2, C3 or C4. Hence, to check the above mentioned conditions it is sufficient to verify that Q does not intersect NP(x). Unfortunately, this preprocessing cannot work for condition C1.

Algorithm 5 Preprocessing for conditions C2-C4
  function PREPROCESS(K)
    for all x ∈ S : NP(x) ← ∅
    for all θ = (A, B, C) ∈ K do
      X ← A ∪ B ∪ C
      R ← S \ X
      for all x ∈ A : NP(x) ← NP(x) ∪ B ∪ R
      for all x ∈ B : NP(x) ← NP(x) ∪ A ∪ R
      for all x ∈ C : NP(x) ← NP(x) ∪ R
    end for

The cost of the entire procedure can be estimated as follows. Recall that n is the number of variables and m is the cardinality of J*. The function CHECK-CONDS requires at most O(m) steps. The number of steps of SEARCH is in the worst case exponential in n, but backtracking can hopefully perform an early pruning of unpromising orderings, so as to avoid many useless computation steps.
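For completeness, here is a compact Python transcription of PREPROCESS and SEARCH, together with a CHECK-CONDS in the spirit of Algorithm 6 below and the top-level REPRESENT. It reuses parents_of, bn_draw and violates_c1 from the earlier sketches, adopts None as the failure value ⊥, and is only a reading aid under those assumptions, not the authors' implementation.

    def preprocess(K, S):
        """Algorithm 5: NP(x) collects the elements that can never be parents
        of x without violating C2, C3 or C4."""
        NP = {x: set() for x in S}
        for A, B, C in K:
            R = S - (A | B | C)
            for x in A: NP[x] |= B | R
            for x in B: NP[x] |= A | R
            for x in C: NP[x] |= R
        return NP

    def check_conds(pi, K, NP):
        """Conditions C1-C4 for the last variable placed in the partial ordering pi."""
        x, P = pi[-1], frozenset(pi[:-1])
        Q = parents_of(x, P, K)
        if NP[x] & Q:                      # C2, C3, C4 via the non-parents
            return False
        return not violates_c1(x, Q, K)    # C1 on the candidate parents Q

    def search(pi, V, K, NP):
        """Algorithm 4: backtracking over the not yet placed variables V."""
        if not V:
            return bn_draw(pi, K)          # complete ordering: draw the DAG
        for x in V:
            if check_conds(pi + [x], K, NP):
                G = search(pi + [x], V - {x}, K, NP)
                if G is not None:
                    return G
        return None                        # no extension works: backtrack

    def represent(K, S):
        """Algorithm 3: a p-map candidate for the fast closure K, or None."""
        S = frozenset(S)
        return search([], set(S), K, preprocess(K, S))

By Theorem 2, on a correct fast closure this sketch should return a DAG exactly when a p-map exists and None otherwise; the heuristic orderings of V mentioned in the conclusions would simply replace the plain "for x in V" loop.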

Algorithm 6 Checking conditions C1-C4
  function CHECK-CONDS(π, i, K)
    P ← π[1, ..., i - 1]
    x ← π_i
    Q ← PARENTS(x, P, K)
    if NP(x) ∩ Q ≠ ∅ then
      return FALSE
    end if
    for all θ = (A, B, C) ∈ K do
      if (x ∈ C) ∧ (Q ∩ A ≠ ∅) ∧ (Q ∩ B ≠ ∅) then return FALSE
    end for
    return TRUE

A great impact, as in other backtracking procedures, is given by the order in which the variables are chosen in the instruction "for all x ∈ V". We will discuss this point in the conclusions. Finally, note that SEARCH can avoid calling BN-draw altogether by storing, for each x ∈ S, the sets Q computed in the function CHECK-CONDS.

6 Conclusions

We have provided a necessary and sufficient condition for the existence of a perfect map representing a set of conditional independence relations, and we have provided an algorithm which finds a perfect map when it exists.

This algorithm can be improved in many ways. First of all, by using suitable data structures we can reduce the time spent searching for the variables occurring in a set of triples, for instance by representing J* as a bipartite graph, where each variable is linked to the triples in which it appears and each triple is linked to the variables it contains. Second, we will investigate the use of some heuristic rules to help the procedure SEARCH. For instance, well known CSP techniques, like fail-first or min-conflicts, could be used to order the variables and to reduce the number of attempts. A simple way to obtain a sort of fail-first heuristic is to choose the variables in decreasing order with respect to the cardinality of their corresponding NP(x). Another useful CSP technique could be non-chronological backtracking, in which the cause of a failure is detected and all the choices which led to the failure are undone. Moreover, another technique is learning, in which forbidden ordering constraints are learned from the failures. Third, we could introduce a further preprocessing phase, in which it would be possible to deduce, from the triples of the fast closure, a list of impossible ordering constraints among variables.

Another aspect worth investigating is, when a set J is not representable by a p-map, how to determine a subset J' of J, hopefully as large as possible, such that J' is representable.

Finally, another possible way of enhancing this result is to find a new characterization of the existence of a p-map, which could lead to a faster algorithm.

References

1. M. Baioletti, G. Busanello, B. Vantaggi (2009). Conditional independence structure and its closure: inferential rules and algorithms. Int. J. of Approx. Reason., 50, pp. 1097-1114.
2. M. Baioletti, G. Busanello, B. Vantaggi (2009). Acyclic directed graphs to represent conditional independence models. Lecture Notes LNAI 5590, pp. 530-541.
3. M. Baioletti, G. Busanello, B. Vantaggi (2009). Closure of independencies under graphoid properties: some experimental results. Proc. 6th Int. Symp. on Imprecise Probability: Theories and Applications, pp. 11-19.
4. M. Baioletti, G. Busanello, B. Vantaggi (2009). Acyclic directed graphs representing independence models. Int. J. of Approx. Reason. (submitted).
5. M. Baioletti, G. Busanello, B. Vantaggi (2010). Necessary and sufficient conditions for the existence of a perfect map. Tech. Rep. 01/2010, Univ. of Perugia.
6. G. Coletti, R. Scozzafava (2002). Probabilistic Logic in a Coherent Setting. Kluwer, Dordrecht/Boston/London (Trends in Logic n. 15).
7. F.G. Cozman, T. Seidenfeld (2007). Independence for full conditional measures, graphoids and Bayesian networks. Boletim BT/PMR/0711, Escola Politecnica da Universidade de Sao Paulo, Sao Paulo, Brazil.
8. F.G. Cozman, P. Walley (2005). Graphoid properties of epistemic irrelevance and independence. Ann. of Math. and Art. Intell., 45, pp. 173-195.
9. A.P. Dawid (1979). Conditional independence in statistical theory. J. Roy. Stat. Soc. B, 41, pp. 15-31.
10. P.R. de Waal, L.C. van der Gaag (2005). Stable independence in perfect maps. Proc. 21st Conf. on Uncertainty in Artificial Intelligence (UAI '05), Edinburgh, pp. 161-168.
11. F.V. Jensen (1996). An Introduction to Bayesian Networks. UCL Press, Springer-Verlag.
12. S.L. Lauritzen (1996). Graphical Models. Clarendon Press, Oxford.
13. S. Moral, A. Cano (2002). Strong conditional independence for credal sets. Ann. of Math. and Art. Intell., 35, pp. 295-321.
14. J. Pearl (1988). Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann, Los Altos, CA.
15. M. Studený (1997). Semigraphoids and structures of probabilistic conditional independence. Ann. of Math. and Art. Intell., 21, pp. 71-98.
16. M. Studený (2005). Probabilistic Conditional Independence Structures. Springer-Verlag, London.
17. B. Vantaggi (2003). Conditional independence structures and graphical models. Int. J. Uncertain. Fuzziness Knowledge-Based Systems, 11(5), pp. 545-571.
18. T.S. Verma (1986). Causal networks: semantics and expressiveness. Tech. Rep. R-65, Cognitive Systems Laboratory, University of California, Los Angeles.
19. T.S. Verma, J. Pearl (1991). Equivalence and synthesis of causal models. Uncertainty in Artificial Intelligence, 6, pp. 220-227.
20. J. Whittaker (1990). Graphical Models in Applied Multivariate Statistics. Wiley & Sons, New York.

Finally, another possible way of enhancing this result is to find a new characterization for the existence of a p map, which can generate a faster algorithm. References 1. M. Baioletti, G. Busanello, B. Vantaggi (2009), Conditional independence structure and its closure: inferential rules and algorithms. Int. J. of Approx. Reason., 50, pp. 1097 1114. 2. M. Baioletti, G. Busanello, B. Vantaggi (2009). Acyclic directed graphs to represent conditional independence models. Lecture notes LNAI 5590, pp. 530 541. 3. M. Baioletti, G. Busanello, B. Vantaggi (2009). Closure of independencies under graphoid properties: some experimental results. Proc. 6th Int. Symp. on Imprecise Probability: Theories and Applications, pp. 11 19. 4. M. Baioletti, G. Busanello, B. Vantaggi (2009). Acyclic directed graphs representing independence models. Int. J. of Approx. Reason. (submitted). 5. M. Baioletti, G. Busanello, B. Vantaggi (2010). Necessary and sufficient conditions for the existence of a perfect map. Tech. Rep. 01/2010. Univ. of Perugia. 6. G. Coletti, R. Scozzafava (2002). Probabilistic logic in a coherent setting. Dordrecht/Boston/London: Kluwer (Trends in logic n.15). 7. F.G. Cozman, T. Seidenfeld (2007). Independence for full conditional measures, graphoids and Bayesian networks, Boletim BT/PMR/0711 Escola Politecnica da Universidade de Sao Paulo, Sao Paulo, Brazil. 8. F. G. Cozman, P. Walley (2005). Graphoid properties of epistemic irrelevance and independence. Ann. of Math. and Art. Intell., 45, pp. 173 195. 9. A.P. Dawid (1979). Conditional independence in statistical theory. J. Roy. Stat. Soc. B, 41, pp. 15 31. 10. P. R. de Waal, L. C. van der Gaag (2005). Stable Independence in Perfect Maps. Proc. 21st Conf. in Uncertainty in Artificial Intelligence, UAI 05, Edinburgh, pp. 161 168. 11. F.V. Jensen (1996). An introduction to Bayesian networks. UCL Press, Springer- Verlag. 12. S.L. Lauritzen (1996). Graphical models. Clarendon Press, Oxford. 13. S. Moral, A. Cano (2002). Strong conditional independence for credal sets. Ann. of Math. and Art. Intell., 35, pp. 295 321. 14. J. Pearl (1988). Probabilistic reasoning in intelligent systems: networks of plausible inference, Morgan Kaufmann, Los Altos, CA. 15. M. Studený (1997). Semigraphoids and structures of probabilistic conditional independence. Ann. of Math. Artif. Intell., 21, pp. 71 98. 16. M. Studený (2005). Probabilistic conditional independence structures, Springer- Verlag, London. 17. B. Vantaggi (2003). Conditional independence structures and graphical models. Int. J. Uncertain. Fuzziness Knowledge-Based Systems, 11(5), pp. 545 571. 18. T. S. Verma (1986). Causal networks: semantics and expressiveness. Tech. Rep. R 65, Cognitive Systems Laboratory, University of California, Los Angeles. 19. T. S. Verma, J. Pearl (1991). Equivalence and synthesis of causal models. Uncertainty in Artificial Intelligence, 6, pp. 220 227. 20. J. Witthaker (1990). Graphical models in applied multivariate statistic. Wiley & Sons, New York. 10