GeNIeRate: An Interactive Generator of Diagnostic Bayesian Network Models


Pieter C. Kraaijeveld
Man-Machine Interaction Group, Delft University of Technology
Mekelweg 4, 2628 CD Delft, the Netherlands

Marek J. Druzdzel
Decision Systems Laboratory, University of Pittsburgh
Pittsburgh, PA 15260, USA

Abstract

Constructing diagnostic Bayesian network models is a complex and time-consuming task. In this paper, we propose a methodology to simplify and speed up the design of very large models. The models are based on two simplifying assumptions: (1) the structure of the model has three levels of variables, and (2) the interactions among the variables can be modeled by Noisy-MAX gates. The methodology is implemented in an application named GeNIeRate, which aims at supporting the construction of diagnostic Bayesian network models consisting of hundreds or even thousands of variables. A preliminary qualitative evaluation of this application shows great promise. We are planning a systematic study comparing GeNIeRate to traditional techniques for building Bayesian network models and hope to present its results at the workshop.

1 Introduction

Bayesian networks (BNs) [Pearl, 1988] are acyclic directed graphs in which each node represents a variable and each arc typically represents a causal relation between two variables. Although both exact and approximate inference in Bayesian networks are worst-case NP-hard [Cooper, 1990; Dagum and Luby, 1997], in our experience they perform well for practical diagnostic models consisting of several hundred or even a thousand nodes. A BN consists of a qualitative and a quantitative part. The qualitative part is an acyclic directed graph reflecting the causal structure of the domain; the quantitative part represents the joint probability distribution over its variables. Every variable has a conditional probability table (CPT) representing the probability of each of its states given the states of its parent variables.
If a variable has no parents in the graph, its CPT represents the prior probability distribution of the variable. A BN can calculate the posterior probability of an uncertain variable given evidence obtained from related variables. This property, and the intuitive way in which BNs model complex relationships among uncertain variables, makes them a very suitable technique for building diagnostic models. Diagnosis is quite likely the most successful practical application of BNs. While existing diagnostic BN models perform very well, the technique is still not widely used and accepted. One of the main reasons for this is that building a BN model is a laborious and time-consuming task. During the model building process, both the qualitative and the quantitative part of the BN have to be designed. This can be done in three different ways: (1) learning both the structure and the parameters from data without human interaction, (2) consulting a domain expert to design the structure and the parameters, or (3) combining learning from data with consulting a domain expert. Learning a successful diagnostic BN model from data requires a very large data set, which is almost never available for diagnostic problems. So building a diagnostic BN model will most of the time come down to an interaction between a knowledge engineer and a domain expert. They consult technical manuals, test procedures, and repair databases to define the variables in the domain, determine the interactions among them, and elicit the parameters. To give an idea of the time needed to design a diagnostic BN model: the construction of the HEPAR-II model [Oniśko et al., 2001], used to diagnose liver disorders, took over 300 hours, of which roughly 50 hours were spent with domain experts. The model consists of 73 variables and its numerical parameters were learned from a data set of patient cases.
One of the first large diagnostic systems that used Bayesian networks was the QMR-DT system [Shwe et al., 1991]. This system was a probabilistic reformulation of the Quick Medical Reference (QMR), a rule-based system built on the INTERNIST-1 knowledge base developed at the University of Pittsburgh [Miller et al., 1982]. The QMR-DT system contains approximately 5000 variables. The authors made several simplifying assumptions to deal with the complexity of constructing this network. The system was a BN with only two levels of variables, representing two different types of variables: diseases and findings. The structure was such that the disease variables influence the outcome of the findings. While this structure was simple, its performance was close to that of the original QMR. And since the users of the system knew the independence assumptions, the inconsistencies of QMR-DT could easily be explained. Inspired by the QMR-DT system, we propose in this paper a methodology for building diagnostic BN models in an intuitive and fast way for users who are not necessarily experts in Bayesian networks. Using our methodology, a domain expert will be able to build a model without any interaction with a knowledge engineer. Skaanning [2000] addresses the same problem in an application called BATS Author. BATS Author hides the causality of the models from the user and also assumes that only one single fault can occur. Our methodology supports multiple faults and shows the causal structure of the graph. We believe that this will help the user understand what she is doing while building a diagnostic BN model. To keep the model building process simple for the user, the methodology is based on some simplifying assumptions. The first assumption concerns the qualitative part of the model: the structure can only consist of three levels of variables. The top level represents context variables, the middle level represents fault variables, and the bottom level represents evidence variables, which are the observable effects of a fault occurring. Relations are only allowed in a top-down direction: from contexts to faults and from faults to evidence. The second simplification concerns the quantitative part: all the variables are modeled using Noisy-MAX gates. A Noisy-MAX gate is a generalization of the Noisy-OR gate for multi-valued variables. The goal of using the Noisy-MAX gate is to reduce the size of the CPTs of the variables: the number of parameters that have to be elicited by the user decreases from exponential to linear in the number of parents of a variable. Research shows that many interactions in practical models can be approximated by Noisy-MAX gates [Zagorecki and Druzdzel, 2004]. With these simplifications we sacrifice some theoretical modeling power and precision, but we believe, and plan to test in practice, that the resulting models will not be significantly less accurate while being much easier to build.
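To make the exponential-to-linear reduction concrete, the short sketch below counts the parameters for a single effect variable both ways, following the counting formulas given later in Section 5. The variable sizes are invented for illustration; this is not code from GeNIeRate.

```python
# Number of parameters to elicit for one effect variable Y, comparing
# a full CPT against a leaky Noisy-MAX gate (illustrative sizes only).

def cpt_size(n_y, parent_sizes):
    # Full CPT: one probability per state of Y for every combination
    # of parent states -- exponential in the number of parents.
    size = n_y
    for n_x in parent_sizes:
        size *= n_x
    return size

def noisy_max_size(n_y, parent_sizes):
    # Leaky Noisy-MAX: (n_Xi - 1) * (n_y - 1) parameters per parent,
    # plus the leak -- linear in the number of parents.
    return sum((n_x - 1) * (n_y - 1) for n_x in parent_sizes) + 1

n_y = 3                      # states of the effect variable
parents = [3, 3, 2, 4, 2]    # states of five hypothetical parents

print(cpt_size(n_y, parents))        # 432 entries in the full CPT
print(noisy_max_size(n_y, parents))  # 19 Noisy-MAX parameters
```

Adding a sixth parent multiplies the CPT size by that parent's number of states, while the Noisy-MAX count grows only by a single additive term.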
Our proposed methodology is implemented in an application named GeNIeRate. This program also contains several features to speed up the model building process. The remainder of this paper is structured as follows. Section 2 gives an introduction to Bayesian networks. Section 3 discusses our proposed three-level structure of diagnostic BN models. Section 4 explains the Noisy-MAX gate. Section 5 gives a complete description of GeNIeRate. Finally, Section 6 describes a pilot experiment testing GeNIeRate.

2 Bayesian Networks

Bayesian networks are acyclic directed graphs in which nodes represent random variables and arcs represent direct probabilistic dependencies among them. A Bayesian network encodes the joint probability distribution over a finite set of variables {X_1, ..., X_n} and decomposes it into a product of conditional probability distributions over each variable given its parents in the graph. For nodes with no parents, prior probabilities are used. The joint probability distribution over {X_1, ..., X_n} can be obtained by taking the product of all of these prior and conditional probability distributions:

  Pr(x_1, ..., x_n) = ∏_{i=1}^{n} Pr(x_i | Pa(x_i)) .   (1)

Figure 1 shows a highly simplified example Bayesian network modeling causes of a car engine failing to start. The variables in this model are: Age of the car (A), dead Battery (B), dirty Connectors (C), Engine does not start (E), and Red warning lights on the dashboard (R). For the sake of simplicity, we assume that each of these variables is binary. For example, R has two outcomes, denoted r and ¬r, representing Red warning lights are present and Red warning lights are absent, respectively. A directed arc between B and E denotes the fact that whether or not the battery is dead will impact the likelihood of the engine failing to start. Similarly, an arc from A to B denotes that the age of the car influences the likelihood of having a dead battery.
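As a numeric illustration of Equation 1, the sketch below encodes the engine network's factorization and uses it both to verify that the joint distribution is normalized and to compute a posterior by brute-force enumeration. All probability values are invented for illustration; the structure follows the factorization given in the text.

```python
from itertools import product

# Equation 1 on the engine network of Figure 1:
# Pr(a,b,c,r,e) = Pr(a) Pr(b|a) Pr(c|a) Pr(r|b) Pr(e|a,b,c).
# All numbers below are made up for illustration.

def pr_a(a):                  # prior: the car is old
    return 0.3 if a else 0.7

def pr_b(b, a):               # dead battery, more likely in an old car
    p = 0.4 if a else 0.05
    return p if b else 1.0 - p

def pr_c(c, a):               # dirty connectors, more likely in an old car
    p = 0.3 if a else 0.1
    return p if c else 1.0 - p

def pr_r(r, b):               # red warning lights, driven by the battery
    p = 0.95 if b else 0.02
    return p if r else 1.0 - p

def pr_e(e, a, b, c):         # engine does not start; crude additive sketch
    p = 0.05 + 0.60 * b + 0.25 * c + 0.05 * a
    return p if e else 1.0 - p

def joint(a, b, c, r, e):     # product of each node given its parents
    return pr_a(a) * pr_b(b, a) * pr_c(c, a) * pr_r(r, b) * pr_e(e, a, b, c)

# The joint distribution sums to 1 over all 2^5 assignments.
total = sum(joint(*v) for v in product([True, False], repeat=5))

# Belief updating by enumeration: Pr(b | a, r, e) for an old car with
# red warning lights and an engine that does not start.
num = {b: sum(joint(True, b, c, True, True) for c in (True, False))
       for b in (True, False)}
posterior_b = num[True] / (num[True] + num[False])
print(round(total, 6), round(posterior_b, 3))
```

Real BN software replaces this exponential enumeration with efficient belief-updating algorithms, but the quantities computed are the same.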
[Figure 1: An example Bayesian network for the engine problem, with nodes A, B, C, R, and E.]

The lack of directed arcs is also a way of expressing knowledge, notably assertions of (conditional) independence. For instance, the lack of a directed arc between A and R encodes the knowledge that the age of the car does not influence the chance of red lights appearing on the dashboard directly, but only indirectly through the dead battery variable B. These causal assertions can be translated into statements of conditional independence: R is independent of A given B. In mathematical notation, Pr(r | b) = Pr(r | b, a). Similarly, the absence of an arc between B and C means that whether or not the car has a dead battery will not influence the chance of having dirty connectors. These independence properties imply that:

  Pr(a, b, c, r, e) = Pr(a) Pr(b | a) Pr(c | a) Pr(r | b) Pr(e | a, b, c) ,

i.e., the joint probability distribution over the graph nodes can be factored into the product of the conditional probabilities of each node given its parents in the graph. Please note that this expression is just an instance of Equation 1. The assignment of values to observed variables is usually called evidence. The most important type of reasoning in a probabilistic system based on Bayesian networks is known as belief updating or evidence propagation, which amounts to
computing the probability distribution over the variables of interest given the evidence. This evidence propagation makes Bayesian networks very suitable for diagnosis. For example, in the model of Figure 1, the variable of interest for diagnosis could be B, and the focus of computation could be the posterior probability distribution over B given the observed values of A, R, and E, i.e., Pr(b | a, r, e). Bayesian network software can be applied to calculate this posterior probability. Today's software is capable of very fast belief updating in models consisting of hundreds or even thousands of variables. After the belief updating, the software can make a decision, or support the user in making a decision, about what actions to perform given these probabilities.

3 The BN3M model

We believe that there are three fundamental types of variables in diagnostic models. The first type (1) comprises variables representing failures of the device or of a specific part of the device. We will call these failure variables Faults. The second type (2) comprises variables that have an observable effect when the device is in some faulty state. Sometimes the effect of a given fault cannot be observed directly; in that case, a test can be performed that gives information about the state of the device. These observation or test variables will be called Evidence variables. The last type (3) comprises variables that describe context properties of the device that may influence the risk of a fault occurring. We will call these Context variables. Example context variables are the age of the device or its history of failures. In the QMR-DT model, mentioned in the introduction, the authors distinguish two types of binary variables: diseases and findings. These variables are graphically structured in two levels, in which the disease variables influence the findings. This structure is based on some independence assumptions.
The first of these is marginal independence of the diseases, which amounts to no arcs among the disease variables. The second is conditional independence of the findings, which amounts to no arcs among the finding variables. Figure 2 shows an example of the graphical structure of the QMR-DT model.

[Figure 2: The two-level QMR-DT model, with a level of Diseases above a level of Findings.]

The structure of the QMR-DT model was simple, yet the performance of the model was close to that of the original rule-based QMR system. This motivated us to design an extension of their structure to connect our three types of variables in a diagnostic BN model. The authors of the QMR-DT system already mention that assuming their two-level structure is not always very accurate. They state that it would be more accurate to model some of their findings, for example historical findings, as parent variables of the diseases. We cover this inaccuracy in our model by making the context variables the parents of the fault variables. We also decided to support multi-valued variables instead of the binary variables used in QMR-DT. Our structure thus becomes a three-level structure with the context variables on top, the fault variables in the middle, and the evidence variables at the bottom. We call this structure: 3-level Bayesian network using Noisy-MAX gates (BN3M). An example of this structure is shown in Figure 3. We will discuss the Noisy-MAX gate in Section 4.

[Figure 3: The BN3M model, with levels Context, Faults, and Evidence.]

As mentioned in the introduction, this structure sacrifices some modeling power and precision. For example, the arc between variables A and E of the example given in Figure 1 cannot be created in our methodology. Whether this theoretical imprecision has a large impact on the practical performance of a large model remains a question that we are planning to answer with a systematic study.
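The BN3M structural constraint can be captured by a simple membership check on arcs: an arc is legal only if it goes one step down the three-level hierarchy. The sketch below is our own illustration, not GeNIeRate's API.

```python
# A minimal check of the BN3M structural constraint: arcs may only run
# from a context variable to a fault, or from a fault to an evidence
# variable. Level names and example calls are hypothetical.

LEVELS = {"context": 0, "fault": 1, "evidence": 2}

def arc_allowed(parent_level, child_level):
    # Only one-step, top-down arcs are permitted in a BN3M model.
    return LEVELS[child_level] - LEVELS[parent_level] == 1

print(arc_allowed("context", "fault"))     # True
print(arc_allowed("fault", "evidence"))    # True
print(arc_allowed("context", "evidence"))  # False: would skip the fault level
print(arc_allowed("fault", "fault"))       # False: no arcs within a level
```

Note that the disallowed context-to-evidence arc corresponds exactly to the A-to-E arc of Figure 1 that the text gives as an example of lost modeling power.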
A model that is theoretically very precise may turn out to be inferior in practice, because elicitation of a huge number of parameters from a human expert may decrease their quality.

4 The Leaky Noisy-MAX gate

In order to gain speed in designing the quantitative part of the model, the interaction among the variables in our structure is approximated by canonical interaction models that require fewer parameters. One type of canonical interaction, widely used in Bayesian networks, is known as the Noisy-OR gate. This gate was first introduced outside the BN domain by Good [1961]. Later it was applied in the context of BNs [Pearl, 1986], and it became very popular among BN model builders because it reduces the growth of the CPT of a variable within a BN from exponential to linear in the number of parents. An extension of the Noisy-OR gate to multi-valued variables is the Noisy-MAX gate [Henrion, 1989; Díez, 1993]. For the sake of simplicity, we will discuss the Noisy-OR gate in depth and give a short description of the extended Noisy-MAX gate later in this section. The Noisy-OR gate models a non-deterministic interaction among n binary parent cause variables X_i and a binary effect variable Y. Every variable has two states: a distinguished state, representing that the variable is in its normal state (commonly absent or false), and a non-distinguished state (commonly present or true). The effect variable
Y works as a deterministic OR gate: if all the parent variables are absent, the child variable is also absent. However, if a parent variable X_i is present and all other parent variables are absent, it has a probability p_i of causing the effect y. These probabilities p_i constitute the noisy part of the gate and have to be elicited by the model builder or learned from data. The probabilities are fairly easy to understand, since they can be represented by questions like: What is the probability that the effect y will occur, given that only the cause X_i is present and all other causes are absent? In other words,

  p_i = Pr(y | ¬x_1, ¬x_2, ..., x_i, ..., ¬x_{n−1}, ¬x_n) .   (2)

The probability of the effect y occurring given a subset X of causes that are present is then given by:

  Pr(y | X) = 1 − ∏_{i: x_i ∈ X} (1 − p_i) .   (3)

This formula is sufficient to derive the complete CPT of Y conditional on its predecessors X_1, X_2, ..., X_n. Henrion [1989] proposed a direct extension of the Noisy-OR gate, called the leaky Noisy-OR gate, which models the fact that the effect y can also occur when all the causes are absent. This is modeled by introducing an additional parameter p_0, called the leak probability, formally given by:

  p_0 = Pr(y | ¬x_1, ¬x_2, ..., ¬x_n) .   (4)

The leak probability represents the phenomenon that an effect occurs spontaneously, i.e., in the absence of any of the causes that are modeled explicitly. In the leaky Noisy-OR gate, p_i (i ≠ 0) no longer represents the probability that X_i causes y given that all other parent variables are absent, but rather the probability that Y is present when X_i is present and every other explicit parent cause (all the X_j such that j ≠ i) is absent. An alternative way of eliciting the parameters of a leaky Noisy-OR gate is given in [Díez, 1993]. Díez defined p'_i as the probability that Y will be true if X_i is present and every other cause of Y, including the unmodeled causes (the leak), is absent:

  1 − p'_i = (1 − p_i) / (1 − p_0) .   (5)

His method amounts essentially to asking the expert for these parameters p'_i. Converting the parameters p'_i to p_i is straightforward:

  p_i = p'_i + (1 − p'_i) p_0 .   (6)

It follows that the probability of y given a subset X of present parent variables is given in the leaky Noisy-OR gate by the following formula:

  Pr(y | X) = 1 − (1 − p_0) ∏_{i: x_i ∈ X} (1 − p_i) / (1 − p_0) .   (7)

The difference between the two proposals has to do with the leak variable. While Henrion's parameters p_i assume that the expert's answer includes the combined influence of the parent cause in question and the leak, Díez's parameters p'_i refer explicitly to the mechanism between the parent cause in question and the effect, with the leak absent. Conversion between the two parameterizations is straightforward using Equation 6. If the Noisy-OR parameters have to be elicited from a domain expert, Díez's definition is more convenient, since the question to be answered is more intuitive for the expert: What is the probability that the effect y will occur when the cause X_i is present and all other causes, modeled and unmodeled, are absent? Henrion's definition is more convenient if the parameters are learned from data. The Noisy-OR gate can be generalized to multi-valued variables, which are allowed to have more than two states but still contain a distinguished state. The difference is that the CPT of the effect variable Y is the deterministic MAX function instead of the deterministic OR function. The gate is therefore called the Noisy-MAX gate or, if the leak probability is included, the leaky Noisy-MAX gate. In our proposed methodology, we set the leak probability distribution p_0(Y) of an effect variable by default to 0 for the non-distinguished states y_1, ..., y_{n_y − 1} and to 1 for the distinguished state y:

  p_0(Y) = { 1 if Y = y; 0 otherwise } .   (8)

In this way the interaction among variables is by default Noisy-MAX; it becomes leaky Noisy-MAX if a user defines a specific leak probability distribution for the effect variable.
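The equations above fit in a few lines of code. The sketch below implements the Díez-to-Henrion conversion (Equation 6) and the leaky Noisy-OR combination rule (Equation 7); all parameter values are invented examples.

```python
# A sketch of the leaky Noisy-OR gate. Henrion's parameters p_i are used
# in the combination rule (Equation 7); Díez's parameters p'_i are
# converted to p_i via Equation 6. Numbers are made up for illustration.

def diez_to_henrion(p_prime, p0):
    # Equation 6: p_i = p'_i + (1 - p'_i) * p0
    return p_prime + (1.0 - p_prime) * p0

def leaky_noisy_or(present, p, p0):
    # Equation 7: Pr(y | X) = 1 - (1 - p0) * prod over present causes
    # of (1 - p_i) / (1 - p0). Assumes p0 < 1.
    prob_y_absent = 1.0 - p0
    for i in present:
        prob_y_absent *= (1.0 - p[i]) / (1.0 - p0)
    return 1.0 - prob_y_absent

p0 = 0.05                    # leak probability
p_prime = [0.8, 0.3]         # Díez's parameters for two causes
p = [diez_to_henrion(q, p0) for q in p_prime]

print(leaky_noisy_or([], p, p0))     # no causes present: just the leak
print(leaky_noisy_or([0], p, p0))    # equals p[0], by definition of p_i
print(leaky_noisy_or([0, 1], p, p0)) # both causes present
```

Iterating `leaky_noisy_or` over every subset of present parents yields the complete CPT of Y, which is exactly the reduction the gate provides: n + 1 elicited numbers instead of 2^n distributions.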
To give an example of the reduction in the number of parameters that have to be elicited by the user, let n_{X_i} denote the number of states of the parent variable X_i and n_y the number of states of the effect variable Y. The total number of parameters N that have to be elicited by the model builder using the leaky Noisy-MAX gate becomes:

  N = ∑_{i=1}^{n} (n_{X_i} − 1)(n_y − 1) + 1 ,

compared to the exponential number of parameters of a full CPT:

  N = n_y ∏_{i=1}^{n} n_{X_i} .

Using Noisy-OR and Noisy-MAX gates for some of the conditional distributions in the HEPAR-II model [Oniśko et al., 2001] (not all interactions could be approximated by these gates) not only reduced the number of parameters that had to be elicited from 3714 to 1488, but also improved the diagnostic performance of the model at that time.

5 GeNIeRate

We embedded the ideas presented in Sections 3 and 4 in an interactive environment for the generation of diagnostic BN models that we call GeNIeRate. GeNIeRate is a member of the family of software developed at the Decision Systems Laboratory (DSL) of the University of Pittsburgh. This software is developed for the purpose of probabilistic modeling, with special extensions for diagnostic inference, such as rank ordering, tests, and case management. The main part of this family is SMILE (Structural Modeling, Inference, and Learning Engine), a library of C++ classes implementing graphical probabilistic and decision-theoretic models. In order to use SMILE from other programming languages, several wrappers have been developed: jSMILE for Java, SMILE.NET for the Microsoft .NET environment, and PocketSMILE for the Pocket PC. Next to the libraries, there are several applications with a GUI using these libraries. GeNIe
is the main application: a development environment for building graphical decision models. Other applications are ImaGeNIe, a general-purpose model building interface, and GeNIeRate, which is discussed in this paper. GeNIeRate is developed in Java and therefore uses jSMILE as its modeling library. All the software can be downloaded at: genie. The main goal of GeNIeRate is to support a user in building the simplified diagnostic BN3M models in a fast and intuitive way. In order to keep the model building process simple, GeNIeRate contains three different screens, representing three steps for building a BN3M model: (1) add General information, (2) add the Variables, and (3) add the Relations and the probabilities (GVR). We will discuss these three steps in this section and support our description with screenshots of GeNIeRate. In the first tab, the user can define general model information: the name, domain, and description of the model. Since not all models will contain context variables, these variables can be switched on or off. Furthermore, feedback can be enabled, which amounts to feedback given by GeNIeRate for each action of the user. If the user is not a BN expert, this feedback helps her get familiar with the technique through natural-language dialogs and gives her some insight into the causality of the graphical structure of her model. Figure 4 shows a part of this first tabbed screen. In the second tab, the variables of the model can be added, edited, or deleted. For each type of variable (context, fault, and evidence) there is a tree structure representing the variables of that type added to the model. Figure 5 shows a part of this screen containing the different tree structures. GeNIeRate uses the concept of a system to divide large diagnostic models into manageable parts. A system is typically a module of a device that, while interconnected with other modules, can be thought of in separation from the rest of the device.
Typically devices, whether natural or artificial, are composed hierarchically and have clearly identifiable modules, for example the electrical system, power train, brake system, or steering system in an automobile. Systems are the largest entities through which the user can interact with a model, i.e., the user can view a system's variables after selecting that system and add variables to it or remove variables from it. This supports working on a specific part of the model while ignoring the rest and, since humans can only focus on a small number of things at a time, limits the amount of information presented to the user.

[Figure 4: Part of the model info panel.]
[Figure 5: Part of the model variables panel.]

Furthermore, documentation can be added to the variables and to the states of the variables. This documentation can be a treatment for each state of a fault variable or a question for each context or evidence variable. The documented questions and treatments can later be used when the model is applied in diagnostic software. Answers to these questions can be used as evidence for belief updating (Section 2) to calculate the most probable fault(s). After this calculation, the documented treatment of that fault can be presented to help the user decide how to fix the problem. The last screen allows for the creation of relations among the variables. When, for example, a fault variable is selected, context variables can be added as parents, or evidence variables as children, of that fault. After relations are created, the graph at the right side of the screen shows the relevant part of the model (see Figure 6).

[Figure 6: Part of the model relations panel.]

A system can be selected to show only the relations of the selected variable with the variables within that system. The graph can be navigated by clicking on the top or the bottom of a node to show the parents or children of that node, respectively.
All the parameters can be defined by double-clicking on a node or an arc in the graph. If a node has parents, the leak probability distribution for that variable can be defined; otherwise, the prior probability distribution can be defined. If an arc in the graph is selected, the Noisy-MAX parameters for the relation represented by that arc can be defined. After the diagnostic BN model has been created, it can be saved and used for diagnosis. This can be done after loading the network into GeNIe, which contains a tool for
diagnosis. The network can also be loaded into user applications using (j)SMILE or into other BN-building software (Figure 7).

[Figure 7: GeNIeRate architecture and interaction with other DSL software.]

6 Experiment

A qualitative evaluation of our methodology and GeNIeRate was performed by Mr. M. D. Campbell, a professional consultant with 5 years of practical experience in building diagnostic BN models. GeNIeRate was received warmly, and Mr. Campbell estimated that this methodology will be of great help, especially in building large diagnostic models. He mentioned that a domain expert usually spends 90% of the time analyzing and understanding his diagnostic problem as a probabilistic model. His estimate was that our approach will reduce the model building time for an inexperienced Bayesian network model builder. We are planning to conduct an empirical study of the performance of our methodology using the HEPAR-II model described in the introduction. HEPAR-II is a Bayesian network consisting of 73 variables applied to diagnosing liver disorders. The construction of this model took about 300 hours, of which 50 hours were spent with domain experts. HEPAR-II was built in collaboration between DSL, the Bialystok University of Technology, and the Medical Center for Postgraduate Education in Warsaw, Poland. We are planning to rebuild the HEPAR-II model with our methodology and compare the performance of this new HEPAR-II-BN3M to the original HEPAR-II. We will also look at the time it takes to construct the new network. We hope to be able to present the results of this study at the workshop.

7 Concluding remarks

This paper has presented a methodology for building large diagnostic Bayesian networks. An implementation of the methodology was received well by a professional consultant in building diagnostic Bayesian networks. His estimate is that using this methodology for building large diagnostic BN models will reduce the model building time, especially for an inexperienced BN model builder.

Acknowledgments

This research was supported by the Air Force Office of Scientific Research under grant F…, and by Intel Research. We would like to thank Mr. M. D. Campbell, whose suggestions have improved GeNIeRate enormously.

References

[Cooper, 1990] Gregory F. Cooper. The computational complexity of probabilistic inference using Bayesian belief networks. Artificial Intelligence, 42(2-3), March 1990.

[Dagum and Luby, 1997] Paul Dagum and Michael Luby. An optimal approximation algorithm for Bayesian inference. Artificial Intelligence, 93:1-27, 1997.

[Díez, 1993] F. J. Díez. Parameter adjustment in Bayes networks. The generalized noisy OR-gate. In Proceedings of the 9th Conference on Uncertainty in Artificial Intelligence, Washington, D.C., 1993. Morgan Kaufmann, San Mateo, CA.

[Good, 1961] I. Good. A causal calculus (I). British Journal of Philosophy of Science, 11, 1961.

[Henrion, 1989] M. Henrion. Some practical issues in constructing belief networks. In L. N. Kanal, T. S. Levitt, and J. F. Lemmer, editors, Uncertainty in Artificial Intelligence 3. North-Holland, Amsterdam, 1989.

[Miller et al., 1982] Randolph A. Miller, Harry E. Pople, Jr., and Jack D. Myers. INTERNIST-1, an experimental computer-based diagnostic consultant for general internal medicine. New England Journal of Medicine, 307(8), August 1982.

[Oniśko et al., 2001] Agnieszka Oniśko, Marek J. Druzdzel, and Hanna Wasyluk. Learning Bayesian network parameters from small data sets: Application of noisy-OR gates. International Journal of Approximate Reasoning, 27(2), 2001.

[Pearl, 1986] Judea Pearl. Fusion, propagation, and structuring in belief networks. Artificial Intelligence, 29(3), 1986.

[Pearl, 1988] Judea Pearl. Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann Publishers, Inc., San Mateo, CA, 1988.

[Shwe et al., 1991] M. A. Shwe, B. Middleton, D. E. Heckerman, M. Henrion, E. J. Horvitz, H. P. Lehmann, and G. F. Cooper. Probabilistic diagnosis using a reformulation of the INTERNIST-1/QMR knowledge base: I. The probabilistic model and inference algorithms. Methods of Information in Medicine, 30(4), 1991.

[Skaanning, 2000] Claus Skaanning. A knowledge acquisition tool for Bayesian-network troubleshooters. In Uncertainty in Artificial Intelligence: Proceedings of the Sixteenth Conference (UAI-2000), San Francisco, CA, 2000. Morgan Kaufmann Publishers.

[Zagorecki and Druzdzel, 2004] Adam Zagorecki and Marek J. Druzdzel. Knowledge engineering for building Bayesian networks: How common are noisy-MAX distributions in practice? 2004.
More informationA Bayesian Approach for online max auditing of Dynamic Statistical Databases
A Bayesian Approach for online max auditing of Dynamic Statistical Databases Gerardo Canfora Bice Cavallo University of Sannio, Benevento, Italy, {gerardo.canfora,bice.cavallo}@unisannio.it ABSTRACT In
More informationModeling Bayesian Networks for Autonomous Diagnosis of Web Services
Modeling Bayesian Networks for Autonomous Diagnosis of Web Services Haiqin Wang, Guijun Wang, Alice Chen, Changzhou Wang Casey K Fung, Stephen A Uczekaj, Rodolfo A Santiago Boeing Phantom Works P.O.Box
More informationThe CertaintyFactor Model
The CertaintyFactor Model David Heckerman Departments of Computer Science and Pathology University of Southern California HMR 204, 2025 Zonal Ave Los Angeles, CA 90033 dheck@sumexaim.stanford.edu 1 Introduction
More informationI I I I I I I I I I I I I I I I I I I
A ABSTRACT Method for Using Belief Networks as nfluence Diagrams Gregory F. Cooper Medical Computer Science Group Medical School Office Building Stanford University Stanford, CA 94305 This paper demonstrates
More informationStudy Manual. Probabilistic Reasoning. Silja Renooij. August 2015
Study Manual Probabilistic Reasoning 2015 2016 Silja Renooij August 2015 General information This study manual was designed to help guide your self studies. As such, it does not include material that is
More informationThe Basics of Graphical Models
The Basics of Graphical Models David M. Blei Columbia University October 3, 2015 Introduction These notes follow Chapter 2 of An Introduction to Probabilistic Graphical Models by Michael Jordan. Many figures
More informationIncorporating Evidence in Bayesian networks with the Select Operator
Incorporating Evidence in Bayesian networks with the Select Operator C.J. Butz and F. Fang Department of Computer Science, University of Regina Regina, Saskatchewan, Canada SAS 0A2 {butz, fang11fa}@cs.uregina.ca
More informationContact Recommendations from Aggegrated OnLine Activity
Contact Recommendations from Aggegrated OnLine Activity Abigail Gertner, Justin Richer, and Thomas Bartee The MITRE Corporation 202 Burlington Road, Bedford, MA 01730 {gertner,jricher,tbartee}@mitre.org
More informationDecision Support Systems
Decision Support Systems Marek J. Druzdzel School of Information Sciences and Intelligent Systems Program, University of Pittsburgh, Pittsburgh, Pennsylvania, U.S.A. Faculty of Computer Science, Bialystok
More informationDecision Support Systems
Decision Support Systems Marek J. Druzdzel and Roger R. Flynn Decision Systems Laboratory School of Information Sciences and Intelligent Systems Program University of Pittsburgh Pittsburgh, PA 15260 {marek,flynn}@sis.pitt.edu
More informationLogic, Probability and Learning
Logic, Probability and Learning Luc De Raedt luc.deraedt@cs.kuleuven.be Overview Logic Learning Probabilistic Learning Probabilistic Logic Learning Closely following : Russell and Norvig, AI: a modern
More informationDETERMINING THE CONDITIONAL PROBABILITIES IN BAYESIAN NETWORKS
Hacettepe Journal of Mathematics and Statistics Volume 33 (2004), 69 76 DETERMINING THE CONDITIONAL PROBABILITIES IN BAYESIAN NETWORKS Hülya Olmuş and S. Oral Erbaş Received 22 : 07 : 2003 : Accepted 04
More informationLearning diagnostic diagrams in transportbased datacollection systems
University of Wollongong Research Online Faculty of Engineering and Information Sciences  Papers Faculty of Engineering and Information Sciences 2014 Learning diagnostic diagrams in transportbased datacollection
More information13.3 Inference Using Full Joint Distribution
191 The probability distribution on a single variable must sum to 1 It is also true that any joint probability distribution on any set of variables must sum to 1 Recall that any proposition a is equivalent
More informationAn Empirical Evaluation of Bayesian Networks Derived from Fault Trees
An Empirical Evaluation of Bayesian Networks Derived from Fault Trees Shane Strasser and John Sheppard Department of Computer Science Montana State University Bozeman, MT 59717 {shane.strasser, john.sheppard}@cs.montana.edu
More informationChapter 28. Bayesian Networks
Chapter 28. Bayesian Networks The Quest for Artificial Intelligence, Nilsson, N. J., 2009. Lecture Notes on Artificial Intelligence, Spring 2012 Summarized by Kim, ByoungHee and Lim, ByoungKwon Biointelligence
More informationAdditive BeliefNetwork Models
Additive BeliefNetwork Models 91 Additive BeliefNetwork Models Paul Dagum Section on Medical Informatics Stanford University School of Medicine and Rockwell Palo Alto Laboratory 444 High Street Palo
More informationEstimating Software Reliability In the Absence of Data
Estimating Software Reliability In the Absence of Data Joanne Bechta Dugan (jbd@virginia.edu) Ganesh J. Pai (gpai@virginia.edu) Department of ECE University of Virginia, Charlottesville, VA NASA OSMA SAS
More informationSYSM 6304: Risk and Decision Analysis Lecture 5: Methods of Risk Analysis
SYSM 6304: Risk and Decision Analysis Lecture 5: Methods of Risk Analysis M. Vidyasagar Cecil & Ida Green Chair The University of Texas at Dallas Email: M.Vidyasagar@utdallas.edu October 17, 2015 Outline
More informationBayesian Networks and Classifiers in Project Management
Bayesian Networks and Classifiers in Project Management Daniel Rodríguez 1, Javier Dolado 2 and Manoranjan Satpathy 1 1 Dept. of Computer Science The University of Reading Reading, RG6 6AY, UK drg@ieee.org,
More informationBayesian networks  Timeseries models  Apache Spark & Scala
Bayesian networks  Timeseries models  Apache Spark & Scala Dr John Sandiford, CTO Bayes Server Data Science London Meetup  November 2014 1 Contents Introduction Bayesian networks Latent variables Anomaly
More informationIntelligent Systems: Reasoning and Recognition. Uncertainty and Plausible Reasoning
Intelligent Systems: Reasoning and Recognition James L. Crowley ENSIMAG 2 / MoSIG M1 Second Semester 2015/2016 Lesson 12 25 march 2015 Uncertainty and Plausible Reasoning MYCIN (continued)...2 Backward
More informationThe Optimality of Naive Bayes
The Optimality of Naive Bayes Harry Zhang Faculty of Computer Science University of New Brunswick Fredericton, New Brunswick, Canada email: hzhang@unbca E3B 5A3 Abstract Naive Bayes is one of the most
More informationBaRT: A Bayesian Reasoning Tool for Knowledge Based Systems
BaRT: A Bayesian Reasoning Tool for Knowledge Based Systems Lashon B. Booker, Naveen Hota'", and Connie Loggia Ramsey Navy Center for Applied Research in Artificial Intelligence Code 5510 N a. val Research
More informationArtificial Intelligence. Conditional probability. Inference by enumeration. Independence. Lesson 11 (From Russell & Norvig)
Artificial Intelligence Conditional probability Conditional or posterior probabilities e.g., cavity toothache) = 0.8 i.e., given that toothache is all I know tation for conditional distributions: Cavity
More informationApplication of Adaptive Probing for Fault Diagnosis in Computer Networks 1
Application of Adaptive Probing for Fault Diagnosis in Computer Networks 1 Maitreya Natu Dept. of Computer and Information Sciences University of Delaware, Newark, DE, USA, 19716 Email: natu@cis.udel.edu
More informationHealthcare Measurement Analysis Using Data mining Techniques
www.ijecs.in International Journal Of Engineering And Computer Science ISSN:23197242 Volume 03 Issue 07 July, 2014 Page No. 70587064 Healthcare Measurement Analysis Using Data mining Techniques 1 Dr.A.Shaik
More informationThe Compilation of Decision Models
The Compilation of Decision Models David E. Heckerman Medical Computer Science Group Knowledge Systems Laboratory Departments of Computer Science and Medicine Stanford, California 94305 John S. Breese
More informationMONITORING AND DIAGNOSIS OF A MULTISTAGE MANUFACTURING PROCESS USING BAYESIAN NETWORKS
MONITORING AND DIAGNOSIS OF A MULTISTAGE MANUFACTURING PROCESS USING BAYESIAN NETWORKS Eric Wolbrecht Bruce D Ambrosio Bob Paasch Oregon State University, Corvallis, OR Doug Kirby Hewlett Packard, Corvallis,
More informationModelbased Synthesis. Tony O Hagan
Modelbased Synthesis Tony O Hagan Stochastic models Synthesising evidence through a statistical model 2 Evidence Synthesis (Session 3), Helsinki, 28/10/11 Graphical modelling The kinds of models that
More informationLecture 2: Introduction to belief (Bayesian) networks
Lecture 2: Introduction to belief (Bayesian) networks Conditional independence What is a belief network? Independence maps (Imaps) January 7, 2008 1 COMP526 Lecture 2 Recall from last time: Conditional
More informationDecision Trees and Networks
Lecture 21: Uncertainty 6 Today s Lecture Victor R. Lesser CMPSCI 683 Fall 2010 Decision Trees and Networks Decision Trees A decision tree is an explicit representation of all the possible scenarios from
More informationA Tool for Enterprise Architecture Analysis
A Tool for Enterprise Architecture Analysis Pontus Johnson, Erik Johansson, Teodor Sommestad, Johan Ullberg Department of Industrial Information and Control Systems Royal Institute of Technology (KTH)
More informationTesting LTL Formula Translation into Büchi Automata
Testing LTL Formula Translation into Büchi Automata Heikki Tauriainen and Keijo Heljanko Helsinki University of Technology, Laboratory for Theoretical Computer Science, P. O. Box 5400, FIN02015 HUT, Finland
More informationAn Influence Diagram Model for Detecting Credit Card Fraud
An Influence Diagram Model for Detecting Credit Card Fraud Barry R. Cobb Virginia Military Institute cobbbr@vmi.edu Abstract A hybrid influence diagram is a compact graphical and numerical representation
More informationInformatics 2D Reasoning and Agents Semester 2, 201516
Informatics 2D Reasoning and Agents Semester 2, 201516 Alex Lascarides alex@inf.ed.ac.uk Lecture 29 Decision Making Under Uncertainty 24th March 2016 Informatics UoE Informatics 2D 1 Where are we? Last
More informationInfluence Discovery in Semantic Networks: An Initial Approach
2014 UKSimAMSS 16th International Conference on Computer Modelling and Simulation Influence Discovery in Semantic Networks: An Initial Approach Marcello Trovati and Ovidiu Bagdasar School of Computing
More informationPrediction of Heart Disease Using Naïve Bayes Algorithm
Prediction of Heart Disease Using Naïve Bayes Algorithm R.Karthiyayini 1, S.Chithaara 2 Assistant Professor, Department of computer Applications, Anna University, BIT campus, Tiruchirapalli, Tamilnadu,
More informationFinding the M Most Probable Configurations Using Loopy Belief Propagation
Finding the M Most Probable Configurations Using Loopy Belief Propagation Chen Yanover and Yair Weiss School of Computer Science and Engineering The Hebrew University of Jerusalem 91904 Jerusalem, Israel
More informationHUGIN Expert White Paper
HUGIN Expert White Paper White Paper Company Profile History HUGIN Expert Today HUGIN Expert vision Our Logo  the Raven Bayesian Networks What is a Bayesian network? The theory behind Bayesian networks
More informationADVANCED MAINTENANCE USING CAUSAL NETWORKS
ADVANCED MAINTENANCE USING CAUSAL NETWORKS Charles J. Sitter, Rockwell Collins, Cedar Rapids, Iowa Gregory M. Provan, Rockwell Science Center, Thousand Oaks, California Abstract We present a flexible,
More informationINTEGER PROGRAMMING. Integer Programming. Prototype example. BIP model. BIP models
Integer Programming INTEGER PROGRAMMING In many problems the decision variables must have integer values. Example: assign people, machines, and vehicles to activities in integer quantities. If this is
More informationCredal Networks for Operational Risk Measurement and Management
Credal Networks for Operational Risk Measurement and Management Alessandro Antonucci, Alberto Piatti, and Marco Zaffalon Istituto Dalle Molle di Studi sull Intelligenza Artificiale Via Cantonale, Galleria
More informationHELP DESK SYSTEMS. Using CaseBased Reasoning
HELP DESK SYSTEMS Using CaseBased Reasoning Topics Covered Today What is HelpDesk? Components of HelpDesk Systems Types Of HelpDesk Systems Used Need for CBR in HelpDesk Systems GE Helpdesk using ReMind
More informationBayesian probability theory
Bayesian probability theory Bruno A. Olshausen arch 1, 2004 Abstract Bayesian probability theory provides a mathematical framework for peforming inference, or reasoning, using probability. The foundations
More informationImproving Decision Making and Managing Knowledge
Improving Decision Making and Managing Knowledge Decision Making and Information Systems Information Requirements of Key DecisionMaking Groups in a Firm Senior managers, middle managers, operational managers,
More informationCORRELATED TO THE SOUTH CAROLINA COLLEGE AND CAREERREADY FOUNDATIONS IN ALGEBRA
We Can Early Learning Curriculum PreK Grades 8 12 INSIDE ALGEBRA, GRADES 8 12 CORRELATED TO THE SOUTH CAROLINA COLLEGE AND CAREERREADY FOUNDATIONS IN ALGEBRA April 2016 www.voyagersopris.com Mathematical
More informationChapter 3: Data Mining Driven Learning Apprentice System for Medical Billing Compliance
Chapter 3: Data Mining Driven Learning Apprentice System for Medical Billing Compliance 3.1 Introduction This research has been conducted at back office of a medical billing company situated in a custom
More informationProbabilistic Evidential Reasoning with Symbolic Argumentation for Space Situation Awareness
AIAA Infotech@Aerospace 2010 2022 April 2010, Atlanta, Georgia AIAA 20103481 Probabilistic Evidential Reasoning with Symbolic Argumentation for Space Situation Awareness Glenn Takata 1 and Joe Gorman
More informationA crash course in probability and Naïve Bayes classification
Probability theory A crash course in probability and Naïve Bayes classification Chapter 9 Random variable: a variable whose possible values are numerical outcomes of a random phenomenon. s: A person s
More informationCHAPTER 2 Estimating Probabilities
CHAPTER 2 Estimating Probabilities Machine Learning Copyright c 2016. Tom M. Mitchell. All rights reserved. *DRAFT OF January 24, 2016* *PLEASE DO NOT DISTRIBUTE WITHOUT AUTHOR S PERMISSION* This is a
More informationBayesian Networks. Mausam (Slides by UWAI faculty)
Bayesian Networks Mausam (Slides by UWAI faculty) Bayes Nets In general, joint distribution P over set of variables (X 1 x... x X n ) requires exponential space for representation & inference BNs provide
More informationCourse: Model, Learning, and Inference: Lecture 5
Course: Model, Learning, and Inference: Lecture 5 Alan Yuille Department of Statistics, UCLA Los Angeles, CA 90095 yuille@stat.ucla.edu Abstract Probability distributions on structured representation.
More informationLargeSample Learning of Bayesian Networks is NPHard
Journal of Machine Learning Research 5 (2004) 1287 1330 Submitted 3/04; Published 10/04 LargeSample Learning of Bayesian Networks is NPHard David Maxwell Chickering David Heckerman Christopher Meek Microsoft
More informationBayesian Networks. Read R&N Ch. 14.114.2. Next lecture: Read R&N 18.118.4
Bayesian Networks Read R&N Ch. 14.114.2 Next lecture: Read R&N 18.118.4 You will be expected to know Basic concepts and vocabulary of Bayesian networks. Nodes represent random variables. Directed arcs
More informationLecture 11: Graphical Models for Inference
Lecture 11: Graphical Models for Inference So far we have seen two graphical models that are used for inference  the Bayesian network and the Join tree. These two both represent the same joint probability
More informationDecision support software for probabilistic risk assessment using Bayesian networks
Decision support software for probabilistic risk assessment using Bayesian networks Norman Fenton and Martin Neil THIS IS THE AUTHOR S POSTPRINT VERSION OF THE FOLLOWING CITATION (COPYRIGHT IEEE) Fenton,
More information8. KNOWLEDGE BASED SYSTEMS IN MANUFACTURING SIMULATION
 18. KNOWLEDGE BASED SYSTEMS IN MANUFACTURING SIMULATION 8.1 Introduction 8.1.1 Summary introduction The first part of this section gives a brief overview of some of the different uses of expert systems
More informationData Modeling & Analysis Techniques. Probability & Statistics. Manfred Huber 2011 1
Data Modeling & Analysis Techniques Probability & Statistics Manfred Huber 2011 1 Probability and Statistics Probability and statistics are often used interchangeably but are different, related fields
More informationCOMPUTATIONAL METHODS FOR A MATHEMATICAL THEORY OF EVIDENCE
COMPUTATIONAL METHODS FOR A MATHEMATICAL THEORY OF EVIDENCE Jeffrey A. Barnett USC/lnformation Sciences Institute ABSTRACT: Many knowledgebased expert systems employ numerical schemes to represent evidence,
More informationSolving Hybrid Markov Decision Processes
Solving Hybrid Markov Decision Processes Alberto Reyes 1, L. Enrique Sucar +, Eduardo F. Morales + and Pablo H. Ibargüengoytia Instituto de Investigaciones Eléctricas Av. Reforma 113, Palmira Cuernavaca,
More informationA Comparison of System Dynamics (SD) and Discrete Event Simulation (DES) Al Sweetser Overview.
A Comparison of System Dynamics (SD) and Discrete Event Simulation (DES) Al Sweetser Andersen Consultng 1600 K Street, N.W., Washington, DC 200062873 (202) 8628080 (voice), (202) 7854689 (fax) albert.sweetser@ac.com
More informationSOLiD System accuracy with the Exact Call Chemistry module
WHITE PPER 55 Series SOLiD System SOLiD System accuracy with the Exact all hemistry module ONTENTS Principles of Exact all hemistry Introduction Encoding of base sequences with Exact all hemistry Demonstration
More informationBounded Treewidth in Knowledge Representation and Reasoning 1
Bounded Treewidth in Knowledge Representation and Reasoning 1 Reinhard Pichler Institut für Informationssysteme Arbeitsbereich DBAI Technische Universität Wien Luminy, October 2010 1 Joint work with G.
More informationExperiments in Web Page Classification for Semantic Web
Experiments in Web Page Classification for Semantic Web Asad Satti, Nick Cercone, Vlado Kešelj Faculty of Computer Science, Dalhousie University Email: {rashid,nick,vlado}@cs.dal.ca Abstract We address
More informationScheduling Home Health Care with Separating Benders Cuts in Decision Diagrams
Scheduling Home Health Care with Separating Benders Cuts in Decision Diagrams André Ciré University of Toronto John Hooker Carnegie Mellon University INFORMS 2014 Home Health Care Home health care delivery
More informationOutline. Probability. Random Variables. Joint and Marginal Distributions Conditional Distribution Product Rule, Chain Rule, Bayes Rule.
Probability 1 Probability Random Variables Outline Joint and Marginal Distributions Conditional Distribution Product Rule, Chain Rule, Bayes Rule Inference Independence 2 Inference in Ghostbusters A ghost
More informationRunning Head: STRUCTURE INDUCTION IN DIAGNOSTIC REASONING 1. Structure Induction in Diagnostic Causal Reasoning. Development, Berlin, Germany
Running Head: STRUCTURE INDUCTION IN DIAGNOSTIC REASONING 1 Structure Induction in Diagnostic Causal Reasoning Björn Meder 1, Ralf Mayrhofer 2, & Michael R. Waldmann 2 1 Center for Adaptive Behavior and
More informationDoctor of Philosophy in Computer Science
Doctor of Philosophy in Computer Science Background/Rationale The program aims to develop computer scientists who are armed with methods, tools and techniques from both theoretical and systems aspects
More informationExpert Systems. A knowledgebased approach to intelligent systems. Intelligent System. Motivation. Approaches & Ingredients. Terminology.
Motivation Expert Systems A knowledgebased approach to intelligent systems Intelligent System Peter Lucas Department of Information and Knowledge Systems Institute for Computing and Information Sciences
More informationUncertainty and Decisions in Medical Informatics 1
Uncertainty and Decisions in Medical Informatics 1 Peter Szolovits, Ph.D. Laboratory for Computer Science Massachusetts Institute of Technology 545 Technology Square Cambridge, Massachusetts 02139, USA
More informationExtracting Decision Trees from Diagnostic Bayesian Networks to Guide Test Selection
Extracting Decision Trees from Diagnostic Bayesian Networks to Guide Test Selection Scott Wahl, John W. Sheppard Department of Computer Science, Montana State University, Bozeman, MT, 5977, USA wahl@cs.montana.edu
More informationTutorial on variational approximation methods. Tommi S. Jaakkola MIT AI Lab
Tutorial on variational approximation methods Tommi S. Jaakkola MIT AI Lab tommi@ai.mit.edu Tutorial topics A bit of history Examples of variational methods A brief intro to graphical models Variational
More informationCompression algorithm for Bayesian network modeling of binary systems
Compression algorithm for Bayesian network modeling of binary systems I. Tien & A. Der Kiureghian University of California, Berkeley ABSTRACT: A Bayesian network (BN) is a useful tool for analyzing the
More informationChapter 1. NP Completeness I. 1.1. Introduction. By Sariel HarPeled, December 30, 2014 1 Version: 1.05
Chapter 1 NP Completeness I By Sariel HarPeled, December 30, 2014 1 Version: 1.05 "Then you must begin a reading program immediately so that you man understand the crises of our age," Ignatius said solemnly.
More informationPutting MAIDs in Context
Putting MAIs in Context Mark Crowley epartment of Computer Science University of British Columbia Vancouver, BC V6T 1Z4 crowley@cs.ubc.ca Abstract MultiAgent Influence iagrams (MAIs) are a compact modelling
More informationMODELING CLIENT INFORMATION SYSTEMS USING BAYESIAN NETWORKS. Main results of the PhD Thesis. Gábor Vámos
Budapest University of Technology and Economics Faculty of Electrical Engineering and Informatics Department of Control Engineering and Information Technology MODELING CLIENT INFORMATION SYSTEMS USING
More informationRAVEN: A GUI and an Artificial Intelligence Engine in a Dynamic PRA Framework
INL/CON1328360 PREPRINT RAVEN: A GUI and an Artificial Intelligence Engine in a Dynamic PRA Framework ANS Annual Meeting C. Rabiti D. Mandelli A. Alfonsi J. J. Cogliati R. Kinoshita D. Gaston R. Martineau
More informationA Webbased Intelligent Tutoring System for Computer Programming
A Webbased Intelligent Tutoring System for Computer Programming C.J. Butz, S. Hua, R.B. Maguire Department of Computer Science University of Regina Regina, Saskatchewan, Canada S4S 0A2 Email: {butz, huash111,
More informationThe Application of Bayesian Optimization and Classifier Systems in Nurse Scheduling
The Application of Bayesian Optimization and Classifier Systems in Nurse Scheduling Proceedings of the 8th International Conference on Parallel Problem Solving from Nature (PPSN VIII), LNCS 3242, pp 581590,
More informationRepresentation of Electronic Mail Filtering Profiles: A User Study
Representation of Electronic Mail Filtering Profiles: A User Study Michael J. Pazzani Department of Information and Computer Science University of California, Irvine Irvine, CA 92697 +1 949 824 5888 pazzani@ics.uci.edu
More informationProbability, Conditional Independence
Probability, Conditional Independence June 19, 2012 Probability, Conditional Independence Probability Sample space Ω of events Each event ω Ω has an associated measure Probability of the event P(ω) Axioms
More informationChapter 7.30 Retrieving Medical Records Using Bayesian Networks
2274 Chapter 7.30 Retrieving Medical Records Using Bayesian Networks Luis M. de Campos Universidad de Granada, Spain Juan M. Fernández Luna Universidad de Granada, Spain Juan F. Huete Universidad de Granada,
More informationInclude Requirement (R)
Using Inuence Diagrams in Software Change Management Colin J. Burgess Department of Computer Science, University of Bristol, Bristol, BS8 1UB, England Ilesh Dattani, Gordon Hughes and John H.R. May Safety
More informationOptimization of Image Search from Photo Sharing Websites Using Personal Data
Optimization of Image Search from Photo Sharing Websites Using Personal Data Mr. Naeem Naik Walchand Institute of Technology, Solapur, India Abstract The present research aims at optimizing the image search
More informationPart III: Machine Learning. CS 188: Artificial Intelligence. Machine Learning This Set of Slides. Parameter Estimation. Estimation: Smoothing
CS 188: Artificial Intelligence Lecture 20: Dynamic Bayes Nets, Naïve Bayes Pieter Abbeel UC Berkeley Slides adapted from Dan Klein. Part III: Machine Learning Up until now: how to reason in a model and
More informationA Hybrid Anytime Algorithm for the Construction of Causal Models From Sparse Data.
142 A Hybrid Anytime Algorithm for the Construction of Causal Models From Sparse Data. Denver Dash t Department of Physics and Astronomy and Decision Systems Laboratory University of Pittsburgh Pittsburgh,
More information