Bayesian Networks. Mausam (Slides by UW-AI faculty)
1 Bayesian Networks Mausam (Slides by UW-AI faculty)
2 Bayes Nets In general, a joint distribution P over a set of variables (X_1 × ... × X_n) requires exponential space for representation and inference. BNs provide a graphical representation of the conditional independence relations in P: usually quite compact; requires assessment of fewer parameters, and those parameters are quite natural (e.g., causal); efficient (usually) inference: query answering and belief update.
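To make the space saving concrete, here is a small sketch (Python, assuming binary variables and the five-variable alarm network used later in these slides) that compares the number of independent entries in the full joint table with the number of CPT entries the Bayes net needs.

```python
# Sketch: parameter counts for a full joint table vs. a Bayes net's CPTs.
# Assumes all variables are binary; the structure is the alarm network
# used later in these slides (Burglary, Earthquake, Alarm, JohnCalls, MaryCalls).

parents = {
    "Burglary": [],
    "Earthquake": [],
    "Alarm": ["Burglary", "Earthquake"],
    "JohnCalls": ["Alarm"],
    "MaryCalls": ["Alarm"],
}

n = len(parents)
full_joint_entries = 2 ** n - 1                             # independent numbers in the full joint
bn_entries = sum(2 ** len(ps) for ps in parents.values())   # one number per parent configuration

print(full_joint_entries)  # 31
print(bn_entries)          # 10
```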
3 Back at the dentist's. Topology of the network encodes conditional independence assertions: Weather is independent of the other variables; Toothache and Catch are conditionally independent of each other given Cavity.
4 Syntax: a set of nodes, one per random variable; a directed, acyclic graph (a link means "directly influences"); a conditional distribution for each node given its parents: P(X_i | Parents(X_i)). For discrete variables, this is a conditional probability table (CPT): a distribution over X_i for each combination of parent values.
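As one concrete, hypothetical encoding of this syntax, the sketch below stores each node with a parent list and a CPT keyed by parent-value tuples; the class and variable names are illustrative, and the CPT numbers follow the alarm example a few slides later.

```python
# Minimal sketch of a discrete Bayes net node: name, parents, and a CPT.
# The CPT maps a tuple of parent values to P(X = True | parent assignment).

from typing import Dict, List, Tuple

class Node:
    def __init__(self, name: str, parents: List[str], cpt: Dict[Tuple[bool, ...], float]):
        self.name = name
        self.parents = parents        # order matters: keys of `cpt` follow this order
        self.cpt = cpt                # P(X=True | parent assignment)

    def p(self, value: bool, parent_values: Tuple[bool, ...]) -> float:
        p_true = self.cpt[parent_values]
        return p_true if value else 1.0 - p_true

# Example node with two binary parents (CPT numbers as in the alarm example below).
alarm = Node("Alarm", ["Burglary", "Earthquake"],
             {(True, True): 0.95, (True, False): 0.94,
              (False, True): 0.29, (False, False): 0.001})

print(alarm.p(True, (False, True)))   # P(A=t | ¬b, e) = 0.29
print(alarm.p(False, (True, True)))   # P(A=f | b, e) ≈ 0.05
```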
5 Burglars and Earthquakes. You are at a "Done with the AI class" party. Neighbor John calls to say your home alarm has gone off (but neighbor Mary doesn't). Sometimes your alarm is set off by minor earthquakes. Question: Is your home being burglarized? Variables: Burglary, Earthquake, Alarm, JohnCalls, MaryCalls. Network topology reflects "causal" knowledge: a burglar can set the alarm off; an earthquake can set the alarm off; the alarm can cause Mary to call; the alarm can cause John to call.
6 Burglars and Earthquakes. Network: Earthquake and Burglary are parents of Alarm; Alarm is the parent of JohnCalls and MaryCalls. Each root node has a prior, Pr(E=t) and Pr(B=t). CPTs: Pr(A | E,B): e,b: 0.95; e,¬b: 0.29; ¬e,b: 0.94; ¬e,¬b: 0.001. Pr(JC | A): a: 0.9; ¬a: 0.05. Pr(MC | A): a: 0.7; ¬a: 0.01.
7 Earthquake Example (cont'd). If we know Alarm, no other evidence influences our degree of belief in JohnCalls: P(JC | MC,A,E,B) = P(JC | A); also P(MC | JC,A,E,B) = P(MC | A) and P(E | B) = P(E). By the chain rule we have P(JC,MC,A,E,B) = P(JC | MC,A,E,B) P(MC | A,E,B) P(A | E,B) P(E | B) P(B) = P(JC | A) P(MC | A) P(A | B,E) P(E) P(B). The full joint requires only 10 parameters (cf. 2^5 = 32 entries).
8 Earthquake Example (Global Semantics). We just proved P(JC,MC,A,E,B) = P(JC | A) P(MC | A) P(A | B,E) P(E) P(B). In general, the full joint distribution of a Bayes net is defined as P(X_1, X_2, ..., X_n) = ∏_{i=1}^{n} P(X_i | Par(X_i)).
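A minimal sketch of this product rule on the alarm network: the CPT values are the ones from slide 6, while the priors P(B) = 0.001 and P(E) = 0.002 are assumed (they are standard textbook values and are not shown in this transcription).

```python
# Sketch: evaluating one entry of the full joint via the Bayes-net product rule
#   P(X1,...,Xn) = prod_i P(Xi | Par(Xi))
# CPT values follow slide 6; the priors P(B)=0.001 and P(E)=0.002 are ASSUMED
# (standard textbook values, not shown on these slides).

P_B = 0.001
P_E = 0.002
P_A = {(True, True): 0.95, (True, False): 0.29,   # keyed by (e, b)
       (False, True): 0.94, (False, False): 0.001}
P_J = {True: 0.9, False: 0.05}                    # P(J=true | a)
P_M = {True: 0.7, False: 0.01}                    # P(M=true | a)

def joint(j, m, a, e, b):
    """P(J=j, M=m, A=a, E=e, B=b) as a product of local conditionals."""
    def pr(p_true, value):
        return p_true if value else 1.0 - p_true
    return (pr(P_J[a], j) * pr(P_M[a], m) *
            pr(P_A[(e, b)], a) * pr(P_E, e) * pr(P_B, b))

# P(j, m, a, ¬e, ¬b) = 0.9 * 0.7 * 0.001 * 0.998 * 0.999
print(joint(True, True, True, False, False))
```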
9 BNs: Qualitative Structure. The graphical structure of a BN reflects conditional independence among variables. Each variable X is a node in the DAG; edges denote direct probabilistic influence, usually interpreted causally; the parents of X are denoted Par(X). Local semantics: X is conditionally independent of all non-descendants given its parents. A graphical test exists for more general independence (see also: Markov blanket).
10 Given Parents, X is Independent of Non-Descendants
11-15 Examples (figure-only slides): the Earthquake, Burglary, Alarm, JohnCalls, MaryCalls network (a Radio node is added from slide 13 on), used to read conditional independences off the graph.
16 Given its Markov Blanket, X is Independent of All Other Nodes. MB(X) = Par(X) ∪ Children(X) ∪ Par(Children(X))
17-18 For Example (figure-only slides): the same network with Radio, illustrating the Markov blanket of a node.
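A small sketch of the definition above (the helper and its representation are illustrative, not from the slides): it computes MB(X) from a DAG given as parent lists, assuming Radio is a child of Earthquake as in the usual version of this example.

```python
# Sketch: Markov blanket from parent lists, MB(X) = Par(X) ∪ Children(X) ∪ Par(Children(X)).
from typing import Dict, List, Set

def markov_blanket(parents: Dict[str, List[str]], x: str) -> Set[str]:
    children = {node for node, ps in parents.items() if x in ps}
    co_parents = {p for child in children for p in parents[child]}
    return (set(parents[x]) | children | co_parents) - {x}

# Radio is ASSUMED to be a child of Earthquake, as in the usual version of this example.
alarm_net = {
    "Burglary": [], "Earthquake": [], "Radio": ["Earthquake"],
    "Alarm": ["Burglary", "Earthquake"],
    "JohnCalls": ["Alarm"], "MaryCalls": ["Alarm"],
}

print(markov_blanket(alarm_net, "Earthquake"))
# {'Alarm', 'Radio', 'Burglary'} -- children: Alarm, Radio; co-parent: Burglary
```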
19 Bayes Net Construction Example. Suppose we choose the ordering M, J, A, B, E. P(J | M) = P(J)?
20 Example. Suppose we choose the ordering M, J, A, B, E. P(J | M) = P(J)? No. P(A | J, M) = P(A | J)? P(A | M)? P(A)?
21 Example. Suppose we choose the ordering M, J, A, B, E. P(J | M) = P(J)? No. P(A | J, M) = P(A | J)? P(A | J, M) = P(A)? No. P(B | A, J, M) = P(B | A)? P(B | A, J, M) = P(B)?
22 Example. Suppose we choose the ordering M, J, A, B, E. P(J | M) = P(J)? No. P(A | J, M) = P(A | J)? P(A | J, M) = P(A)? No. P(B | A, J, M) = P(B | A)? Yes. P(B | A, J, M) = P(B)? No. P(E | B, A, J, M) = P(E | A)? P(E | B, A, J, M) = P(E | A, B)?
23 Example. Suppose we choose the ordering M, J, A, B, E. P(J | M) = P(J)? No. P(A | J, M) = P(A | J)? P(A | J, M) = P(A)? No. P(B | A, J, M) = P(B | A)? Yes. P(B | A, J, M) = P(B)? No. P(E | B, A, J, M) = P(E | A)? No. P(E | B, A, J, M) = P(E | A, B)? Yes.
24 Example (cont'd). Deciding conditional independence is hard in noncausal directions. (Causal models and conditional independence seem hardwired for humans!) The resulting network is less compact: 1 + 2 + 4 + 2 + 4 = 13 numbers needed.
25 Example: Car Diagnosis
26 Example: Car Insurance
27 Compact Conditionals
28 Compact Conditionals
29 Hybrid (discrete+cont) Networks
30 #1: Continuous Child Variables
31 #2: Discrete Child, Continuous Parents
32 Why probit?
33 Sigmoid Function
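Slides 30-33 are figure and equation slides; as a hedged sketch of what they cover, the snippet below shows two standard ways to define P(child = true | continuous parent x), a probit (Gaussian CDF) and a logistic sigmoid, with purely illustrative threshold and scale parameters.

```python
# Sketch (illustrative parameters): a discrete child with a continuous parent.
# Two common soft-threshold CPDs: probit = Phi((x - mu) / sigma), sigmoid = 1 / (1 + exp(-(x - mu) / s)).
import math

def probit_cpd(x: float, mu: float = 6.0, sigma: float = 1.0) -> float:
    """P(child = true | parent = x) under a probit model (Gaussian CDF)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def sigmoid_cpd(x: float, mu: float = 6.0, s: float = 1.0) -> float:
    """P(child = true | parent = x) under a logistic (sigmoid) model."""
    return 1.0 / (1.0 + math.exp(-(x - mu) / s))

for x in (4.0, 6.0, 8.0):
    print(x, round(probit_cpd(x), 3), round(sigmoid_cpd(x), 3))
```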
34 Inference in BNs. The graphical independence representation yields efficient inference schemes. We generally want to compute marginal probabilities: Pr(Z), or Pr(Z | E) where E is (conjunctive) evidence. Z: query variable(s); E: evidence variable(s); everything else: hidden variables. Computations are organized by network topology.
35 P(B | J=true, M=true). P(b | j,m) ∝ Σ_{e,a} P(b,j,m,e,a)
36 P(B | J=true, M=true). P(b | j,m) ∝ P(b) Σ_e P(e) Σ_a P(a | b,e) P(j | a) P(m | a)
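A runnable sketch of this nested-sum computation (inference by enumeration): CPTs as on slide 6, with the same assumed priors P(B) = 0.001 and P(E) = 0.002 that are not shown in this transcription.

```python
# Sketch: inference by enumeration for P(B | j, m), following the nested-sum form above.
# CPTs as on slide 6; the priors P(B)=0.001 and P(E)=0.002 are ASSUMED textbook values.

P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (False, True): 0.94,   # keyed by (e, b)
       (True, False): 0.29, (False, False): 0.001}
P_J = {True: 0.9, False: 0.05}                    # P(J=true | a)
P_M = {True: 0.7, False: 0.01}                    # P(M=true | a)

def pr(p_true, value):
    return p_true if value else 1.0 - p_true

def unnormalized(b: bool) -> float:
    total = 0.0
    for e in (True, False):
        for a in (True, False):
            total += (pr(P_E, e) * pr(P_A[(e, b)], a) *
                      P_J[a] * P_M[a])            # evidence: j = true, m = true
    return pr(P_B, b) * total

scores = {b: unnormalized(b) for b in (True, False)}
z = sum(scores.values())
print({b: s / z for b, s in scores.items()})      # P(B | j, m) with assumed priors; B=true ≈ 0.284
```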
37 Variable Elimination. P(b | j,m) ∝ P(b) Σ_e P(e) Σ_a P(a | b,e) P(j | a) P(m | a). Repeated computations → dynamic programming.
38 Variable Elimination. A factor is a function mapping assignments of some set of variables to numbers, e.g., f(e,a,n1). CPTs are factors, e.g., P(A | E,B) is a function of A, E, B. VE works by eliminating all variables in turn until there is a factor mentioning only the query variable. To eliminate a variable: join all factors containing that variable (like a DB join), then sum out the influence of the variable in the new factor. This exploits the product form of the joint distribution.
39 Example of VE: P(JC). P(J) = Σ_{M,A,B,E} P(J,M,A,B,E) = Σ_{M,A,B,E} P(J | A) P(M | A) P(B) P(A | B,E) P(E) = Σ_A P(J | A) Σ_M P(M | A) Σ_B P(B) Σ_E P(A | B,E) P(E) = Σ_A P(J | A) Σ_M P(M | A) Σ_B P(B) f1(A,B) = Σ_A P(J | A) Σ_M P(M | A) f2(A) = Σ_A P(J | A) f3(A) = f4(J)
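A minimal factor-based version of this elimination, under the same assumptions as before (slide-6 CPTs plus assumed priors for B and E): factors are tables from assignments to numbers, and eliminating a variable is a pointwise product followed by a sum-out, exactly as slide 38 describes.

```python
# Sketch: factors as {frozenset of (var, value) pairs -> number}; VE = product + sum-out.
from itertools import product as cartesian

def make_factor(vars_, fn):
    """Tabulate fn over all True/False assignments of vars_."""
    return {frozenset(zip(vars_, vals)): fn(dict(zip(vars_, vals)))
            for vals in cartesian([True, False], repeat=len(vars_))}

def factor_vars(f):
    return set(v for row in f for (v, _) in row)

def multiply(f, g):
    vars_ = sorted(factor_vars(f) | factor_vars(g))
    def fn(a):
        def lookup(h):
            return h[frozenset((v, a[v]) for v in factor_vars(h))]
        return lookup(f) * lookup(g)
    return make_factor(vars_, fn)

def sum_out(var, f):
    out = {}
    for row, val in f.items():
        key = frozenset(item for item in row if item[0] != var)
        out[key] = out.get(key, 0.0) + val
    return out

# CPTs from slide 6; the priors for B and E are ASSUMED textbook values (not on these slides).
fB = make_factor(["B"], lambda a: 0.001 if a["B"] else 0.999)
fE = make_factor(["E"], lambda a: 0.002 if a["E"] else 0.998)
pA = {(True, True): 0.95, (True, False): 0.29, (False, True): 0.94, (False, False): 0.001}
fA = make_factor(["A", "E", "B"],
                 lambda a: pA[(a["E"], a["B"])] if a["A"] else 1 - pA[(a["E"], a["B"])])
fJ = make_factor(["J", "A"], lambda a: (0.9 if a["J"] else 0.1) if a["A"] else (0.05 if a["J"] else 0.95))
fM = make_factor(["M", "A"], lambda a: (0.7 if a["M"] else 0.3) if a["A"] else (0.01 if a["M"] else 0.99))

# Eliminate E, B, M, then A, mirroring the slide's derivation of P(J).
f1 = sum_out("E", multiply(fA, fE))   # f1(A, B)
f2 = sum_out("B", multiply(f1, fB))   # f2(A)
f3 = sum_out("M", multiply(fM, f2))   # f3(A)  (summing out M multiplies f2 by 1)
f4 = sum_out("A", multiply(fJ, f3))   # f4(J)
print(f4)                             # with the assumed priors, P(J=true) ≈ 0.052
```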
40 Notes on VE. Each operation is a simple multiplication of factors followed by summing out a variable. Complexity is determined by the size of the largest factor (in our example, 3 variables, not 5): linear in the number of variables, exponential in the largest factor. The elimination ordering greatly impacts factor size; finding an optimal elimination ordering is NP-hard, so we use heuristics or special structure (e.g., polytrees). Practically, inference is much more tractable using structure of this sort.
41 Irrelevant Variables. P(J) = Σ_{M,A,B,E} P(J,M,A,B,E) = Σ_{M,A,B,E} P(J | A) P(B) P(A | B,E) P(E) P(M | A) = Σ_A P(J | A) Σ_B P(B) Σ_E P(A | B,E) P(E) Σ_M P(M | A) = Σ_A P(J | A) Σ_B P(B) Σ_E P(A | B,E) P(E) = Σ_A P(J | A) Σ_B P(B) f1(A,B) = Σ_A P(J | A) f2(A) = f3(J). M is irrelevant to the computation. Thm: Y is irrelevant unless Y ∈ Ancestors(Z ∪ E).
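A hedged sketch of the theorem on this slide (names and representation are illustrative): keep only the query, the evidence, and their ancestors; everything else can be pruned before inference.

```python
# Sketch: relevant variables for a query = Z ∪ E ∪ Ancestors(Z ∪ E); everything else can be pruned.
from typing import Dict, List, Set

def ancestors(parents: Dict[str, List[str]], nodes: Set[str]) -> Set[str]:
    result, frontier = set(), list(nodes)
    while frontier:
        for p in parents[frontier.pop()]:
            if p not in result:
                result.add(p)
                frontier.append(p)
    return result

def relevant(parents: Dict[str, List[str]], query: Set[str], evidence: Set[str]) -> Set[str]:
    keep = query | evidence
    return keep | ancestors(parents, keep)

alarm_net = {"Burglary": [], "Earthquake": [],
             "Alarm": ["Burglary", "Earthquake"],
             "JohnCalls": ["Alarm"], "MaryCalls": ["Alarm"]}

print(relevant(alarm_net, query={"JohnCalls"}, evidence=set()))
# {'JohnCalls', 'Alarm', 'Burglary', 'Earthquake'} -- MaryCalls is irrelevant for P(J)
```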
42 Reducing 3-SAT to Bayes Nets
43 Complexity of Exact Inference. Exact inference is NP-hard (reduce 3-SAT to Bayes net inference). It can count the number of satisfying assignments of a 3-SAT formula, so it is #P-complete. Inference in a tree-structured Bayesian network takes polynomial time (compare with inference in CSPs). Approximate inference: sampling-based techniques.