Warm-up. Probability in the movies
1 Warm-up Probability in the movies
2 Announcements
Upcoming due dates: Wednesday 10/14, 11:59 pm: Homework 5; Friday 10/16, 5:00 pm: Project 2
Probability sessions: continuing for the next couple of weeks, including tomorrow; Wednesdays, 9-10 am in 299 Cory and 6-7 pm in 531 Cory
Contest: results on Thursday!
Midterm: results; regrades due Monday, 11:59 pm
Response to course survey
3 Course Feedback Response
In general, fairly positive.
Deadline pileup, Project (Fri), Written HW (Mon), Midterm (Thu): pushing homework due dates to Wednesdays. Note: edX homeworks due Wednesday before the project due Friday.
Written homework not clear about answer format: we can fix that.
Office hours are crowded: the homework deadline change should help some; we'll try to schedule homework parties (likely in evenings) and look for bigger rooms too (kinda hard sometimes). Please contact us if you feel you are not getting the help you need via the standard mechanisms.
Piazza response rate got mixed reviews: we will certainly work to do better. Keep up the great student responses!
4 Course Feedback Response
Grade clarity: added details to the course policies tab on edX.
Stuart and Pat should wear brighter colors (multiple comments): Stuart and I are going shopping this weekend.
5 Where we are in the course We're done with Part I: Search and Planning! Part II: Probabilistic Reasoning (diagnosis, speech recognition, tracking objects, robot mapping, genetics, error-correcting codes, and lots more!) Part III: Machine Learning
6 CS 188: Artificial Intelligence Probability Instructors: Stuart Russell and Pat Virtue University of California, Berkeley
7 Today Probability: random variables; joint and marginal distributions; conditional distributions; product rule, chain rule, Bayes' rule; inference. All of these concepts were in CS 70, but now you'll be using them as basic elements in a practical technology for reasoning.
8 Uncertainty Wheel of Fortune
9 Uncertainty My flight to Chicago is scheduled to leave SFO at 10:30 am. Let action A_t = leave home t minutes before the flight. Will A_t ensure that I catch the plane? Problems: partial observability (road state, other drivers' plans, etc.); noisy sensors (radio traffic reports, Google Maps); immense complexity of modelling and predicting traffic, the security line, etc.; lack of knowledge of world dynamics (will a tire burst? Will that guy in the security line remember to take out his keys?)
10 Responses to uncertainty Ignore it: map directly from percept stream to actions. Hopeless! Some sort of softening of logical rules (fudge factors), e.g., A_120 leads to CatchPlane with certainty factor 0.95 unless there is a MajorTrafficJam; chaining these fudge factors together quickly breaks down. Oops. Probability: given the available evidence and the choice A_120, I will catch the plane with probability 0.92.
11 Probability Probabilistic assertions summarize effects of ignorance: lack of relevant facts, initial conditions, etc.; and laziness: failure to list all exceptions, compute detailed predictions, etc. Subjective or Bayesian probability: probabilities relate propositions to one's own state of knowledge, e.g., P(CatchPlane | A_120, sunny) = 0.92. We are not claiming a probabilistic tendency in the actual situation (traffic is not like quantum mechanics). Probabilities of propositions change with new evidence: e.g., P(CatchPlane | A_120, sunny, NoDelaysOnBridge) = 0.96
12 Decisions Suppose I believe: P(CatchPlane | A_60, all my evidence) = 0.51; P(CatchPlane | A_120, all my evidence) = 0.97; P(CatchPlane | A_1440, all my evidence) = ... Which action should I choose? Depends on my preferences for, e.g., missing the flight, airport food, etc. Utility theory is used to represent and infer preferences. Decision theory = utility theory + probability theory. Maximize expected utility: a* = argmax_a Σ_s P(s | a) U(s)
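The maximum-expected-utility rule above can be sketched directly. The probabilities P(CatchPlane | A_t) are from the slide (the 0.9999 for A_1440, and all the utility numbers, are made-up illustration values, not part of the lecture):

```python
# Maximum expected utility: a* = argmax_a sum_s P(s|a) U(s).
# Hypothetical utilities: catching the flight is worth 1000, missing it 0,
# and every minute spent leaving early costs 1.
def expected_utility(p_catch, minutes_early):
    u_catch, u_miss, cost_per_minute = 1000.0, 0.0, 1.0
    return p_catch * u_catch + (1 - p_catch) * u_miss - cost_per_minute * minutes_early

# A_t -> P(CatchPlane | A_t); 0.9999 is an assumed value for A_1440.
actions = {60: 0.51, 120: 0.97, 1440: 0.9999}
best = max(actions, key=lambda t: expected_utility(actions[t], t))
print(best)  # 120: leaving 2 hours early maximizes EU under these utilities
```

Under these (arbitrary) utilities, A_1440 loses despite its near-certain catch, because 24 hours of waiting dominates the small gain in probability.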
13 Inference in Ghostbusters A ghost is in the grid somewhere. Sensor readings tell how close a square is to the ghost: on the ghost: red; 1 or 2 away: orange; 3 or 4 away: yellow; 5+ away: green. Sensors are noisy, but we know P(Color(x,y) | DistanceFromGhost(x,y)): P(red | 3), P(orange | 3), P(yellow | 3), P(green | 3)
14 Video of Demo Ghostbusters No probability
15 Basic laws of probability Begin with a set Ω of possible worlds, e.g., the 6 possible rolls of a die, {1, 2, 3, 4, 5, 6}. A probability model assigns a number P(ω) to each world ω, e.g., P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6. These numbers must satisfy 0 ≤ P(ω) ≤ 1 and Σ_ω P(ω) = 1.
16 Basic laws contd. An event is any subset of Ω, e.g., roll < 4 is the set {1,2,3}; roll is odd is the set {1,3,5}. The probability of an event is the sum of probabilities over its worlds: P(A) = Σ_{ω∈A} P(ω), e.g., P(roll < 4) = P(1) + P(2) + P(3) = 1/2. De Finetti (1931): anyone who bets according to probabilities that violate these laws can be forced to lose money on every set of bets.
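The fair-die model and event probabilities from these two slides can be sketched in a few lines (a minimal illustration, using exact fractions to avoid float rounding):

```python
from fractions import Fraction

# A probability model: one number P(w) per possible world w, each in [0, 1],
# summing to 1 -- here, a fair six-sided die.
P = {w: Fraction(1, 6) for w in range(1, 7)}
assert all(0 <= p <= 1 for p in P.values()) and sum(P.values()) == 1

# An event is a subset of the worlds; its probability is the sum over its worlds.
def prob(event):
    return sum(P[w] for w in event)

print(prob({w for w in P if w < 4}))       # P(roll < 4) = 1/2
print(prob({w for w in P if w % 2 == 1}))  # P(roll is odd) = 1/2
```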
17 Random Variables A random variable is some aspect of the world (formally, a deterministic function of ω) about which we may be uncertain: R = Is it raining? Odd = Is the die roll an odd number? T = Is it hot or cold? D = How long will it take to get to the airport? L_Ghost = Where is the ghost? We denote random variables with capital letters. Like variables in a CSP, random variables have domains: Odd in {true, false}, e.g., Odd(1) = true, Odd(6) = false; we often write the event Odd=true as odd and Odd=false as ¬odd. T in {hot, cold}; D in [0, ∞); L in possible locations, maybe {(0,0), (0,1), ...}
18 Probability Distributions Associate a probability with each value; the probabilities sum to 1. Temperature P(T): hot 0.5, cold 0.5. Weather P(W): sun 0.6, rain 0.1, fog 0.3, meteor 0.0
19 Probability Distributions Every probability model automatically fixes a distribution for every random variable: P(T): hot 0.5, cold 0.5; P(W): sun 0.6, rain 0.1, fog 0.3, meteor 0.0. A distribution is a TABLE of probabilities of values; a probability (lower-case value) is a single number, e.g., P(W=rain) = 0.1. Shorthand notation: P(hot) = P(T=hot), P(cold) = P(T=cold), P(rain) = P(W=rain); OK if all domain entries are unique. Special Boolean notation: P(happy) = P(Happy=true), P(¬happy) = P(Happy=false)
20 Joint Distributions A joint distribution over a set of random variables X_1, X_2, ..., X_n specifies a real number for each assignment (or outcome): P(X_1=x_1, X_2=x_2, ..., X_n=x_n), abbreviated P(x_1, x_2, ..., x_n). A joint distribution obeys the same rules as a univariate one: P(x_1, x_2, ..., x_n) ≥ 0 and Σ_{x_1,...,x_n} P(x_1, x_2, ..., x_n) = 1. Example P(T,W): hot sun 0.4, hot rain 0.1, cold sun 0.2, cold rain 0.3
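A joint distribution is just a table keyed by complete assignments. A minimal sketch of the slide's P(T,W), with the two axioms checked:

```python
# The joint distribution P(T, W) from the slide, keyed by assignments (t, w).
P_TW = {('hot', 'sun'): 0.4, ('hot', 'rain'): 0.1,
        ('cold', 'sun'): 0.2, ('cold', 'rain'): 0.3}

# The same rules as for a single variable: nonnegative entries summing to 1.
assert all(p >= 0 for p in P_TW.values())
assert abs(sum(P_TW.values()) - 1.0) < 1e-9

print(P_TW[('hot', 'sun')])  # P(T=hot, W=sun) = 0.4
```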
21 Joint Distributions Conceptual transition from single events to discrete distributions: P(a, b) is a single number; P(A,B) is a table. Example P(A,B): a b 0.1; a ¬b 0.2; ¬a b 0.2; ¬a ¬b 0.5
22 Making possible worlds In many cases we begin with random variables and their domains, and construct possible worlds as assignments of values to all variables (cf. CSPs). E.g., two dice rolls Roll_1 and Roll_2: how many possible worlds? What are their probabilities? Size of the distribution for n variables with domain size d? d^n. For all but the smallest distributions, we cannot write it out by hand!
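Enumerating assignments as possible worlds, and the d^n blow-up, can be seen directly (a small sketch using the two-dice example from the slide):

```python
import itertools

# Possible worlds as assignments of values to all variables: two dice rolls.
domain = range(1, 7)
worlds = list(itertools.product(domain, domain))
print(len(worlds))  # 36 worlds, each with probability 1/36 for fair dice

# In general, n variables with domain size d give d**n worlds:
n, d = 10, 6
print(d ** n)  # 60466176 entries for just ten dice -- hopeless by hand
```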
23 Probabilities of events Recall that the probability of an event is the sum of probabilities of its worlds. So, given a joint distribution over all variables, we can compute any event probability! Probability that it's hot AND sunny? Probability that it's hot? Probability that it's hot OR sunny? Typically, the events we care about are partial assignments, like P(T=hot). P(T,W): hot sun 0.4, hot rain 0.1, cold sun 0.2, cold rain 0.3
24 Quiz: Events Given P(X,Y): true true 0.2; true false 0.3; false true 0.4; false false 0.1. P(X=true, Y=true)? P(X=true)? P(X Y)?
25 Quick Break So you're telling me there's a chance
26 Marginal Distributions Marginal distributions are sub-tables which eliminate variables. Marginalization (summing out): combine collapsed rows by adding: P(X=x_1) = Σ_y P(X=x_1, Y=y). From P(T,W) (hot sun 0.4, hot rain 0.1, cold sun 0.2, cold rain 0.3) we get P(T): hot 0.5, cold 0.5; and P(W): sun 0.6, rain 0.4
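Summing out a variable is a small loop over the joint table. A sketch of marginalization on the slide's P(T,W) (the `axis` argument picks which variable to keep):

```python
from collections import defaultdict

P_TW = {('hot', 'sun'): 0.4, ('hot', 'rain'): 0.1,
        ('cold', 'sun'): 0.2, ('cold', 'rain'): 0.3}

# Marginalize (sum out): P(X=x) = sum_y P(X=x, Y=y).
# `axis` is the position of the variable we keep; all others are summed out.
def marginal(joint, axis):
    out = defaultdict(float)
    for assignment, p in joint.items():
        out[assignment[axis]] += p
    return dict(out)

print(marginal(P_TW, 0))  # P(T): hot 0.5, cold 0.5
print(marginal(P_TW, 1))  # P(W): sun 0.6, rain 0.4 (up to float rounding)
```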
27 Quiz: Marginal Distributions Given P(X,Y): true true 0.2; true false 0.3; false true 0.4; false false 0.1. Compute P(X) using P(x) = Σ_y P(x, y), and P(Y) using P(y) = Σ_x P(x, y).
28 Conditional Probabilities A simple relation between joint and conditional probabilities; in fact, this is taken as the definition of a conditional probability: P(a | b) = P(a, b) / P(b). Example with P(T,W) (hot sun 0.4, hot rain 0.1, cold sun 0.2, cold rain 0.3): P(W=s | T=c) = P(W=s, T=c) / P(T=c) = P(W=s, T=c) / (P(W=s, T=c) + P(W=r, T=c)) = 0.2 / (0.2 + 0.3) = 0.2/0.5 = 0.4
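The definition P(a | b) = P(a, b) / P(b), with P(b) itself computed by summing the joint, can be sketched on the same table:

```python
P_TW = {('hot', 'sun'): 0.4, ('hot', 'rain'): 0.1,
        ('cold', 'sun'): 0.2, ('cold', 'rain'): 0.3}

# P(W=w | T=t) = P(T=t, W=w) / P(T=t), where P(T=t) sums over the joint.
def p_w_given_t(w, t):
    p_t = sum(p for (ti, _), p in P_TW.items() if ti == t)
    return P_TW[(t, w)] / p_t

print(p_w_given_t('sun', 'cold'))   # 0.2 / 0.5 = 0.4
print(p_w_given_t('rain', 'cold'))  # 0.3 / 0.5 = 0.6
```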
29 Conditional Probabilities Conceptual transition from single events to discrete distributions: P(a | b) is a single number; P(A | b) is a table over A; P(A | B) is a family of tables, one per value of B. Example P(A | b): a 0.3, ¬a 0.7. Example P(A | B): a|b 0.3, ¬a|b 0.7, a|¬b 0.4, ¬a|¬b 0.6
30 Quiz: Conditional Probabilities Given P(X,Y): true true 0.2; true false 0.3; false true 0.4; false false 0.1. P(X=true | Y=true)? P(X=false | Y=true)? P(Y | X=true)?
31 Conditional Distributions Conditional distributions are probability distributions over some variables given fixed values of others. From the joint P(T,W) (hot sun 0.4, hot rain 0.1, cold sun 0.2, cold rain 0.3): P(W | T=hot): sun 0.8, rain 0.2; P(W | T=cold): sun 0.4, rain 0.6
32 Normalization Trick P(W=s | T=c) = P(W=s, T=c) / P(T=c) = P(W=s, T=c) / (P(W=s, T=c) + P(W=r, T=c)) = 0.2 / (0.2 + 0.3) = 0.4. Likewise P(W=r | T=c) = P(W=r, T=c) / (P(W=s, T=c) + P(W=r, T=c)) = 0.3 / 0.5 = 0.6. Result P(W | T=c): sun 0.4, rain 0.6
33 Normalization Trick Starting from P(T,W) (hot sun 0.4, hot rain 0.1, cold sun 0.2, cold rain 0.3): SELECT the joint probabilities matching the evidence, giving P(c,W): cold sun 0.2, cold rain 0.3; then NORMALIZE the selection (make it sum to one), giving P(W | T=c): sun 0.4, rain 0.6
34 Normalization Trick The same two steps: SELECT the joint probabilities matching the evidence (P(c,W): cold sun 0.2, cold rain 0.3), then NORMALIZE the selection (make it sum to one) to get P(W | T=c): sun 0.4, rain 0.6. Why does this work? The sum of the selection is P(evidence)! (P(T=cold), here)
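The SELECT-then-NORMALIZE trick is two lines of code, and the normalization constant that falls out is exactly P(evidence):

```python
P_TW = {('hot', 'sun'): 0.4, ('hot', 'rain'): 0.1,
        ('cold', 'sun'): 0.2, ('cold', 'rain'): 0.3}

# SELECT the joint entries consistent with the evidence T=cold ...
selected = {w: p for (t, w), p in P_TW.items() if t == 'cold'}

# ... then NORMALIZE: the sum of the selection is Z = P(T=cold) = 0.5.
z = sum(selected.values())
P_W_given_cold = {w: p / z for w, p in selected.items()}

print(P_W_given_cold)  # sun 0.4, rain 0.6 (up to float rounding)
```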
35 Smaller set of remaining worlds Normalization
36 To Normalize (Dictionary) To bring or restore to a normal condition; here: make all entries sum to ONE. Procedure: Step 1: compute Z = sum over all entries [Note: AIMA uses α = 1/Z]. Step 2: divide every entry by Z. Example 1: sun 0.2, rain 0.3; normalize with Z = 0.5 to get sun 0.4, rain 0.6. Example 2: hot sun 20, hot rain 5, cold sun 10, cold rain 15; normalize with Z = 50 to get hot sun 0.4, hot rain 0.1, cold sun 0.2, cold rain 0.3
37 Probabilistic Inference Probabilistic inference: compute a desired probability from other known probabilities (e.g., a conditional from the joint). We generally compute conditional probabilities, e.g., P(on time | no reported accidents) = 0.90. These represent the agent's beliefs given the evidence. Probabilities change with new evidence: P(on time | no accidents, 5 a.m.) = 0.95; P(on time | no accidents, 5 a.m., raining) = 0.80. Observing new evidence causes beliefs to be updated.
38 Inference by Enumeration General case: evidence variables E_1, ..., E_k = e_1, ..., e_k; query* variable Q; hidden variables H_1, ..., H_r; together these are all the variables X_1, ..., X_n. We want: P(Q | e_1, ..., e_k). (*Works fine with multiple query variables, too.) Step 1: select the entries consistent with the evidence. Step 2: sum out H to get the joint of the query and evidence: P(Q, e_1, ..., e_k) = Σ_{h_1,...,h_r} P(Q, h_1, ..., h_r, e_1, ..., e_k). Step 3: normalize: Z = Σ_q P(q, e_1, ..., e_k); P(Q | e_1, ..., e_k) = (1/Z) P(Q, e_1, ..., e_k)
39 Inference by Enumeration P(W)? Joint P(S,T,W): summer hot sun 0.30; summer hot rain 0.05; summer cold sun 0.10; summer cold rain 0.05; winter hot sun 0.10; winter hot rain 0.05; winter cold sun 0.15; winter cold rain 0.20
40 Inference by Enumeration P(W)? P(W | winter)? (same joint table)
41 Inference by Enumeration P(W)? P(W | winter)? P(W | winter, hot)? (same joint table)
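The three-step procedure (select, sum out, normalize) can be run on the slide's joint P(S,T,W) to answer all three quiz queries. A sketch (the dict-keyed representation and the `query_w` helper are illustration choices, not the lecture's code):

```python
from collections import defaultdict

# The full joint P(S, T, W) from the slide.
P = {('summer', 'hot', 'sun'): 0.30, ('summer', 'hot', 'rain'): 0.05,
     ('summer', 'cold', 'sun'): 0.10, ('summer', 'cold', 'rain'): 0.05,
     ('winter', 'hot', 'sun'): 0.10, ('winter', 'hot', 'rain'): 0.05,
     ('winter', 'cold', 'sun'): 0.15, ('winter', 'cold', 'rain'): 0.20}

# Inference by enumeration for the query variable W:
# 1. select rows matching the evidence, 2. sum out hidden variables, 3. normalize.
def query_w(evidence):
    # evidence maps axis index -> required value, e.g. {0: 'winter'}
    scores = defaultdict(float)
    for row, p in P.items():
        if all(row[i] == v for i, v in evidence.items()):
            scores[row[2]] += p  # sum out everything except W
    z = sum(scores.values())     # Z = P(evidence)
    return {w: p / z for w, p in scores.items()}

print(query_w({}))                       # P(W): sun 0.65, rain 0.35
print(query_w({0: 'winter'}))            # P(W | winter): sun 0.5, rain 0.5
print(query_w({0: 'winter', 1: 'hot'}))  # P(W | winter, hot): sun 2/3, rain 1/3
```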
42 Inference by Enumeration Obvious problems: worst-case time complexity O(d^n); space complexity O(d^n) to store the joint distribution
43 The Product Rule Sometimes we have conditional distributions but want the joint: P(a | b) P(b) = P(a, b), equivalently P(a | b) = P(a, b) / P(b)
44 The Product Rule P(a | b) P(b) = P(a, b). Example: P(D | W) P(W) = P(D, W). Given P(D | W): wet|sun 0.1, dry|sun 0.9, wet|rain 0.7, dry|rain 0.3; and P(W): sun 0.8, rain 0.2; the joint P(D, W) is: wet sun 0.08, dry sun 0.72, wet rain 0.14, dry rain 0.06
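Building the joint from the slide's two tables is a single pointwise product, P(d, w) = P(d | w) P(w):

```python
# The slide's conditional P(D | W) and marginal P(W).
P_D_given_W = {('wet', 'sun'): 0.1, ('dry', 'sun'): 0.9,
               ('wet', 'rain'): 0.7, ('dry', 'rain'): 0.3}
P_W = {'sun': 0.8, 'rain': 0.2}

# Product rule: P(d, w) = P(d | w) * P(w) for every assignment.
P_DW = {(d, w): P_D_given_W[(d, w)] * P_W[w] for (d, w) in P_D_given_W}

print(P_DW[('dry', 'sun')])  # 0.9 * 0.8 = 0.72 (up to float rounding)
```

Note that the result is a proper joint: its entries sum to 1 because each P(· | w) column summed to 1 and P(W) summed to 1.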
45 The Chain Rule More generally, we can always write any joint distribution as an incremental product of conditional distributions: P(x_1, x_2, x_3) = P(x_3 | x_1, x_2) P(x_1, x_2) = P(x_3 | x_1, x_2) P(x_2 | x_1) P(x_1). In general, P(x_1, x_2, ..., x_n) = Π_i P(x_i | x_1, ..., x_{i-1}). Why is this always true?
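The slide's closing question ("Why is this always true?") is answered by applying the product rule repeatedly, peeling off one variable at a time:

```latex
P(x_1,\dots,x_n)
  = P(x_n \mid x_1,\dots,x_{n-1})\, P(x_1,\dots,x_{n-1})
  = P(x_n \mid x_1,\dots,x_{n-1})\, P(x_{n-1} \mid x_1,\dots,x_{n-2})\, P(x_1,\dots,x_{n-2})
  = \dots
  = \prod_{i=1}^{n} P(x_i \mid x_1,\dots,x_{i-1})
```

Each step is just one application of the product rule, so the identity holds for any joint distribution and any ordering of the variables.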
46 Bayes Rule
47 Bayes Rule Write the product rule both ways: P(a | b) P(b) = P(a, b) = P(b | a) P(a). ("That's my rule!") Dividing the left and right expressions, we get: P(a | b) = P(b | a) P(a) / P(b). Why is this at all helpful? It lets us build one conditional from its reverse; often one conditional is tricky but the other one is simple. It describes an update step from prior P(a) to posterior P(a | b). It is the foundation of many systems we'll see later (e.g., ASR, MT). In the running for most important AI equation!
48 Bayes Rule
49 Inference with Bayes Rule Example: diagnostic probability from causal probability: P(cause | effect) = P(effect | cause) P(cause) / P(effect). Example: M = meningitis, S = stiff neck. Example givens: P(s | m) = 0.8, P(s) = 0.01. Then P(m | s) = P(s | m) P(m) / P(s) = (0.8 / 0.01) P(m) = 80 P(m). Note: the posterior probability of meningitis is still very small (80x bigger; why?). Note: you should still get stiff necks checked out! Why?
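The diagnostic computation is one line of Bayes' rule. The prior P(m) is missing from this transcript, so the 0.0001 below is an assumed illustration value; whatever the prior is, the givens P(s | m) = 0.8 and P(s) = 0.01 scale it by the same factor of 80:

```python
# Diagnostic probability from causal probability via Bayes' rule:
# P(m | s) = P(s | m) P(m) / P(s).
p_s_given_m = 0.8     # from the slide
p_s = 0.01            # from the slide
p_m = 0.0001          # ASSUMED prior -- not given in the transcript

p_m_given_s = p_s_given_m * p_m / p_s
print(p_m_given_s)  # 80x the prior, yet still a small posterior
```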
50 Next time Independence Conditional independence Bayes nets
ExCEL Management System (EMS) User Guide v1.0 Updated 7/20/12 Table of Contents Setting Up Activities Add Activity 3 Naming Activities 4-5 General Info Page 6 Edit Schedule 7 Setting the Service Term 8
More informationClass Registration 101
Class Registration 101 This guide will teach you the process for the following: Finding out who your advisor is/obtaining your registration PIN Finding out when your enrollment dates are Checking your
More informationBig Data and Scripting
Big Data and Scripting 1, 2, Big Data and Scripting - abstract/organization contents introduction to Big Data and involved techniques schedule 2 lectures (Mon 1:30 pm, M628 and Thu 10 am F420) 2 tutorials
More informationPolynomial Invariants
Polynomial Invariants Dylan Wilson October 9, 2014 (1) Today we will be interested in the following Question 1.1. What are all the possible polynomials in two variables f(x, y) such that f(x, y) = f(y,
More informationKnowledge-based systems and the need for learning
Knowledge-based systems and the need for learning The implementation of a knowledge-based system can be quite difficult. Furthermore, the process of reasoning with that knowledge can be quite slow. This
More informationcalculating probabilities
4 calculating probabilities Taking Chances What s the probability he s remembered I m allergic to non-precious metals? Life is full of uncertainty. Sometimes it can be impossible to say what will happen
More informationWeek 2: Conditional Probability and Bayes formula
Week 2: Conditional Probability and Bayes formula We ask the following question: suppose we know that a certain event B has occurred. How does this impact the probability of some other A. This question
More informationMath Journal HMH Mega Math. itools Number
Lesson 1.1 Algebra Number Patterns CC.3.OA.9 Identify arithmetic patterns (including patterns in the addition table or multiplication table), and explain them using properties of operations. Identify and
More informationSelf-Improving Supply Chains
Self-Improving Supply Chains Cyrus Hadavi Ph.D. Adexa, Inc. All Rights Reserved January 4, 2016 Self-Improving Supply Chains Imagine a world where supply chain planning systems can mold themselves into
More informationGetting the most from your solar panels
Getting the most from your solar panels Your landlord has installed solar electricity panels on your roof. This guide will show you how the panels work and help you get the most out of the electricity
More informationAn Introduction to Information Theory
An Introduction to Information Theory Carlton Downey November 12, 2013 INTRODUCTION Today s recitation will be an introduction to Information Theory Information theory studies the quantification of Information
More informationLTTC English Grammar Proficiency Test Grade 2
LTTC English Grammar Proficiency Test Grade 2 A. Short Comprehension The candidate is expected to demonstrate the ability to understand the passage (around 50 words) and answer the questions. B. Usage
More informationHis name is Nelson Mandela. He comes from South Africa. He is 86 years old. He was born in 1918. * * * sf Why have you got an umbrella?
English Tips-WH questions What is his name? His name is Nelson Mandela. Where does he come from? He comes from South Africa. How old is he? When was he born? He is 86 years old. He was born in 1918. *
More informationIncorporating Evidence in Bayesian networks with the Select Operator
Incorporating Evidence in Bayesian networks with the Select Operator C.J. Butz and F. Fang Department of Computer Science, University of Regina Regina, Saskatchewan, Canada SAS 0A2 {butz, fang11fa}@cs.uregina.ca
More informationCalifornia Treasures High-Frequency Words Scope and Sequence K-3
California Treasures High-Frequency Words Scope and Sequence K-3 Words were selected using the following established frequency lists: (1) Dolch 220 (2) Fry 100 (3) American Heritage Top 150 Words in English
More informationManagerial Economics Prof. Trupti Mishra S.J.M. School of Management Indian Institute of Technology, Bombay. Lecture - 13 Consumer Behaviour (Contd )
(Refer Slide Time: 00:28) Managerial Economics Prof. Trupti Mishra S.J.M. School of Management Indian Institute of Technology, Bombay Lecture - 13 Consumer Behaviour (Contd ) We will continue our discussion
More informationMATH 10034 Fundamental Mathematics IV
MATH 0034 Fundamental Mathematics IV http://www.math.kent.edu/ebooks/0034/funmath4.pdf Department of Mathematical Sciences Kent State University January 2, 2009 ii Contents To the Instructor v Polynomials.
More informationWhat Have I Learned In This Class?
xxx Lesson 26 Learning Skills Review What Have I Learned In This Class? Overview: The Learning Skills review focuses on what a learner has learned during Learning Skills. More importantly this lesson gives
More informationAIMS Education Foundation
Developed and Published by AIMS Education Foundation This book contains materials developed by the AIMS Education Foundation. AIMS (Activities Integrating Mathematics and Science) began in 1981 with a
More informationPigeonhole Principle Solutions
Pigeonhole Principle Solutions 1. Show that if we take n + 1 numbers from the set {1, 2,..., 2n}, then some pair of numbers will have no factors in common. Solution: Note that consecutive numbers (such
More informationChapter 5 A Survey of Probability Concepts
Chapter 5 A Survey of Probability Concepts True/False 1. Based on a classical approach, the probability of an event is defined as the number of favorable outcomes divided by the total number of possible
More informationDecision Making Under Uncertainty. Professor Peter Cramton Economics 300
Decision Making Under Uncertainty Professor Peter Cramton Economics 300 Uncertainty Consumers and firms are usually uncertain about the payoffs from their choices Example 1: A farmer chooses to cultivate
More informationMULTIPLE CHOICE. Choose the one alternative that best completes the statement or answers the question.
STATISTICS/GRACEY PRACTICE TEST/EXAM 2 MULTIPLE CHOICE. Choose the one alternative that best completes the statement or answers the question. Identify the given random variable as being discrete or continuous.
More informationConditional Probability, Independence and Bayes Theorem Class 3, 18.05, Spring 2014 Jeremy Orloff and Jonathan Bloom
Conditional Probability, Independence and Bayes Theorem Class 3, 18.05, Spring 2014 Jeremy Orloff and Jonathan Bloom 1 Learning Goals 1. Know the definitions of conditional probability and independence
More informationClock Arithmetic and Modular Systems Clock Arithmetic The introduction to Chapter 4 described a mathematical system
CHAPTER Number Theory FIGURE FIGURE FIGURE Plus hours Plus hours Plus hours + = + = + = FIGURE. Clock Arithmetic and Modular Systems Clock Arithmetic The introduction to Chapter described a mathematical
More informationSession 7 Bivariate Data and Analysis
Session 7 Bivariate Data and Analysis Key Terms for This Session Previously Introduced mean standard deviation New in This Session association bivariate analysis contingency table co-variation least squares
More informationLinear Threshold Units
Linear Threshold Units w x hx (... w n x n w We assume that each feature x j and each weight w j is a real number (we will relax this later) We will study three different algorithms for learning linear
More informationIntelligent Agents. Based on An Introduction to MultiAgent Systems and slides by Michael Wooldridge
Intelligent Agents Based on An Introduction to MultiAgent Systems and slides by Michael Wooldridge Denition of an Agent An agent is a computer system capable of autonomous action in some environment, in
More information1 Uncertainty and Preferences
In this chapter, we present the theory of consumer preferences on risky outcomes. The theory is then applied to study the demand for insurance. Consider the following story. John wants to mail a package
More informationSudoku puzzles and how to solve them
Sudoku puzzles and how to solve them Andries E. Brouwer 2006-05-31 1 Sudoku Figure 1: Two puzzles the second one is difficult A Sudoku puzzle (of classical type ) consists of a 9-by-9 matrix partitioned
More information