Recursive Estimation


Raffaello D'Andrea, Spring 2014
Problem Set 1: Bayes' Theorem and Bayesian Tracking
Last updated: March 8, 2015

Notes:

Notation: Unless otherwise noted, x, y, and z denote random variables, f_x(x̄) (or the shorthand f(x̄)) denotes the probability density function of x, and f_{x|y}(x̄|ȳ) (or f(x̄|ȳ)) denotes the conditional probability density function of x conditioned on y. The expected value is denoted by E[·], the variance is denoted by Var[·], and Pr(Z) denotes the probability that the event Z occurs. A normally distributed random variable x with mean µ and variance σ² is denoted by x ∼ N(µ, σ²).

Please report any errors found in this problem set to the teaching assistants (michaemu@ethz.ch or mwm@ethz.ch).

Problem Set

Problem 1
Mr. Jones has devised a gambling system for winning at roulette. When he bets, he bets on red, and places a bet only when the ten previous spins of the roulette have landed on a black number. He reasons that his chance of winning is quite large since the probability of eleven consecutive spins resulting in black is quite small. What do you think of this system?

Problem 2
Consider two boxes, one containing one black and one white marble, the other two black and one white marble. A box is selected at random with equal probability, and a marble is drawn at random with equal probability from the selected box. What is the probability that the marble is black?

Problem 3
In Problem 2, what is the probability that the first box was the one selected, given that the marble is white?

Problem 4
Urn 1 contains two white balls and one black ball, while urn 2 contains one white ball and five black balls. One ball is drawn at random with equal probability from urn 1 and placed in urn 2. A ball is then drawn from urn 2 at random with equal probability. It happens to be white. What is the probability that the transferred ball was white?

Problem 5
Stores A, B and C have 50, 75, and 100 employees, of which 50, 60, and 70 percent, respectively, are women. Resignations are equally likely among all employees, regardless of sex. One employee resigns, and this is a woman. What is the probability that she works in store C?

Problem 6
a) A gambler has in his pocket a fair coin and a two-headed coin. He selects one of the coins at random with equal probability, and when he flips it, it shows heads. What is the probability that it is the fair coin?
b) Suppose that he flips the same coin a second time and again it shows heads. What is now the probability that it is the fair coin?
c) Suppose that he flips the same coin a third time and it shows tails. What is now the probability that it is the fair coin?

Problem 7
Urn 1 has five white and seven black balls. Urn 2 has three white and twelve black balls. An urn is selected at random with equal probability, and a ball is drawn at random with equal probability from that urn. Suppose that a white ball is drawn. What is the probability that the second urn was selected?

Problem 8
An urn contains b black balls and r red balls. One of the balls is drawn at random with equal probability, but when it is put back into the urn, c additional balls of the same color are put in with it. Now suppose that we draw another ball at random with equal probability. Show that the probability that the first ball drawn was black, given that the second ball drawn was red, is b/(b + r + c).

Problem 9
Three prisoners are informed by their jailer that one of them has been chosen at random with equal probability to be executed, and the other two are to be freed. Prisoner A asks the jailer to tell him privately which of his fellow prisoners will be set free, claiming that there would be no harm in divulging this information, since he already knows that at least one will go free. The jailer refuses to answer this question, pointing out that if A knew which of his fellows were to be set free, then his own probability of being executed would rise from 1/3 to 1/2, since he would then be one of two prisoners. What do you think of the jailer's reasoning?

Problem 10
Let x and y be independent random variables. Let g(·) and h(·) be arbitrary functions of x and y, respectively. Define the random variables v = g(x) and w = h(y). Prove that v and w are independent; that is, functions of independent random variables are independent.

Problem 11
Let x be a continuous, uniformly distributed random variable with x ∈ X = [−5, 5]. Let

    z1 = x + n1,    z2 = x + n2,

where n1 and n2 are continuous random variables with probability density functions

    f_{n1}(n̄1) = α1 (1 + n̄1)    for −1 ≤ n̄1 ≤ 0
                 α1 (1 − n̄1)    for 0 ≤ n̄1 ≤ 1
                 0              otherwise,

    f_{n2}(n̄2) = α2 (1 + n̄2/2)  for −2 ≤ n̄2 ≤ 0
                 α2 (1 − n̄2/2)  for 0 ≤ n̄2 ≤ 2
                 0              otherwise,

where α1 and α2 are normalization constants. Assume that the random variables x, n1, n2 are independent, i.e. f(x, n1, n2) = f(x) f(n1) f(n2).

a) Calculate α1 and α2.
b) Use the change of variables formula from the lecture to show that f_{zi|x}(z̄i|x̄) = f_{ni}(z̄i − x̄).
c) Calculate f(x | z̄1 = 0, z̄2 = 0).
d) Calculate f(x | z̄1 = 0, z̄2 = 1).
e) Calculate f(x | z̄1 = 0, z̄2 = 3).

Problem 12
Consider the following estimation problem: an object B moves randomly on a circle with radius 1. The distance to the object can be measured from a given observation point P. The goal is to estimate the location of the object, see Figure 1.

[Figure 1: Illustration of the estimation problem: the object B on the unit circle at angle θ, the observation point P on the x-axis at distance L from the origin, and the measured distance between them.]

The object B can only move in discrete steps. The object's location at time k is given by x(k) ∈ {0, 1, ..., N−1}, where θ(k) = 2π x(k)/N. The dynamics are

    x(k) = mod(x(k−1) + v(k), N),    k = 1, 2, ...,

where v(k) = 1 with probability p and v(k) = −1 otherwise. Note that mod(N, N) = 0 and mod(−1, N) = N − 1. The distance sensor measures

    z(k) = sqrt((L − cos θ(k))² + (sin θ(k))²) + w(k),

where w(k) represents the sensor error, which is uniformly distributed on [−e, e]. We assume that x(0) is uniformly distributed and that x(0), v(k) and w(k) are independent.

Simulate the object movement and implement a Bayesian tracking algorithm that calculates for each time step k the probability density function f(x(k) | z(1:k)).

a) Test the following settings and discuss the results: N = 100, x(0) = N/4, e = 0.5,
(i) L = 2, p = 0.5,
(ii) L = 2, p = 0.55,
(iii) L = 1, p = 0.55,
(iv) L = 0.1, p = 0.55,
(v) L = 0, p = 0.55.

b) How robust is the algorithm? Set N = 100, x(0) = N/4, e = 0.5, L = 2, p = 0.55 in the simulation, but use slightly different values for p and e in your estimation algorithm, p̂ and ê, respectively. Test the algorithm and explain the result for:
(i) p̂ = 0.45, ê = e,
(ii) p̂ = 0.5, ê = e,
(iii) p̂ = 0.9, ê = e,
(iv) p̂ = p, ê = 0.9,
(v) p̂ = p, ê < e.

Sample solutions

Problem 1
We introduce a discrete random variable x_i representing the outcome of the i-th spin, with x_i ∈ {red, black}. We assume that both outcomes are equally likely and ignore the fact that there is a 0 and a 00 on a roulette board. Assuming independence between spins, we have

    Pr(x_k, x_{k−1}, x_{k−2}, ..., x_1) = Pr(x_k) Pr(x_{k−1}) ··· Pr(x_1).

The probability of eleven consecutive spins resulting in black is

    Pr(x_11 = black, x_10 = black, x_9 = black, ..., x_1 = black) = (1/2)^11 ≈ 0.0005.

This value is indeed quite small. However, given that the previous ten spins were black, we calculate

    Pr(x_11 = black | x_10 = black, x_9 = black, ..., x_1 = black)
    = Pr(x_11 = black)    (by the independence assumption)
    = 1/2
    = Pr(x_11 = red | x_10 = black, x_9 = black, ..., x_1 = black),

i.e. it is equally likely that ten black spins in a row are followed by a red spin as by another black spin (by the independence assumption). Mr. Jones' system is therefore no better than randomly betting on black or red.

Problem 2
We introduce two discrete random variables. Let x ∈ {1, 2} represent which box is chosen (box 1 or 2), with probability f_x(1) = f_x(2) = 1/2. Furthermore, let y ∈ {b, w} represent the color of the drawn marble, where b is a black and w a white marble, with probabilities

    f_{y|x}(b|1) = f_{y|x}(w|1) = 1/2,    f_{y|x}(b|2) = 2/3,    f_{y|x}(w|2) = 1/3.

Then, by the total probability theorem, we find

    f_y(b) = f_{y|x}(b|1) f_x(1) + f_{y|x}(b|2) f_x(2) = 1/2 · 1/2 + 2/3 · 1/2 = 7/12.

Problem 3
By Bayes' theorem,

    f_{x|y}(1|w) = f_{y|x}(w|1) f_x(1) / f_y(w) = (1/2 · 1/2) / (1 − f_y(b)) = (1/4) / (5/12) = 3/5.
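The box-and-marble posteriors of Problems 2 and 3 can be cross-checked by direct enumeration with exact rational arithmetic. The following Python snippet is an illustrative check, not part of the original solutions:

```python
from fractions import Fraction as F

# Problem 2 setup: box 1 holds 1 black + 1 white marble, box 2 holds 2 black + 1 white.
prior = {1: F(1, 2), 2: F(1, 2)}            # equal prior over boxes
like_black = {1: F(1, 2), 2: F(2, 3)}       # f_{y|x}(b|box)

# Total probability theorem: f_y(b) = sum_x f_{y|x}(b|x) f_x(x)
p_black = sum(like_black[x] * prior[x] for x in prior)
assert p_black == F(7, 12)

# Problem 3: Bayes' theorem for f_{x|y}(1|w)
like_white = {x: 1 - like_black[x] for x in prior}
p_white = sum(like_white[x] * prior[x] for x in prior)
post_box1 = like_white[1] * prior[1] / p_white
assert post_box1 == F(3, 5)
```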

Problem 4
Let x ∈ {b, w} represent the color of the ball drawn from urn 1, where f_x(b) = 1/3, f_x(w) = 2/3, and let y ∈ {b, w} be the color of the ball subsequently drawn from urn 2. Considering the different possibilities, we have

    f_{y|x}(b|b) = 6/7,    f_{y|x}(w|b) = 1/7,
    f_{y|x}(b|w) = 5/7,    f_{y|x}(w|w) = 2/7.

We seek the probability that the transferred ball was white, given that the second ball drawn is white, and calculate

    f_{x|y}(w|w) = f_{y|x}(w|w) f_x(w) / f_y(w)
    = f_{y|x}(w|w) f_x(w) / (f_{y|x}(w|b) f_x(b) + f_{y|x}(w|w) f_x(w))
    = (2/7 · 2/3) / (1/7 · 1/3 + 2/7 · 2/3) = 4/5.

Problem 5
We introduce two discrete random variables x ∈ {A, B, C} and y ∈ {M, F}, where x represents the store an employee works in, and y the sex of the employee. From the problem description, we have

    f_x(A) = 50/225 = 2/9,    f_x(B) = 75/225 = 1/3,    f_x(C) = 100/225 = 4/9,

and the probability that an employee is a woman is

    f_{y|x}(F|A) = 1/2,    f_{y|x}(F|B) = 3/5,    f_{y|x}(F|C) = 7/10.

We seek the probability that the resigning employee works in store C, given that it is a woman, and calculate

    f_{x|y}(C|F) = f_{y|x}(F|C) f_x(C) / Σ_{i ∈ {A,B,C}} f_{y|x}(F|i) f_x(i)
    = (7/10 · 4/9) / (1/2 · 2/9 + 3/5 · 1/3 + 7/10 · 4/9) = 1/2.

Problem 6
a) Let x ∈ {F, U} represent whether the coin is fair (F) or unfair (U), with f_x(F) = f_x(U) = 1/2. We introduce y ∈ {h, t} to represent how the toss comes up (heads or tails), with

    f_{y|x}(h|F) = f_{y|x}(t|F) = 1/2,    f_{y|x}(h|U) = 1,    f_{y|x}(t|U) = 0.

We seek the probability that the drawn coin is fair, given that the toss result is heads, and calculate

    f_{x|y}(F|h) = f_{y|x}(h|F) f_x(F) / f_y(h)
    = (1/2 · 1/2) / (1/2 · 1/2 [fair coin] + 1 · 1/2 [unfair coin]) = (1/4) / (3/4) = 1/3.

b) Let y1 represent the result of the first flip and y2 that of the second flip. We assume conditional independence between flips (conditioned on x), yielding

    f_{y1,y2|x}(ȳ1, ȳ2|x̄) = f_{y1|x}(ȳ1|x̄) f_{y2|x}(ȳ2|x̄).

We seek the probability that the drawn coin is fair, given that both tosses resulted in heads, and calculate

    f_{x|y1,y2}(F|h,h) = f_{y1,y2|x}(h,h|F) f_x(F) / f_{y1,y2}(h,h)
    = ((1/2)² · 1/2) / ((1/2)² · 1/2 [fair coin] + 1 · 1/2 [unfair coin]) = (1/8) / (5/8) = 1/5.

c) Obviously, the probability is then 1, because the unfair coin cannot show tails. Formally, we show this by introducing y3 to represent the result of the third flip and computing

    f_{x|y1,y2,y3}(F|h,h,t) = f_{y1,y2,y3|x}(h,h,t|F) f_x(F) / f_{y1,y2,y3}(h,h,t)
    = ((1/2)³ · 1/2) / ((1/2)³ · 1/2 [fair coin] + 0 [unfair coin]) = 1.

Problem 7
We introduce the following two random variables: x ∈ {1, 2} represents which urn is chosen (urn 1 or 2), with probability f_x(1) = f_x(2) = 1/2, and y ∈ {b, w} represents the color of the drawn ball, where b is a black and w a white ball, with probabilities

    f_{y|x}(b|1) = 7/12,    f_{y|x}(w|1) = 5/12,    f_{y|x}(b|2) = 12/15,    f_{y|x}(w|2) = 3/15.

We seek the probability that the second urn was selected, given that the ball drawn is white, and calculate

    f_{x|y}(2|w) = f_{y|x}(w|2) f_x(2) / f_y(w)
    = (3/15 · 1/2) / (5/12 · 1/2 + 3/15 · 1/2) = (1/5) / (5/12 + 1/5) = 12/37.
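The three posteriors of Problem 6 also fall out of a recursive, measurement-by-measurement Bayes update, the pattern used throughout this course. A small Python sketch (illustrative only; this code is not part of the course material):

```python
from fractions import Fraction as F

# Likelihoods for Problem 6: fair coin F, two-headed coin U.
likelihood = {"F": {"h": F(1, 2), "t": F(1, 2)},
              "U": {"h": F(1, 1), "t": F(0, 1)}}

def update(prior, obs):
    """One recursive Bayes step: posterior is proportional to likelihood times prior."""
    unnorm = {c: likelihood[c][obs] * prior[c] for c in prior}
    total = sum(unnorm.values())
    return {c: unnorm[c] / total for c in unnorm}

belief = {"F": F(1, 2), "U": F(1, 2)}   # equal prior over the two coins
for obs, expected in [("h", F(1, 3)), ("h", F(1, 5)), ("t", F(1, 1))]:
    belief = update(belief, obs)
    assert belief["F"] == expected      # matches parts a), b), c)
```

Note that each update reuses the previous posterior as the new prior, which is exactly the conditional independence assumption made in part b).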

Problem 8
Let x1 ∈ {B, R} be the color (black or red) of the first ball drawn, with

    f_{x1}(B) = b/(b + r),    f_{x1}(R) = r/(b + r),

and let x2 ∈ {B, R} be the color of the second ball, with

    f_{x2|x1}(B|B) = (b + c)/(b + r + c),    f_{x2|x1}(B|R) = b/(b + r + c),
    f_{x2|x1}(R|R) = (c + r)/(b + r + c),    f_{x2|x1}(R|B) = r/(b + r + c).

We seek the probability that the first ball drawn was black, given that the second ball drawn is red, and calculate

    f_{x1|x2}(B|R) = f_{x2|x1}(R|B) f_{x1}(B) / (f_{x2|x1}(R|R) f_{x1}(R) [first red] + f_{x2|x1}(R|B) f_{x1}(B) [first black])
    = (r/(b+r+c) · b/(b+r)) / ((c+r)/(b+r+c) · r/(b+r) + r/(b+r+c) · b/(b+r))
    = b / (b + r + c).

Problem 9
We can approach the solution in two ways.

Descriptive solution: The probability that A is to be executed is 1/3, and there is a chance of 2/3 that one of the others was chosen. If the jailer gives away the name of one of the fellow prisoners who will be set free, prisoner A does not get new information about his own fate, but the probability of the remaining prisoner (B or C) being executed is now 2/3. The probability of A being executed is still 1/3.

Bayesian analysis: Let x represent which prisoner is to be executed, where x ∈ {A, B, C}. We assume that it is a random choice, i.e. f_x(A) = f_x(B) = f_x(C) = 1/3. Now let y ∈ {B, C} be the prisoner name given away by the jailer. We can now write the conditional probabilities:

    f_{y|x}(ȳ|x̄) = 0    if x̄ = ȳ (the jailer does not lie)
                   1/2  if x̄ = A (A is to be executed, so the jailer mentions B or C with equal probability)
                   1    if x̄ ≠ A and x̄ ≠ ȳ (the jailer is forced to give the name of the other prisoner, the one to be set free).

You could also do this with a table:

    ȳ  x̄  f_{y|x}(ȳ|x̄)
    B  A  1/2
    B  B  0
    B  C  1
    C  A  1/2
    C  B  1
    C  C  0

To answer the question, we have to compare f_{x|y}(A|ȳ), ȳ ∈ {B, C}, with f_x(A):

    f_{x|y}(A|ȳ) = f_{y|x}(ȳ|A) f_x(A) / f_y(ȳ)
    = f_{y|x}(ȳ|A) · (1/3) / Σ_{k ∈ {A,B,C}} f_{y|x}(ȳ|k) f_x(k)
    = (1/2 · 1/3) / (1/2 [f_{y|x}(ȳ|A)] · 1/3 + 0 [f_{y|x}(ȳ|ȳ)] · 1/3 + 1 [f_{y|x}(ȳ|not ȳ)] · 1/3)
    = 1/3,

where (not ȳ) = C if ȳ = B and (not ȳ) = B if ȳ = C. The value of the posterior probability is the same as the prior one, f_x(A) = 1/3. The jailer is wrong: prisoner A gets no additional information from the jailer about his own fate! See also Wikipedia: Three prisoners problem, Monty Hall problem.

Problem 10
Consider the joint cumulative distribution

    F_{v,w}(v̄, w̄) = Pr((v ≤ v̄) and (w ≤ w̄)) = Pr((g(x) ≤ v̄) and (h(y) ≤ w̄)).

We define the sets A_v̄ and A_w̄ as follows:

    A_v̄ = {x̄ ∈ X̄ : g(x̄) ≤ v̄},    A_w̄ = {ȳ ∈ Ȳ : h(ȳ) ≤ w̄}.

Now we can deduce

    F_{v,w}(v̄, w̄) = Pr((x ∈ A_v̄) and (y ∈ A_w̄))
    = Pr(x ∈ A_v̄) Pr(y ∈ A_w̄)    for all v̄, w̄ (by the independence assumption)
    = Pr(g(x) ≤ v̄) Pr(h(y) ≤ w̄)
    = Pr(v ≤ v̄) Pr(w ≤ w̄)
    = F_v(v̄) F_w(w̄).

Therefore v and w are independent.

Problem 11
a) By definition of a PDF, the integrals of the PDFs f_{ni}(n̄i), i = 1, 2, must evaluate to one, which we use to find the α_i. We integrate the probability density functions, see the following figure:

[Figure: f_{n1} is a triangle with peak α1 at n̄1 = 0 and support [−1, 1]; f_{n2} is a triangle with peak α2 at n̄2 = 0 and support [−2, 2].]

10 For n, we obtain f n ( n )d n α 0 (+ n )d n + ( α + ) 0 α. Therefore α. For n, we get 0 ( f n ( n )d n α + ) n d n + Therefore α. α ( + ) α. ( n )d n 0 ( n ) d n b) The goal is to calculate f zi x(z i x) from z i g(n i,x) : x+n i and the given PDFs f ni (n i ), i,. We apply the change of variables formula for CRVs with conditional PDFs: f zi x(z i x) f n i x(n i x) g n i (n i,x) (just think of x as a parameter that parametrizes the PDFs). The proof for this formula is analogous to the proof in the lecture notes for a change of variables for CRVs with unconditional PDFs. We find that g n i (n i,x), for all n i,x and, therefore, the fraction in the change of variables formula is well-defined for all values of n i and x. Due to the independence of n i and x, f ni x(n i x) f ni (n i ) : f zi x(z i x) f n i x(n i x) g n i (n i,x) f n i (n i ). Substituting n i z i x, we finally obtain f zi x(z i x) f ni (n i ) f ni (z i x). c) We use Bayes theorem to calculate f x z,z (x z,z ) f z,z x(z,z x)f x (x). () f z,z (z,z ) First, we calculate the prior PDF of the CRV x, which is uniformly distributed: { f x (x) 0 for 5 x 5. 0 otherwise In b), we showed by change of variables that f(z i x) f ni (z i x). Since n,n are independent, it follows that z,z are conditionally independent given x. Therefore, we can rewrite the measurement likelihood as f z,z x(z,z x) f z x(z x)f z x(z x). 0

We calculate the individual conditional PDFs f_{zi|x}(z̄i|x̄):

    f_{z1|x}(z̄1|x̄) = f_{n1}(z̄1 − x̄) = α1 (1 − z̄1 + x̄)  for 0 ≤ z̄1 − x̄ ≤ 1
                                       α1 (1 + z̄1 − x̄)  for −1 ≤ z̄1 − x̄ ≤ 0
                                       0                otherwise.

The conditional PDF of z1 given x is illustrated in the following figure:

[Figure: f_{z1|x}(z̄1|x̄) is a triangle in z̄1 with peak α1 at z̄1 = x̄ and support [x̄ − 1, x̄ + 1].]

Analogously,

    f_{z2|x}(z̄2|x̄) = f_{n2}(z̄2 − x̄) = α2 (1 − (z̄2 − x̄)/2)  for 0 ≤ z̄2 − x̄ ≤ 2
                                       α2 (1 + (z̄2 − x̄)/2)  for −2 ≤ z̄2 − x̄ ≤ 0
                                       0                    otherwise.

Let numx be the numerator of the Bayes' rule fraction (1). Given the measurements z̄1 = 0, z̄2 = 0, we find

    numx = (1/10) f_{z1|x}(0|x̄) f_{z2|x}(0|x̄).

We consider four different intervals of x̄: [−5, −1], [−1, 0], [0, 1] and [1, 5]. Evaluating numx on these intervals results in:

    for x̄ ∈ [−5, −1] or x̄ ∈ [1, 5]:  numx = 0,
    for x̄ ∈ [−1, 0]:  numx = (1/10) α1 (1 + x̄) α2 (1 + x̄/2) = (1/20) (1 + x̄)(1 + x̄/2),
    for x̄ ∈ [0, 1]:   numx = (1/10) α1 (1 − x̄) α2 (1 − x̄/2) = (1/20) (1 − x̄)(1 − x̄/2).

Finally, we need to calculate the denominator of the Bayes' rule fraction (1), the normalization constant, which can be calculated using the total probability theorem:

    f_{z1,z2}(z̄1, z̄2) = ∫ f_{z1,z2|x}(z̄1, z̄2|x̄) f_x(x̄) dx̄ = ∫ numx dx̄.

Evaluated at z̄1 = z̄2 = 0, we find

    f_{z1,z2}(0, 0) = (1/20) ∫_{−1}^{0} (1 + x̄)(1 + x̄/2) dx̄ + (1/20) ∫_{0}^{1} (1 − x̄)(1 − x̄/2) dx̄
    = (1/20)(5/12 + 5/12) = 1/24.

Finally, we obtain the posterior PDF

    f_{x|z1,z2}(x̄|0, 0) = numx / f_{z1,z2}(0, 0) = 24 numx
    = 0                           for −5 ≤ x̄ ≤ −1 and 1 ≤ x̄ ≤ 5
      (6/5) (1 + x̄)(1 + x̄/2)     for −1 ≤ x̄ ≤ 0
      (6/5) (1 − x̄)(1 − x̄/2)     for 0 ≤ x̄ ≤ 1,

which is illustrated in the following figure:

[Figure: the posterior PDF f_{x|z1,z2}(x̄|0,0), zero outside [−1, 1], with peak value 6/5 at x̄ = 0.]

The posterior PDF is symmetric about x̄ = 0 with a maximum at x̄ = 0: both sensors agree.

d) Similar to the above. Given z̄1 = 0, z̄2 = 1, we write the numerator as

    numx = (1/10) f_{z1|x}(0|x̄) f_{z2|x}(1|x̄).    (2)

Again, we consider the same four intervals:

    for x̄ ∈ [−5, −1] or x̄ ∈ [1, 5]:  numx = 0,
    for x̄ ∈ [−1, 0]:  numx = (1/10) α1 (1 + x̄) α2 (1/2 + x̄/2) = (1/40) (1 + x̄)²,
    for x̄ ∈ [0, 1]:   numx = (1/10) α1 (1 − x̄) α2 (1/2 + x̄/2) = (1/40) (1 − x̄²).

Normalizing yields

    f_{z1,z2}(0, 1) = ∫ numx dx̄ = (1/40) ∫_{−1}^{0} (1 + x̄)² dx̄ + (1/40) ∫_{0}^{1} (1 − x̄²) dx̄ = (1/40)(1/3 + 2/3) = 1/40.

The solution is therefore

    f_{x|z1,z2}(x̄|0, 1) = 0           for −5 ≤ x̄ ≤ −1 and 1 ≤ x̄ ≤ 5
                          (1 + x̄)²    for −1 ≤ x̄ ≤ 0
                          1 − x̄²      for 0 ≤ x̄ ≤ 1.
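The closed-form posteriors in c) and d) can be cross-checked numerically by discretizing Bayes' rule on a grid. The following Python sketch (an illustrative check, not part of the original solutions) verifies part c):

```python
import numpy as np

# Triangular noise densities from Problem 11 (with alpha1 = 1, alpha2 = 1/2).
def f_n1(n):
    return np.clip(1.0 - np.abs(n), 0.0, None)

def f_n2(n):
    return 0.5 * np.clip(1.0 - np.abs(n) / 2.0, 0.0, None)

x = np.linspace(-5.0, 5.0, 100001)   # grid over the prior support [-5, 5]
dx = x[1] - x[0]
prior = np.full_like(x, 0.1)         # uniform prior, f_x = 1/10

# Part c): measurements z1 = 0, z2 = 0; likelihoods via f_{zi|x}(zi|x) = f_ni(zi - x)
num = f_n1(0.0 - x) * f_n2(0.0 - x) * prior
post = num / (num.sum() * dx)        # normalize via the total probability theorem

peak = post.max()                    # the closed form gives peak 6/5 at x = 0
assert abs(peak - 1.2) < 1e-3
assert post[x <= -1.0 - dx].max() == 0.0   # zero outside [-1, 1], as derived
```

Replacing the second likelihood by f_n2(1.0 - x) reproduces the asymmetric posterior of part d) in the same way.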

The posterior PDF is depicted in the following figure:

[Figure: the posterior PDF f_{x|z1,z2}(x̄|0,1), rising as (1 + x̄)² on [−1, 0], falling as 1 − x̄² on [0, 1], and zero elsewhere.]

The probability values are higher for positive x̄ values because of the measurement z̄2 = 1.

e) We start in the same fashion: given z̄1 = 0 and z̄2 = 3,

    numx = (1/10) f_{z1|x}(0|x̄) f_{z2|x}(3|x̄).

However, the intervals of positive probability of f_{z1|x}(0|x̄) and f_{z2|x}(3|x̄) do not overlap (the first is [−1, 1], the second [1, 5]), i.e. numx = 0 for all x̄ ∈ [−5, 5]. In other words, given our noise model for n1 and n2, there is no chance of measuring z̄1 = 0 and z̄2 = 3. Therefore, f_{x|z1,z2}(x̄|0, 3) is not defined.

Problem 12
The Matlab code is available on the class webpage. We notice the following:

a) (i) The PDF remains bimodal for all times. The symmetry in measurements and motion makes it impossible to differentiate between the upper and lower half circle.
(ii) The PDF is initially bimodal, but the bias in the particle motion causes one of the two modes to have higher values after a few time steps.
(iii) We note that this sensor placement also works. Note that the resulting PDF is not as sharp because more positions explain the measurements.
(iv) The resulting PDF is uniform for all k = 1, 2, ... With the sensor in the center, the distance measurement provides no information on the particle position (all particle positions have the same distance to the sensor).

b) (i) The PDF has higher values at the position mirrored from the actual position, because the estimator model uses a value p̂ that biases the particle motion in the opposite direction.
(ii) The PDF remains bimodal, even though the particle motion is biased in one direction. This is caused by the estimator assuming that the particle motion is unbiased, which makes both halves of the circle equally likely.
(iii) The incorrect assumption on the motion probability p̂ causes the estimated PDF to drift away from the real particle position until it is forced back by measurements that cannot be explained otherwise.
(iv) The resulting PDF is not as sharp because the larger value of ê implies that more states are possible for the given measurements.
(v) The Bayesian tracking algorithm fails after a few steps (division by zero in the measurement update step). This is caused by a measurement that is impossible to explain with the current particle position PDF and the measurement error distribution defined by ê. Note that ê underestimates the possible measurement error: this leads to measurements whose true error (uniformly distributed according to e) is larger than the algorithm assumes possible.
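The class's Matlab implementation is not reproduced here, but the tracking algorithm itself is a standard discrete Bayes filter: a prior update with the motion model, followed by a measurement update and normalization. The following Python sketch illustrates the structure with the nominal parameters of part b) (N = 100, p = 0.55, e = 0.5; L = 2 assumed here); it is an independent reimplementation, not the class code:

```python
import numpy as np

rng = np.random.default_rng(1)
N, L, p, e, steps = 100, 2.0, 0.55, 0.5, 100

def dist(x):
    """Noise-free distance from the sensor at (L, 0) to position index x."""
    th = 2 * np.pi * x / N
    return np.sqrt((L - np.cos(th)) ** 2 + np.sin(th) ** 2)

x_true = N // 4
belief = np.full(N, 1.0 / N)            # uniform prior f(x(0))

for k in range(1, steps + 1):
    # --- simulate the object and the measurement ---
    x_true = (x_true + (1 if rng.random() < p else -1)) % N
    z = dist(x_true) + rng.uniform(-e, e)

    # --- prior update: x(k) = mod(x(k-1) + v(k), N) ---
    belief = p * np.roll(belief, 1) + (1 - p) * np.roll(belief, -1)

    # --- measurement update: w(k) uniform on [-e, e] ---
    likelihood = (np.abs(z - dist(np.arange(N))) <= e) / (2 * e)
    belief *= likelihood
    belief /= belief.sum()              # normalize; fails if the sum is zero, as in b)(v)
```

With p = 0.5 instead (setting a)(i)), the same loop keeps the posterior bimodal, since positions mirrored about the sensor axis produce identical distances; mismatched p̂ or ê in the filter reproduces the failure modes discussed in part b).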


More information

Probability and Statistics Vocabulary List (Definitions for Middle School Teachers)

Probability and Statistics Vocabulary List (Definitions for Middle School Teachers) Probability and Statistics Vocabulary List (Definitions for Middle School Teachers) B Bar graph a diagram representing the frequency distribution for nominal or discrete data. It consists of a sequence

More information

1 Prior Probability and Posterior Probability

1 Prior Probability and Posterior Probability Math 541: Statistical Theory II Bayesian Approach to Parameter Estimation Lecturer: Songfeng Zheng 1 Prior Probability and Posterior Probability Consider now a problem of statistical inference in which

More information

Summary of Formulas and Concepts. Descriptive Statistics (Ch. 1-4)

Summary of Formulas and Concepts. Descriptive Statistics (Ch. 1-4) Summary of Formulas and Concepts Descriptive Statistics (Ch. 1-4) Definitions Population: The complete set of numerical information on a particular quantity in which an investigator is interested. We assume

More information

Bayes and Naïve Bayes. cs534-machine Learning

Bayes and Naïve Bayes. cs534-machine Learning Bayes and aïve Bayes cs534-machine Learning Bayes Classifier Generative model learns Prediction is made by and where This is often referred to as the Bayes Classifier, because of the use of the Bayes rule

More information

6.041/6.431 Spring 2008 Quiz 2 Wednesday, April 16, 7:30-9:30 PM. SOLUTIONS

6.041/6.431 Spring 2008 Quiz 2 Wednesday, April 16, 7:30-9:30 PM. SOLUTIONS 6.4/6.43 Spring 28 Quiz 2 Wednesday, April 6, 7:3-9:3 PM. SOLUTIONS Name: Recitation Instructor: TA: 6.4/6.43: Question Part Score Out of 3 all 36 2 a 4 b 5 c 5 d 8 e 5 f 6 3 a 4 b 6 c 6 d 6 e 6 Total

More information

Normal distribution. ) 2 /2σ. 2π σ

Normal distribution. ) 2 /2σ. 2π σ Normal distribution The normal distribution is the most widely known and used of all distributions. Because the normal distribution approximates many natural phenomena so well, it has developed into a

More information

6.042/18.062J Mathematics for Computer Science. Expected Value I

6.042/18.062J Mathematics for Computer Science. Expected Value I 6.42/8.62J Mathematics for Computer Science Srini Devadas and Eric Lehman May 3, 25 Lecture otes Expected Value I The expectation or expected value of a random variable is a single number that tells you

More information

Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 10

Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 10 CS 70 Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 10 Introduction to Discrete Probability Probability theory has its origins in gambling analyzing card games, dice,

More information

Nonparametric adaptive age replacement with a one-cycle criterion

Nonparametric adaptive age replacement with a one-cycle criterion Nonparametric adaptive age replacement with a one-cycle criterion P. Coolen-Schrijner, F.P.A. Coolen Department of Mathematical Sciences University of Durham, Durham, DH1 3LE, UK e-mail: Pauline.Schrijner@durham.ac.uk

More information

1 Introductory Comments. 2 Bayesian Probability

1 Introductory Comments. 2 Bayesian Probability Introductory Comments First, I would like to point out that I got this material from two sources: The first was a page from Paul Graham s website at www.paulgraham.com/ffb.html, and the second was a paper

More information

Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 18. A Brief Introduction to Continuous Probability

Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 18. A Brief Introduction to Continuous Probability CS 7 Discrete Mathematics and Probability Theory Fall 29 Satish Rao, David Tse Note 8 A Brief Introduction to Continuous Probability Up to now we have focused exclusively on discrete probability spaces

More information

arxiv:1112.0829v1 [math.pr] 5 Dec 2011

arxiv:1112.0829v1 [math.pr] 5 Dec 2011 How Not to Win a Million Dollars: A Counterexample to a Conjecture of L. Breiman Thomas P. Hayes arxiv:1112.0829v1 [math.pr] 5 Dec 2011 Abstract Consider a gambling game in which we are allowed to repeatedly

More information

THE CENTRAL LIMIT THEOREM TORONTO

THE CENTRAL LIMIT THEOREM TORONTO THE CENTRAL LIMIT THEOREM DANIEL RÜDT UNIVERSITY OF TORONTO MARCH, 2010 Contents 1 Introduction 1 2 Mathematical Background 3 3 The Central Limit Theorem 4 4 Examples 4 4.1 Roulette......................................

More information

Introduction to. Hypothesis Testing CHAPTER LEARNING OBJECTIVES. 1 Identify the four steps of hypothesis testing.

Introduction to. Hypothesis Testing CHAPTER LEARNING OBJECTIVES. 1 Identify the four steps of hypothesis testing. Introduction to Hypothesis Testing CHAPTER 8 LEARNING OBJECTIVES After reading this chapter, you should be able to: 1 Identify the four steps of hypothesis testing. 2 Define null hypothesis, alternative

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES Contents 1. Random variables and measurable functions 2. Cumulative distribution functions 3. Discrete

More information

The sample space for a pair of die rolls is the set. The sample space for a random number between 0 and 1 is the interval [0, 1].

The sample space for a pair of die rolls is the set. The sample space for a random number between 0 and 1 is the interval [0, 1]. Probability Theory Probability Spaces and Events Consider a random experiment with several possible outcomes. For example, we might roll a pair of dice, flip a coin three times, or choose a random real

More information

Lecture 3: Linear methods for classification

Lecture 3: Linear methods for classification Lecture 3: Linear methods for classification Rafael A. Irizarry and Hector Corrada Bravo February, 2010 Today we describe four specific algorithms useful for classification problems: linear regression,

More information

Measurements of central tendency express whether the numbers tend to be high or low. The most common of these are:

Measurements of central tendency express whether the numbers tend to be high or low. The most common of these are: A PRIMER IN PROBABILITY This handout is intended to refresh you on the elements of probability and statistics that are relevant for econometric analysis. In order to help you prioritize the information

More information

7 CONTINUOUS PROBABILITY DISTRIBUTIONS

7 CONTINUOUS PROBABILITY DISTRIBUTIONS 7 CONTINUOUS PROBABILITY DISTRIBUTIONS Chapter 7 Continuous Probability Distributions Objectives After studying this chapter you should understand the use of continuous probability distributions and the

More information

Chapter 16. Law of averages. Chance. Example 1: rolling two dice Sum of draws. Setting up a. Example 2: American roulette. Summary.

Chapter 16. Law of averages. Chance. Example 1: rolling two dice Sum of draws. Setting up a. Example 2: American roulette. Summary. Overview Box Part V Variability The Averages Box We will look at various chance : Tossing coins, rolling, playing Sampling voters We will use something called s to analyze these. Box s help to translate

More information

Chapter 16: law of averages

Chapter 16: law of averages Chapter 16: law of averages Context................................................................... 2 Law of averages 3 Coin tossing experiment......................................................

More information

Statistics 100A Homework 4 Solutions

Statistics 100A Homework 4 Solutions Chapter 4 Statistics 00A Homework 4 Solutions Ryan Rosario 39. A ball is drawn from an urn containing 3 white and 3 black balls. After the ball is drawn, it is then replaced and another ball is drawn.

More information

Overview of Violations of the Basic Assumptions in the Classical Normal Linear Regression Model

Overview of Violations of the Basic Assumptions in the Classical Normal Linear Regression Model Overview of Violations of the Basic Assumptions in the Classical Normal Linear Regression Model 1 September 004 A. Introduction and assumptions The classical normal linear regression model can be written

More information

Question: What is the probability that a five-card poker hand contains a flush, that is, five cards of the same suit?

Question: What is the probability that a five-card poker hand contains a flush, that is, five cards of the same suit? ECS20 Discrete Mathematics Quarter: Spring 2007 Instructor: John Steinberger Assistant: Sophie Engle (prepared by Sophie Engle) Homework 8 Hints Due Wednesday June 6 th 2007 Section 6.1 #16 What is the

More information

Section 7C: The Law of Large Numbers

Section 7C: The Law of Large Numbers Section 7C: The Law of Large Numbers Example. You flip a coin 00 times. Suppose the coin is fair. How many times would you expect to get heads? tails? One would expect a fair coin to come up heads half

More information

Probability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur

Probability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur Probability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur Module No. #01 Lecture No. #15 Special Distributions-VI Today, I am going to introduce

More information

Lecture Note 1 Set and Probability Theory. MIT 14.30 Spring 2006 Herman Bennett

Lecture Note 1 Set and Probability Theory. MIT 14.30 Spring 2006 Herman Bennett Lecture Note 1 Set and Probability Theory MIT 14.30 Spring 2006 Herman Bennett 1 Set Theory 1.1 Definitions and Theorems 1. Experiment: any action or process whose outcome is subject to uncertainty. 2.

More information

PUTNAM TRAINING POLYNOMIALS. Exercises 1. Find a polynomial with integral coefficients whose zeros include 2 + 5.

PUTNAM TRAINING POLYNOMIALS. Exercises 1. Find a polynomial with integral coefficients whose zeros include 2 + 5. PUTNAM TRAINING POLYNOMIALS (Last updated: November 17, 2015) Remark. This is a list of exercises on polynomials. Miguel A. Lerma Exercises 1. Find a polynomial with integral coefficients whose zeros include

More information

The Kelly Betting System for Favorable Games.

The Kelly Betting System for Favorable Games. The Kelly Betting System for Favorable Games. Thomas Ferguson, Statistics Department, UCLA A Simple Example. Suppose that each day you are offered a gamble with probability 2/3 of winning and probability

More information

CHAPTER 2 Estimating Probabilities

CHAPTER 2 Estimating Probabilities CHAPTER 2 Estimating Probabilities Machine Learning Copyright c 2016. Tom M. Mitchell. All rights reserved. *DRAFT OF January 24, 2016* *PLEASE DO NOT DISTRIBUTE WITHOUT AUTHOR S PERMISSION* This is a

More information

Errata and updates for ASM Exam C/Exam 4 Manual (Sixteenth Edition) sorted by page

Errata and updates for ASM Exam C/Exam 4 Manual (Sixteenth Edition) sorted by page Errata for ASM Exam C/4 Study Manual (Sixteenth Edition) Sorted by Page 1 Errata and updates for ASM Exam C/Exam 4 Manual (Sixteenth Edition) sorted by page Practice exam 1:9, 1:22, 1:29, 9:5, and 10:8

More information

Experimental Uncertainty and Probability

Experimental Uncertainty and Probability 02/04/07 PHY310: Statistical Data Analysis 1 PHY310: Lecture 03 Experimental Uncertainty and Probability Road Map The meaning of experimental uncertainty The fundamental concepts of probability 02/04/07

More information

Example: Find the expected value of the random variable X. X 2 4 6 7 P(X) 0.3 0.2 0.1 0.4

Example: Find the expected value of the random variable X. X 2 4 6 7 P(X) 0.3 0.2 0.1 0.4 MATH 110 Test Three Outline of Test Material EXPECTED VALUE (8.5) Super easy ones (when the PDF is already given to you as a table and all you need to do is multiply down the columns and add across) Example:

More information

Logistic Regression. Jia Li. Department of Statistics The Pennsylvania State University. Logistic Regression

Logistic Regression. Jia Li. Department of Statistics The Pennsylvania State University. Logistic Regression Logistic Regression Department of Statistics The Pennsylvania State University Email: jiali@stat.psu.edu Logistic Regression Preserve linear classification boundaries. By the Bayes rule: Ĝ(x) = arg max

More information

CHAPTER FIVE. 5. Equations of Lines in R 3

CHAPTER FIVE. 5. Equations of Lines in R 3 118 CHAPTER FIVE 5. Equations of Lines in R 3 In this chapter it is going to be very important to distinguish clearly between points and vectors. Frequently in the past the distinction has only been a

More information

M2S1 Lecture Notes. G. A. Young http://www2.imperial.ac.uk/ ayoung

M2S1 Lecture Notes. G. A. Young http://www2.imperial.ac.uk/ ayoung M2S1 Lecture Notes G. A. Young http://www2.imperial.ac.uk/ ayoung September 2011 ii Contents 1 DEFINITIONS, TERMINOLOGY, NOTATION 1 1.1 EVENTS AND THE SAMPLE SPACE......................... 1 1.1.1 OPERATIONS

More information

The overall size of these chance errors is measured by their RMS HALF THE NUMBER OF TOSSES NUMBER OF HEADS MINUS 0 400 800 1200 1600 NUMBER OF TOSSES

The overall size of these chance errors is measured by their RMS HALF THE NUMBER OF TOSSES NUMBER OF HEADS MINUS 0 400 800 1200 1600 NUMBER OF TOSSES INTRODUCTION TO CHANCE VARIABILITY WHAT DOES THE LAW OF AVERAGES SAY? 4 coins were tossed 1600 times each, and the chance error number of heads half the number of tosses was plotted against the number

More information

Factoring Patterns in the Gaussian Plane

Factoring Patterns in the Gaussian Plane Factoring Patterns in the Gaussian Plane Steve Phelps Introduction This paper describes discoveries made at the Park City Mathematics Institute, 00, as well as some proofs. Before the summer I understood

More information

Chapter 6. 1. What is the probability that a card chosen from an ordinary deck of 52 cards is an ace? Ans: 4/52.

Chapter 6. 1. What is the probability that a card chosen from an ordinary deck of 52 cards is an ace? Ans: 4/52. Chapter 6 1. What is the probability that a card chosen from an ordinary deck of 52 cards is an ace? 4/52. 2. What is the probability that a randomly selected integer chosen from the first 100 positive

More information

Statistics 100A Homework 7 Solutions

Statistics 100A Homework 7 Solutions Chapter 6 Statistics A Homework 7 Solutions Ryan Rosario. A television store owner figures that 45 percent of the customers entering his store will purchase an ordinary television set, 5 percent will purchase

More information

V. RANDOM VARIABLES, PROBABILITY DISTRIBUTIONS, EXPECTED VALUE

V. RANDOM VARIABLES, PROBABILITY DISTRIBUTIONS, EXPECTED VALUE V. RANDOM VARIABLES, PROBABILITY DISTRIBUTIONS, EXPETED VALUE A game of chance featured at an amusement park is played as follows: You pay $ to play. A penny and a nickel are flipped. You win $ if either

More information

Solutions: Problems for Chapter 3. Solutions: Problems for Chapter 3

Solutions: Problems for Chapter 3. Solutions: Problems for Chapter 3 Problem A: You are dealt five cards from a standard deck. Are you more likely to be dealt two pairs or three of a kind? experiment: choose 5 cards at random from a standard deck Ω = {5-combinations of

More information

5.1 Identifying the Target Parameter

5.1 Identifying the Target Parameter University of California, Davis Department of Statistics Summer Session II Statistics 13 August 20, 2012 Date of latest update: August 20 Lecture 5: Estimation with Confidence intervals 5.1 Identifying

More information

MAS113 Introduction to Probability and Statistics

MAS113 Introduction to Probability and Statistics MAS113 Introduction to Probability and Statistics 1 Introduction 1.1 Studying probability theory There are (at least) two ways to think about the study of probability theory: 1. Probability theory is a

More information

Probabilities. Probability of a event. From Random Variables to Events. From Random Variables to Events. Probability Theory I

Probabilities. Probability of a event. From Random Variables to Events. From Random Variables to Events. Probability Theory I Victor Adamchi Danny Sleator Great Theoretical Ideas In Computer Science Probability Theory I CS 5-25 Spring 200 Lecture Feb. 6, 200 Carnegie Mellon University We will consider chance experiments with

More information

Ch. 13.3: More about Probability

Ch. 13.3: More about Probability Ch. 13.3: More about Probability Complementary Probabilities Given any event, E, of some sample space, U, of a random experiment, we can always talk about the complement, E, of that event: this is the

More information

Information Theory and Coding Prof. S. N. Merchant Department of Electrical Engineering Indian Institute of Technology, Bombay

Information Theory and Coding Prof. S. N. Merchant Department of Electrical Engineering Indian Institute of Technology, Bombay Information Theory and Coding Prof. S. N. Merchant Department of Electrical Engineering Indian Institute of Technology, Bombay Lecture - 17 Shannon-Fano-Elias Coding and Introduction to Arithmetic Coding

More information

Lecture 8 The Subjective Theory of Betting on Theories

Lecture 8 The Subjective Theory of Betting on Theories Lecture 8 The Subjective Theory of Betting on Theories Patrick Maher Philosophy 517 Spring 2007 Introduction The subjective theory of probability holds that the laws of probability are laws that rational

More information

Bayesian Analysis for the Social Sciences

Bayesian Analysis for the Social Sciences Bayesian Analysis for the Social Sciences Simon Jackman Stanford University http://jackman.stanford.edu/bass November 9, 2012 Simon Jackman (Stanford) Bayesian Analysis for the Social Sciences November

More information

Basic Probability. Probability: The part of Mathematics devoted to quantify uncertainty

Basic Probability. Probability: The part of Mathematics devoted to quantify uncertainty AMS 5 PROBABILITY Basic Probability Probability: The part of Mathematics devoted to quantify uncertainty Frequency Theory Bayesian Theory Game: Playing Backgammon. The chance of getting (6,6) is 1/36.

More information

Chapter 3 RANDOM VARIATE GENERATION

Chapter 3 RANDOM VARIATE GENERATION Chapter 3 RANDOM VARIATE GENERATION In order to do a Monte Carlo simulation either by hand or by computer, techniques must be developed for generating values of random variables having known distributions.

More information

What is Statistics? Lecture 1. Introduction and probability review. Idea of parametric inference

What is Statistics? Lecture 1. Introduction and probability review. Idea of parametric inference 0. 1. Introduction and probability review 1.1. What is Statistics? What is Statistics? Lecture 1. Introduction and probability review There are many definitions: I will use A set of principle and procedures

More information

Basic Probability Theory II

Basic Probability Theory II RECAP Basic Probability heory II Dr. om Ilvento FREC 408 We said the approach to establishing probabilities for events is to Define the experiment List the sample points Assign probabilities to the sample

More information

Pattern matching probabilities and paradoxes A new variation on Penney s coin game

Pattern matching probabilities and paradoxes A new variation on Penney s coin game Osaka Keidai Ronshu, Vol. 63 No. 4 November 2012 Pattern matching probabilities and paradoxes A new variation on Penney s coin game Yutaka Nishiyama Abstract This paper gives an outline of an interesting

More information

Lectures on Stochastic Processes. William G. Faris

Lectures on Stochastic Processes. William G. Faris Lectures on Stochastic Processes William G. Faris November 8, 2001 2 Contents 1 Random walk 7 1.1 Symmetric simple random walk................... 7 1.2 Simple random walk......................... 9 1.3

More information