STA 247 Answers for practice problem set #1

Question 1: The random variable X has a range of {0, 1, 2} and the random variable Y has a range of {1, 2}. The joint distribution of X and Y is given by the following table:

    x   y   P(X = x, Y = y)
    0   1        0.2
    0   2        0.1
    1   1        0.0
    1   2        0.2
    2   1        0.3
    2   2        0.2

1. Write down tables for the marginal distributions of X and of Y, i.e. give the values of P(X = x) for all x, and of P(Y = y) for all y.

    x   P(X = x)
    0     0.3
    1     0.2
    2     0.5

    y   P(Y = y)
    1     0.5
    2     0.5

2. Write down a table for the conditional distribution of X given that Y = 2, i.e. give the values of P(X = x | Y = 2) for all x.

    x   P(X = x | Y = 2)
    0         0.2
    1         0.4
    2         0.4

3. Compute E(X) and E(Y).

    E(X) = 0·0.3 + 1·0.2 + 2·0.5 = 1.2
    E(Y) = 1·0.5 + 2·0.5 = 1.5

4. Compute E(XY).

    E(XY) = (0·1)·0.2 + (0·2)·0.1 + (1·1)·0.0 + (1·2)·0.2 + (2·1)·0.3 + (2·2)·0.2 = 1.8

5. Are X and Y independent? Explain why or why not.

No, they are not independent, as can be seen from the fact that P(X = 1, Y = 1) ≠ P(X = 1)P(Y = 1). (The left side is 0, but the right side is 0.2·0.5 = 0.1.)

Note that the fact that E(XY) = E(X)E(Y) does NOT imply that X and Y are independent. E(XY) = E(X)E(Y) is implied by X and Y being independent, but not the other way around.
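
All of these values can be recomputed mechanically from the joint table. Here is a minimal Python sketch that does so:

```python
# Question 1 sanity check: recompute everything from the joint pmf.
joint = {(0, 1): 0.2, (0, 2): 0.1,
         (1, 1): 0.0, (1, 2): 0.2,
         (2, 1): 0.3, (2, 2): 0.2}

# Marginals: sum the joint probabilities over the other variable.
px = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1, 2)}
py = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (1, 2)}

# Conditional distribution of X given Y = 2.
px_given_y2 = {x: joint[(x, 2)] / py[2] for x in (0, 1, 2)}

ex  = sum(x * p for x, p in px.items())              # E(X)  = 1.2
ey  = sum(y * p for y, p in py.items())              # E(Y)  = 1.5
exy = sum(x * y * p for (x, y), p in joint.items())  # E(XY) = 1.8

# E(XY) = E(X)E(Y) holds here (1.8 on both sides), yet X and Y are
# NOT independent: P(X = 1, Y = 1) = 0 while P(X = 1)P(Y = 1) = 0.1.
print(px, py, px_given_y2, ex, ey, exy)
```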

Question 2: You roll one red die and one green die. Define the random variables X and Y as follows:

    X = the number showing on the red die
    Y = the number of dice that show the number six

For example, if the red and green dice show the numbers 6 and 4, then X = 6 and Y = 1. Write down a table showing the joint probability mass function for X and Y, find the marginal distribution for Y, and compute E(Y).

Here is a table showing the joint probability mass function, with the marginal distribution for Y on the right:

            X = 1   X = 2   X = 3   X = 4   X = 5   X = 6   P(Y = y)
    Y = 0    5/36    5/36    5/36    5/36    5/36    0        25/36
    Y = 1    1/36    1/36    1/36    1/36    1/36    5/36     10/36
    Y = 2    0       0       0       0       0       1/36      1/36

The expectation of Y is

    E(Y) = 0·25/36 + 1·10/36 + 2·1/36 = 12/36 = 1/3
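
Since all 36 (red, green) outcomes are equally likely, the table can be verified by brute-force enumeration; a short Python sketch using exact fractions:

```python
from collections import defaultdict
from fractions import Fraction

# Question 2: build the joint pmf of (X, Y) by enumerating all 36
# equally likely (red, green) outcomes.
joint = defaultdict(Fraction)
for red in range(1, 7):
    for green in range(1, 7):
        x = red                        # X: number on the red die
        y = (red == 6) + (green == 6)  # Y: number of dice showing a six
        joint[(x, y)] += Fraction(1, 36)

# Marginal distribution of Y, and E(Y).
py = defaultdict(Fraction)
for (x, y), p in joint.items():
    py[y] += p
ey = sum(y * p for y, p in py.items())

print(dict(py))  # marginal of Y: 25/36, 10/36 (= 5/18), 1/36
print(ey)        # 1/3
```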

Question 3: Suppose you roll two fair, six-sided dice, one of which is red and the other of which is green. Define the following random variables:

    X = the number shown on the red die

    Y = 0 if the two dice show the same number,
        1 if the number on the green die is bigger than the number on the red die,
        2 if the number on the red die is bigger than the number on the green die.

a) Write down a table showing the joint probability mass function for X and Y.

            X = 1   X = 2   X = 3   X = 4   X = 5   X = 6
    Y = 0    1/36    1/36    1/36    1/36    1/36    1/36
    Y = 1    5/36    4/36    3/36    2/36    1/36    0
    Y = 2    0       1/36    2/36    3/36    4/36    5/36

b) Find the marginal probability mass function for Y, and compute its expected value.

    y   P(Y = y)
    0     6/36
    1    15/36
    2    15/36

    E(Y) = 0·(6/36) + 1·(15/36) + 2·(15/36) = 45/36 = 5/4

c) Find the conditional probability mass function for X given Y = 1.

    x                   1      2      3      4      5     6
    P(X = x | Y = 1)   5/15   4/15   3/15   2/15   1/15   0
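
The same enumeration approach verifies this table, the marginal of Y, and the conditional pmf; a minimal Python sketch:

```python
from collections import defaultdict
from fractions import Fraction

# Question 3: enumerate the 36 outcomes to verify the joint table,
# the marginal pmf of Y, and the conditional pmf of X given Y = 1.
joint = defaultdict(Fraction)
for red in range(1, 7):
    for green in range(1, 7):
        if green == red:
            y = 0   # same number on both dice
        elif green > red:
            y = 1   # green is bigger
        else:
            y = 2   # red is bigger
        joint[(red, y)] += Fraction(1, 36)

py = defaultdict(Fraction)
for (x, y), p in joint.items():
    py[y] += p
ey = sum(y * p for y, p in py.items())
print(dict(py))  # marginal of Y: 6/36 (= 1/6), 15/36 (= 5/12), 15/36
print(ey)        # 5/4

# Conditional pmf of X given Y = 1: 5/15, 4/15, 3/15, 2/15, 1/15, 0.
cond = {x: joint[(x, 1)] / py[1] for x in range(1, 7)}
print(cond)
```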

Question 4: You flip a fair coin. If the coin lands heads, you roll a fair six-sided die 100 times. If the coin lands tails, you roll the die 101 times. Let X be 1 if the coin lands heads and 0 if the coin lands tails. Let Y be the total number of times that you roll a 6. Find P(X = 1 | Y = 15).

We can find P(X = 1 | Y = 15) using Bayes' Rule. It's easiest using the odds form:

    P(X = 1 | Y = 15) / P(X = 0 | Y = 15)
        = [P(X = 1) / P(X = 0)] · [P(Y = 15 | X = 1) / P(Y = 15 | X = 0)]

The ratio P(X = 1)/P(X = 0) is one, since heads and tails are equally likely. The conditional probabilities P(Y = 15 | X = 1) and P(Y = 15 | X = 0) are both binomial. Given X = 1, Y has the binomial distribution with n = 100 and p = 1/6, so

    P(Y = 15 | X = 1) = C(100,15) (1/6)^15 (1 − 1/6)^(100−15)

Given X = 0, Y has the binomial distribution with n = 101 and p = 1/6, so

    P(Y = 15 | X = 0) = C(101,15) (1/6)^15 (1 − 1/6)^(101−15)

The odds for X = 1 given Y = 15 are therefore

    P(X = 1 | Y = 15) / P(X = 0 | Y = 15)
        = [C(100,15) (1/6)^15 (5/6)^85] / [C(101,15) (1/6)^15 (5/6)^86]
        = [C(100,15) / C(101,15)] · (6/5)
        = (86/101) · (6/5)
        = 516/505

using C(100,15)/C(101,15) = (101 − 15)/101 = 86/101. This gives

    P(X = 1 | Y = 15) = (516/505) / (1 + 516/505) = 516/1021 = 0.505...

(See the week 3 lecture summary regarding odds.)
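
The odds computation is easy to confirm numerically with the standard library (math.comb requires Python 3.8 or later); a minimal sketch:

```python
from math import comb
from fractions import Fraction

# Question 4: posterior odds that the coin was heads given Y = 15 sixes.
def binom_pmf(n, k, p):
    """P(Y = k) for Y ~ Binomial(n, p), computed exactly with Fractions."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

p = Fraction(1, 6)
likelihood_heads = binom_pmf(100, 15, p)  # 100 rolls if the coin was heads
likelihood_tails = binom_pmf(101, 15, p)  # 101 rolls if the coin was tails

# Prior odds are 1:1, so the posterior odds equal the likelihood ratio.
odds = likelihood_heads / likelihood_tails
posterior = odds / (1 + odds)
print(odds)              # 516/505
print(float(posterior))  # about 0.5054
```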

Question 5: Suppose you randomly pick an integer from 1 to 3, with the three possibilities being equally likely. Call this integer N. You then randomly pick an integer from N to 3, with the 4 − N possibilities being equally likely. Call this second integer M. What is the probability that M will be 3?

By the law of total probability, summing over the three possible values of N:

    P(M = 3) = P(M = 3 | N = 1)P(N = 1) + P(M = 3 | N = 2)P(N = 2) + P(M = 3 | N = 3)P(N = 3)
             = (1/3)(1/3) + (1/2)(1/3) + 1·(1/3)
             = 11/18
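
The same total-probability sum, written out as a short Python check:

```python
from fractions import Fraction

# Question 5: law of total probability over the first pick N.
p_m3 = Fraction(0)
for n in (1, 2, 3):
    p_n = Fraction(1, 3)                # N is uniform on {1, 2, 3}
    p_m3_given_n = Fraction(1, 4 - n)   # M is uniform on {n, ..., 3}
    p_m3 += p_m3_given_n * p_n
print(p_m3)  # 11/18
```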

Question 6: Three computers, A, B, and C, are linked by network connections as shown below:

          a1            b1
    A ---------- B ---------- C
          a2            b2
                        b3

Two network connections, a1 and a2, link computer A and computer B. Three network connections, b1, b2, and b3, link computer B and computer C. Since there is no direct connection from computer A to computer C, messages sent from computer A to computer C must pass through computer B.

When computer A sends a message to computer B, it randomly chooses whether to use connection a1 or connection a2, with probabilities of 2/3 for a1 and 1/3 for a2. When computer B sends a message to computer C, it randomly chooses whether to use connection b1, connection b2, or connection b3, with probabilities of 1/2 for b1, 1/4 for b2, and 1/4 for b3.

The time to send a message through each connection is 1 ms for a1, 2 ms for a2, 1 ms for b1, 2 ms for b2, and 3 ms for b3. When a message is sent from computer A to computer C through computer B, it takes no time for computer B to take the message received from connection a1 or a2 and send it on connection b1, b2, or b3.

Let X be the random variable whose value is the total time (in milliseconds) taken for a message sent from computer A to computer C to arrive.

a) Find P(X = 4).

    P(X = 4) = P(takes path a1, b3 or path a2, b2)
             = P(takes path a1, b3) + P(takes path a2, b2)
             = (2/3)(1/4) + (1/3)(1/4)
             = 1/4

b) Find E(X).

This can be found by finding the probabilities for all values of X, similarly to what is done above for part (a), and then applying the definition of expectation. It can more easily be solved by letting U be the time from A to B and V be the time from B to C, so that X = U + V. Then E(X) = E(U) + E(V). We can compute that E(U) = 1·(2/3) + 2·(1/3) = 4/3 and E(V) = 1·(1/2) + 2·(1/4) + 3·(1/4) = 7/4. So E(X) = 4/3 + 7/4 = 37/12.

c) Suppose that a message sent from computer A to computer C takes 4 ms or more to arrive (i.e., X ≥ 4). How likely is it that this message was sent from computer B to computer C using connection b3?

X ≥ 4 when the path taken is a1, b3 or a2, b2 or a2, b3. Of these, the paths a1, b3 and a2, b3 involve connection b3. The conditional probability of b3 having been used given that X ≥ 4 is therefore

    P(path is a1, b3 or a2, b3) / P(path is a1, b3 or a2, b2 or a2, b3)
        = [(2/3)(1/4) + (1/3)(1/4)] / [(2/3)(1/4) + (1/3)(1/4) + (1/3)(1/4)]
        = (1/4) / (1/3)
        = 3/4
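
All three parts can be verified by enumerating the six possible (A-to-B, B-to-C) link combinations; a minimal Python sketch:

```python
from fractions import Fraction
from itertools import product

# Question 6: each link has a (probability, transit time in ms) pair.
a_links = {"a1": (Fraction(2, 3), 1), "a2": (Fraction(1, 3), 2)}
b_links = {"b1": (Fraction(1, 2), 1), "b2": (Fraction(1, 4), 2),
           "b3": (Fraction(1, 4), 3)}

# Joint distribution over the six paths: link choices are independent,
# so path probability is the product; total time is the sum.
paths = {(a, b): (pa * pb, ta + tb)
         for (a, (pa, ta)), (b, (pb, tb)) in product(a_links.items(),
                                                     b_links.items())}

p_x4 = sum(p for p, t in paths.values() if t == 4)
ex = sum(p * t for p, t in paths.values())
p_b3_given_ge4 = (sum(p for (a, b), (p, t) in paths.items()
                      if t >= 4 and b == "b3")
                  / sum(p for p, t in paths.values() if t >= 4))

print(p_x4)            # 1/4
print(ex)              # 37/12
print(p_b3_given_ge4)  # 3/4
```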