IEOR 4106: Introduction to Operations Research: Stochastic Models. SOLUTIONS to Homework Assignment 1


Probability Review: Read Chapters 1 and 2 in the textbook, Introduction to Probability Models, by Sheldon Ross. Please do the six problems marked with an asterisk and turn them in next Tuesday. These problems are written out in case you do not yet have the textbook (available in the Columbia Bookstore). Extra problems are provided in case you need extra review. Solutions are provided for all these problems.

CHAPTER 1

*Problem 1.3. A coin is to be tossed until a head appears twice in a row. What is the sample space for this experiment? If the coin is fair, what is the probability that it will be tossed exactly four times?

The sample space is a set with countably infinitely many elements, namely

S = {(e_1, e_2, ..., e_n) : n >= 2},

where each e_i is in {H, T} (i.e., e_i is either heads or tails) and, in addition, e_{n-1} = e_n = H and, for 1 <= i <= n-2 (for n >= 3), if e_i = H, then necessarily e_{i+1} = T. Here are the initial elements of S:

(H, H), (T, H, H), (T, T, H, H), (H, T, H, H), (T, T, T, H, H), (T, H, T, H, H), (H, T, T, H, H), (T, T, T, T, H, H), ...

The coin is tossed exactly four times for the two outcomes (T, T, H, H) and (H, T, H, H). Each has probability (1/2)^4 = 1/16, so the desired probability is 2/16 = 1/8.

*Problem 1.21. Suppose that 5% of men and 0.25% of women are color-blind. A color-blind person is chosen at random. What is the probability of this person being male? (Assume that there are an equal number of males and females.)

Let C be the event that a person chosen at random is color-blind. Let M be the event that a random person is male, and let M^c, the complement of M, be the event that a random person is female. Then

P(M | C) = P(MC)/P(C) = P(C | M)P(M) / [P(C | M)P(M) + P(C | M^c)P(M^c)]
         = (0.05)(0.5) / [(0.05)(0.5) + (0.0025)(0.5)] = 20/21.

Problem 1.22. The calculation becomes easy if we view it in a good way. One might observe that the experiment has the form of tennis once deuce has been reached, or of table tennis (ping pong) after the score 20-20 has been reached, so this is a situation that you may well have encountered previously. Here is a good way to view it: let successive trials consist of successive pairs of points. That is, we consider what happens after 2 plays, after 4 plays, and so on. We do that because we observe that the game can only end after an even number of plays. Moreover, with this definition of a trial, the sequence of games becomes a sequence of Bernoulli trials (a sequence of independent and identically distributed (IID) Bernoulli random variables; see Section 2.2.1). The probability that each player wins one point on any trial is q = 2p(1 - p). A total of 2n points are played if the first n - 1 trials all result in each player winning one point, and the n-th trial results in one of the players winning two points. We apply independence to calculate the desired results. The probability that 2n points are needed is thus

(2p(1 - p))^{n-1} (p^2 + (1 - p)^2),  n >= 1.

The probability that A eventually wins is

P(A wins) = sum_{n>=1} (2p(1 - p))^{n-1} p^2 = p^2 / (1 - 2p(1 - p)),

using the well-known property of geometric series:

sum_{n>=0} x^n = 1/(1 - x)  for 0 < x < 1.

Problem 1.31. Two dice are tossed, one green and one red. What is the conditional probability that the number on the green die is 6, given that the sum on the two dice is 7? Extra part: what is the conditional probability that the number on the green die is 6, given that the sum on the two dice is 10?

Let S be the event that the sum is 7; let G be the event that the number on the green die is 6. Then

P(G | S) = P(GS)/P(S) = P(S | G)P(G)/P(S) = (1/6)(1/6)/(6/36) = 1/6.

Now for the extra part. Let S now be the event that the sum is 10. Then

P(G | S) = P(GS)/P(S) = P(S | G)P(G)/P(S) = (1/6)(1/6)/(3/36) = 1/3.

*Problem 1.43.

Here is another application of Bayes' Theorem, again like the examples in Section 1.6. Let i be the event that coin i is selected. As usual, let H and T denote the events heads and tails. We are told that P(H | i) = i/10; we want to calculate the conditional probability the other way. We get

P(i = 5 | H) = P(H | 5)P(5) / P(H) = P(H | 5)P(5) / sum_{i=1}^{10} P(H | i)P(i)
             = (5/100) / (sum_{i=1}^{10} i/100) = 5 / sum_{i=1}^{10} i = 5/55 = 1/11 = 0.090909...

CHAPTER 2

Problem 2.9. The probability mass function (pmf) gives the jumps in the cumulative distribution function (cdf). Let p(b) be the pmf. Then

p(0) = 1/2,  p(1) = 1/10,  p(2) = 1/5,  p(3) = 1/10,  p(3.5) = 1/10.

Problem 2.20. This is a multinomial distribution. Let T be the number of ordinary television sets purchased; let C be the number of color television sets purchased; and let B be the number of customers browsing. Then

P(T = 2, C = 2, B = 1 | T + C + B = 5) = (5! / (2! 2! 1!)) (0.2)^2 (0.3)^2 (0.5)^1 = 0.054.

This could also be approached by using the binomial distribution twice. First we consider the purchasers of ordinary TVs versus all others; then we divide up the others into purchasers of color TVs and browsers. That leads to the product of two binomial distributions, as follows:

P(T = 2, C = 2, B = 1 | T + C + B = 5) = P(T = 2, T^c = 3 | T + C + B = 5) P(C = 2, B = 1 | C + B = 3),

yielding the same result after calculation.
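The Bayes' rule and multinomial answers above are easy to check numerically. Here is a minimal sketch in Python (the variable names are ours; the rates and counts come from the problem statements):

```python
from math import factorial

# Problem 1.21: P(male | color-blind) via Bayes' rule.  The rates 0.05
# and 0.0025 are from the problem; 0.5 encodes the stated assumption of
# equal numbers of males and females.
p_male = (0.05 * 0.5) / (0.05 * 0.5 + 0.0025 * 0.5)
print(p_male)          # 0.95238... = 20/21

# Problem 1.43: P(coin 5 | heads), with P(H | coin i) = i/10 and each
# of the 10 coins selected with probability 1/10.
num = (5 / 10) * (1 / 10)
den = sum((i / 10) * (1 / 10) for i in range(1, 11))
print(num / den)       # 0.0909... = 1/11

# Problem 2.20: multinomial probability 5!/(2! 2! 1!) (0.2)^2 (0.3)^2 (0.5)
p_multi = (factorial(5) // (factorial(2) * factorial(2) * factorial(1))
           * 0.2 ** 2 * 0.3 ** 2 * 0.5)
print(p_multi)         # 0.054 (up to floating-point rounding)
```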

*Problem 2.32. This problem is about approximating the binomial distribution by the Poisson distribution. Directly, the problem can be solved by the binomial distribution, but the binomial distribution with large n and small p, with np = lambda, is approximately Poisson with mean lambda. Let N be the number of prizes won. Then, first exactly and then approximately,

P(N = k) = b(k; n, p) = b(k; 50, 0.01) = (50! / (k!(50 - k)!)) (0.01)^k (0.99)^{50-k}
         ≈ Poisson(k; lambda = np) = Poisson(k; 0.5) = e^{-0.5} (0.5)^k / k!,

so that

P(N >= 1) = 1 - P(N = 0) ≈ 1 - e^{-0.5} ≈ 0.394,
P(N = 1) ≈ e^{-0.5} (0.5)^1 / 1! ≈ 0.303,
P(N >= 2) = 1 - P(N = 0) - P(N = 1) ≈ 0.394 - 0.303 = 0.091.

Problem 2.33. We need the total probability to be 1. Hence we must solve

c ∫_{-1}^{1} (1 - x^2) dx = 1.

Elementary calculus yields

c (x - x^3/3) |_{-1}^{1} = c(4/3) = 1,

so that c = 3/4. The cdf F is the integral of the pdf f, as on the bottom of page 34, so that

F(y) = ∫_{-1}^{y} f(x) dx = (3/4) ∫_{-1}^{y} (1 - x^2) dx = (3/4)(y - y^3/3 + 2/3)  for -1 <= y <= 1.

Clearly, F(y) = 0 for y < -1 and F(y) = 1 for y > 1.

*Problem 2.34. To be a probability density function, its integral must be 1. So we must have

c ∫_{0}^{2} (4x - 2x^2) dx = c (2x^2 - 2x^3/3) |_{0}^{2} = c(8/3) = 1,

which implies that c = 3/8. Then

P(1/2 < X < 3/2) = (3/8) ∫_{1/2}^{3/2} (4x - 2x^2) dx = 11/16.

Problem 2.39.

EX = 1·p(1) + 2·p(2) + 24·p(24) = 1/2 + 2/3 + 24/6 = 31/6 = 5 1/6.

*Problem 2.43. This problem gives practice in the use of indicator random variables, the X_i here.

(a) X = sum_{i=1}^{n} X_i, so

(b) EX_i = P(X_i = 1) = P(ball i chosen before all black balls) = 1/(m + 1), and hence

EX = n/(m + 1).

The key theoretical fact here is this: the expected value of a sum of random variables is the sum of the expectations, even if the random variables are dependent. There is such dependence in this example. See pages 49-51 in the textbook for additional discussion. In contrast, the variance of a sum of random variables is the sum of the variances only under the condition that the component random variables are independent.

The reason why P(ball i before all black balls) = 1/(m + 1) is the following. Let Y be the first ball drawn that is either ball i or a black ball. We then condition on the color of ball Y:

P(ball i before all black balls)
  = P(ball i before all black balls | Y is red) P(Y is red)      [the first factor is 1]
  + P(ball i before all black balls | Y is black) P(Y is black)  [the first factor is 0]
  = P(Y is red).

This probability is easy to calculate: just before we draw ball Y, all the black balls and ball i are still in the urn. So the probability that ball Y is ball i is 1/(m + 1).

Problem 2.50. Use the definition of the variance.

(i) V ar(cx) E[(cX E[cX]) 2 ] E[c 2 (X EX) 2 ] c 2 E[(X EX) 2 ] c 2 V ar(x) (ii) V ar(c + X) E[(c + X E[c + X]) 2 ] E[(X EX) 2 ] V ar(x). 6