Tenth Problem Assignment




EECS 401
Due on April 6, 2007

PROBLEM 1 (18 points) Dave is taking a multiple-choice exam. You may assume that the number of questions is infinite. Simultaneously, but independently, his conscious and subconscious faculties are generating answers for him, each in a Poisson manner. (His conscious and subconscious are always working on different questions.)

Average rate at which conscious responses are generated = λ_c responses/min
Average rate at which subconscious responses are generated = λ_s responses/min

Each conscious response is an independent Bernoulli trial with probability p_c of being correct. Similarly, each subconscious response is an independent Bernoulli trial with probability p_s of being correct. Dave responds only once to each question, and you can assume that his time for recording these conscious and subconscious responses is negligible.

(a) Determine p_K(k), the probability mass function for the number of conscious responses Dave makes in an interval of T minutes.

The number of conscious responses is a Poisson random variable with parameter λ = λ_c T. Thus,

    p_K(k) = (λ_c T)^k e^{-λ_c T} / k!,    k = 0, 1, 2, ...

(b) If we pick any question to which Dave has responded, what is the probability that his answer to that question:

(i) Represents a conscious response?

The probability of a conscious response is λ_c / (λ_c + λ_s).

(ii) Represents a subconscious response?

The probability of a subconscious response is λ_s / (λ_c + λ_s).
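The two answers above are easy to sanity-check numerically. The following is a minimal Monte Carlo sketch (not part of the original assignment); the rates λ_c = 0.5, λ_s = 0.3 responses/min and the interval T = 10 are arbitrary illustrative values.

```python
import numpy as np

# Check of parts (a) and (b): the number of conscious responses in T minutes
# is Poisson(lam_c * T), and a randomly picked response is conscious with
# probability lam_c / (lam_c + lam_s).
rng = np.random.default_rng(0)
lam_c, lam_s, T = 0.5, 0.3, 10.0
trials = 100_000

n_conscious = rng.poisson(lam_c * T, size=trials)
n_subconscious = rng.poisson(lam_s * T, size=trials)

print("E[K] empirical:", n_conscious.mean(), " theory:", lam_c * T)
frac = n_conscious.sum() / (n_conscious.sum() + n_subconscious.sum())
print("P(conscious) empirical:", frac, " theory:", lam_c / (lam_c + lam_s))
```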

(c) If we pick an interval of T minutes, what is the probability that in that interval Dave will make exactly r conscious responses and exactly s subconscious responses?

The two processes are independent, so the probability of making r conscious and s subconscious responses in T minutes is

    [(λ_c T)^r e^{-λ_c T} / r!] · [(λ_s T)^s e^{-λ_s T} / s!]

(d) Determine the moment generating function for the probability density function for random variable X, where X is the time from the start of the exam until Dave makes his first conscious response which is preceded by at least one subconscious response.

Let Y_s denote the time until the first subconscious response is generated, and let Y_c denote the time from that first subconscious response until the next conscious response. Then X = Y_s + Y_c, and since Y_s and Y_c are independent exponentials (by the memorylessness of the conscious process),

    M_X(s) = M_{Y_s}(s) · M_{Y_c}(s) = [λ_s / (λ_s - s)] · [λ_c / (λ_c - s)]

(e) Determine the probability mass function for the total number of responses up to and including his third conscious response.

Consider the arrivals of the merged process. Each arrival belongs to the conscious process with probability λ_c / (λ_c + λ_s). Thus, if we only count the arrivals, the arrivals from the conscious process form a Bernoulli process with parameter p = λ_c / (λ_c + λ_s). The number of responses ("trials") up to and including his third conscious response ("success") then has a Pascal distribution of order n = 3, that is,

    p_K(k) = (k-1 choose 2) [λ_c / (λ_c + λ_s)]^3 [λ_s / (λ_c + λ_s)]^{k-3},    k = 3, 4, ...

(f) The papers are to be collected as soon as Dave has completed exactly N responses. Determine:

The responses are generated by a Poisson process with rate λ_c + λ_s, and the correct responses are generated by a Poisson process with rate p_c λ_c + p_s λ_s. Thus each response is correct with probability (p_c λ_c + p_s λ_s) / (λ_c + λ_s), independently of the others.
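The composite-probability claim just stated can be checked with a short simulation. This is a sketch under assumed parameter values (λ_c = 0.5, λ_s = 0.3, p_c = 0.9, p_s = 0.6, and a long interval T); none of these numbers come from the original problem.

```python
import numpy as np

# Check of the claim above: merging the conscious and subconscious streams
# and marking each response correct with probability p_c or p_s makes any
# given response correct with probability (p_c*lam_c + p_s*lam_s)/(lam_c + lam_s).
rng = np.random.default_rng(1)
lam_c, lam_s, p_c, p_s, T = 0.5, 0.3, 0.9, 0.6, 10_000.0

n_c = rng.poisson(lam_c * T)        # conscious responses in [0, T]
n_s = rng.poisson(lam_s * T)        # subconscious responses in [0, T]
correct_c = rng.random(n_c) < p_c   # Bernoulli(p_c) per conscious response
correct_s = rng.random(n_s) < p_s   # Bernoulli(p_s) per subconscious response

est = (correct_c.sum() + correct_s.sum()) / (n_c + n_s)
print("P(correct) empirical:", est, " theory:",
      (p_c * lam_c + p_s * lam_s) / (lam_c + lam_s))
```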

(i) The expected number of questions he will answer correctly.

The expected number of questions answered correctly is the mean of a binomial random variable:

    E[L] = N (p_c λ_c + p_s λ_s) / (λ_c + λ_s)

(ii) The probability mass function for L, the number of questions he answers correctly.

The distribution of L is binomial with parameters N and (p_c λ_c + p_s λ_s) / (λ_c + λ_s), that is,

    p_L(l) = (N choose l) [(p_c λ_c + p_s λ_s) / (λ_c + λ_s)]^l [1 - (p_c λ_c + p_s λ_s) / (λ_c + λ_s)]^{N-l},    l = 0, 1, ..., N

(g) Repeat part (f) for the case in which the exam papers are to be collected at the end of a fixed interval of T minutes.

The number of correct responses in a fixed interval of length T is a Poisson random variable with parameter (p_c λ_c + p_s λ_s)T. So the PMF of the number of correct responses is

    p_L(l) = [(p_c λ_c + p_s λ_s)T]^l e^{-(p_c λ_c + p_s λ_s)T} / l!,    l = 0, 1, 2, ...

with mean (p_c λ_c + p_s λ_s)T.

PROBLEM 2 (16 points) All ships travel at the same speed through a wide canal. Eastbound ship arrivals at the canal are a Poisson process with an average arrival rate λ_E ships per day. Westbound ships arrive as an independent Poisson process with average arrival rate λ_W ships per day. An indicator at a point in the canal is always pointing in the direction of travel of the most recent ship to pass it. Each ship takes t days to traverse the length of the canal.

(a) Given that the pointer is pointing west:

(i) What is the probability that the next ship to pass it will be westbound?

The direction of the next ship does not depend on the previous ships. Therefore, this is just the probability λ_W / (λ_E + λ_W) that the next ship is westbound.

(ii) What is the PDF for the remaining time until the pointer changes direction?

The pointer will change direction on the next arrival of an eastbound ship. By memorylessness, the remaining time until an eastbound ship arrives is an exponential random variable with parameter λ_E, with PDF

    f(x) = λ_E e^{-λ_E x},    x ≥ 0
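Part (a)(i) amounts to a race between two independent exponentials, which a quick sketch can confirm. The rates λ_E = 2.0 and λ_W = 3.0 ships/day below are arbitrary illustrative choices, not values from the problem.

```python
import numpy as np

# Check of Problem 2(a)(i): the next ship is westbound with probability
# lam_W / (lam_E + lam_W), i.e. the westbound exponential "wins the race".
rng = np.random.default_rng(2)
lam_E, lam_W = 2.0, 3.0
n = 200_000

east_next = rng.exponential(1 / lam_E, size=n)  # time to next eastbound ship
west_next = rng.exponential(1 / lam_W, size=n)  # time to next westbound ship

print("P(next ship westbound):", (west_next < east_next).mean(),
      " theory:", lam_W / (lam_E + lam_W))
```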

(b) What is the probability that an eastbound ship will see no westbound ships during its eastward journey through the canal?

Suppose that an eastbound ship enters the canal at time t_0. This ship will meet any westbound ship that entered the canal between times t_0 - t and t_0 + t. Thus, the desired probability is the probability that there are no westbound ship arrivals during an interval of length 2t, and using the Poisson PMF, it is equal to

    e^{-2λ_W t}

(c) We begin observing at an arbitrary time. Let V be the time we have to continue observing until we see the seventh eastbound ship. Determine the PDF for V.

The time until we see the seventh eastbound ship is an Erlang random variable of order 7 with parameter λ_E:

    f_V(v) = λ_E^7 v^6 e^{-λ_E v} / 6!,    v ≥ 0

PROBLEM 3 (19 points) The number of customers N who shop at a supermarket in a day is Poisson with parameter λ. The number of items purchased by any customer is Poisson with parameter µ, and the numbers of items purchased by different customers are independent of each other.

(a) Assume that the number of items purchased by each customer is independent of N. Find E[S] and Var(S), where S is the total number of items sold.

Let X_i be the number of items bought by the i-th customer. Then

    S = Σ_{i=1}^{N} X_i,

which is a sum of a random number of independent random variables. Thus,

    E[S] = E[N] E[X] = λµ

and

    Var(S) = E[N] Var(X) + (E[X])^2 Var(N) = λµ + λµ^2
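The random-sum formulas for E[S] and Var(S) can be verified directly. A minimal sketch follows, with arbitrary illustrative values λ = 4.0 and µ = 2.5; it uses the fact that, given N = n, a sum of n i.i.d. Poisson(µ) variables is Poisson(nµ).

```python
import numpy as np

# Check of Problem 3(a): S is a Poisson(mu) sum over a Poisson(lam) number
# of customers, so E[S] = lam*mu and Var(S) = lam*mu + lam*mu**2.
rng = np.random.default_rng(3)
lam, mu = 4.0, 2.5
days = 200_000

n_customers = rng.poisson(lam, size=days)
# Given N customers, the day's total is Poisson(N * mu).
s = rng.poisson(n_customers * mu)

print("E[S]:", s.mean(), " theory:", lam * mu)
print("Var(S):", s.var(), " theory:", lam * mu + lam * mu**2)
```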

(b) The supermarket has two advertising strategies: one can increase λ by 10% and the other increases µ by 10%. What are the effects of these two strategies on the mean and variance of S? Which is the better strategy?

Either strategy increases the mean λµ by 10%, so the better strategy is the one that keeps the variance lower.

(i) Increase µ by 10%. The variance becomes λ(1.1µ) + λ(1.1µ)^2 = 1.1λµ + 1.21λµ^2.

(ii) Increase λ by 10%. The variance becomes 1.1λµ + 1.1λµ^2.

Comparing 1.1λµ + 1.21λµ^2 with 1.1λµ + 1.1λµ^2, the latter is smaller, so we choose option (ii), i.e., increase λ by 10%.

(c) Because of congestion, when there are more customers around, the amount of time each customer spends in the store tends to be shorter, and hence they will likely buy fewer items. To incorporate this, we can revise the above model so that when there are n customers, µ = c/n, where c is some constant.

(i) Is the number of items bought by a customer independent of N?

No. The rate at which each customer buys items depends on the number of customers in the store, so the number of items bought is dependent on N.

(ii) Find E[S] and Var(S) in this new model.

    E[S] = E[E[S | N]] = E[E[Σ_{i=1}^{N} X_i | N]] = E[N · (c/N)] = c

Further, since conditioned on N the X_i are independent, we have

    Var(S | N = n) = Var(Σ_{i=1}^{n} X_i | N = n) = Σ_{i=1}^{n} Var(X_i | N = n) = n · (c/n) = c

Thus,

    Var(S) = E[Var(S | N)] + Var(E[S | N]) = E[c] + Var(c) = c

Notice that the mean and the variance are the same. One would suspect that S is a Poisson random variable; in fact, this is easy to check by evaluating the transform of S.
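The congestion model also simulates cleanly. The sketch below uses arbitrary illustrative values λ = 4.0 and c = 3.0, and skips days with N = 0 (which the model leaves unspecified, since µ = c/n is undefined there).

```python
import numpy as np

# Check of Problem 3(c): given N = n customers who each buy a Poisson(c/n)
# number of items, the day's total S has mean = variance = c.
rng = np.random.default_rng(4)
lam, c = 4.0, 3.0
days = 50_000

totals = []
for _ in range(days):
    n = rng.poisson(lam)
    if n == 0:
        continue  # model undefined for zero customers; skip such days
    totals.append(rng.poisson(c / n, size=n).sum())

totals = np.array(totals)
print("E[S]:", totals.mean(), " Var(S):", totals.var(), " theory:", c)
```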

PROBLEM 4 (18 points) The Markov chain with the transition probabilities listed below is in state 3 immediately before the first trial.

    p_{1,1} = p_{2,1} = 0.4,  p_{1,2} = p_{2,2} = 0.6,  p_{3,2} = 0.3,  p_{3,3} = 0.2,  p_{3,4} = 0.5,  p_{4,5} = p_{5,6} = p_{6,4} = 1.0

(a) Draw the state-transition diagram for this Markov chain. Indicate which states, if any, are recurrent, transient, and periodic.

[State-transition diagram: states 1 and 2 each move to state 1 with probability 0.4 and to state 2 with probability 0.6; state 3 has a self-loop with probability 0.2 and moves to state 2 with probability 0.3 and to state 4 with probability 0.5; states 4, 5, 6 form the deterministic cycle 4 → 5 → 6 → 4.]

Recurrent states: 1, 2, 4, 5, 6
Transient state: 3
Periodic states: 4, 5, 6

(b) Find the probability that the process is in state 3 after n trials.

If the process leaves state 3, it can never return to it. Thus the probability that the process is in state 3 after n trials is the probability that the process remains in state 3 on every one of the first n trials, which is (0.2)^n.

(c) Find the expected number of trials up to and including the trial on which the process leaves state 3.

Let N be the trial on which the process leaves state 3. From the previous part, we know that N is a geometric random variable with success probability p = 0.8 (given that we are in state 3, we leave with probability 0.8). Thus,

    E[N] = Σ_{n=1}^{∞} n (0.2)^{n-1} (0.8) = 1/0.8 = 5/4

(d) Find the probability that the process never enters state 1.

The process cannot stay in state 3 forever. At some finite time, it will make a transition to either state 2 or state 4. If the process jumps to state 2, it cannot stay in state 2 forever, and at some finite time it will make a transition to state 1. However, if the process makes a transition from state 3 to state 4, it can never return to state 1. Thus the probability of never entering state 1 is the probability of jumping from state 3 to state 4 (rather than to state 2):

    0.5 / (0.3 + 0.5) = 5/8
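Parts (b)-(d) can be confirmed by simulating the chain with the transition probabilities as reconstructed above; the sketch below is a verification aid, not part of the original solutions.

```python
import numpy as np

# Check of Problem 4(c) and 4(d): E[N] = 5/4 and P(never enter state 1) = 5/8.
rng = np.random.default_rng(5)
P = np.zeros((7, 7))  # states 1..6; index 0 unused
P[1, [1, 2]] = [0.4, 0.6]
P[2, [1, 2]] = [0.4, 0.6]
P[3, [2, 3, 4]] = [0.3, 0.2, 0.5]
P[4, 5] = P[5, 6] = P[6, 4] = 1.0

runs, horizon = 50_000, 40
leave, never_1 = [], 0
for _ in range(runs):
    state = 3
    for trial in range(1, horizon + 1):
        state = rng.choice(7, p=P[state])
        if state != 3:
            leave.append(trial)  # trial on which the process left state 3
            break
    # Once the process leaves state 3, it enters state 1 eventually if and
    # only if it jumped to state 2; jumping to state 4 traps it in the cycle.
    never_1 += state in (4, 5, 6)

print("E[N]:", np.mean(leave), " theory:", 5 / 4)
print("P(never enter state 1):", never_1 / runs, " theory:", 5 / 8)
```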

(e) Find the probability that the process is in state 4 after 10 trials.

The process will be in state 4 after 10 trials if and only if it makes the jump from state 3 to state 4 on trial 1, 4, 7, or 10, since once it enters the cycle 4 → 5 → 6 → 4 it returns to state 4 every three trials. The probability of this happening is

    0.5 + (0.2)^3 (0.5) + (0.2)^6 (0.5) + (0.2)^9 (0.5) = 0.5 · [1 - (0.2)^{12}] / [1 - (0.2)^3] ≈ 0.504

(f) Given that the process is in state 4 after 10 trials, find the probability that the process was in state 4 after the first trial.

Let A be the event that the process is in state 4 after 10 trials and B be the event that the process was in state 4 after the first trial. Observe that B ⊂ A. Thus,

    P(B | A) = P(A ∩ B) / P(A) = P(B) / P(A) = 0.5 / 0.504 ≈ 0.992

PROBLEM 5 (10 points)

(a) Buses depart from Ann Arbor to Detroit every hour on the hour. Passengers arrive according to a Poisson process of rate λ per hour. Find the expected number of passengers on a bus. (Ignore issues of limited seating.)

The expected number of passengers on a bus is the expected number of arrivals of a Poisson process of rate λ per hour over one hour, hence equal to λ.

(b) Now suppose that the buses are no longer operating on a deterministic schedule, but rather their interdeparture times are independent and exponentially distributed with rate µ per hour. Find the PMF for the number of buses arriving in one hour.

Since the interdeparture times are independent exponentials with rate µ, the departure process of the buses is a Poisson process with rate µ. Thus the PMF for the number of buses arriving in one hour is

    p_K(k) = µ^k e^{-µ} / k!,    k = 0, 1, 2, ...

(c) Let us define an event at the bus stop to be either the arrival of a passenger or the departure of a bus. With the same assumptions as in part (b), find the expected number of events that occur in one hour.

The event process is the merged process of two independent Poisson processes, hence a Poisson process with rate λ + µ per hour. Thus the expected number of events in an hour is λ + µ.
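The merging argument in parts (b) and (c) is easy to check empirically. A minimal sketch, with arbitrary illustrative rates λ = 5.0 passengers/hour and µ = 2.0 buses/hour:

```python
import numpy as np

# Check of Problem 5(b)-(c): superposing independent Poisson bus departures
# (rate mu) and passenger arrivals (rate lam) gives a Poisson event stream
# of rate lam + mu, so mean and variance of hourly event counts both match.
rng = np.random.default_rng(6)
lam, mu = 5.0, 2.0
hours = 100_000

buses = rng.poisson(mu, size=hours)
passengers = rng.poisson(lam, size=hours)
events = buses + passengers

print("E[buses/hour]:", buses.mean(), " theory:", mu)
print("E[events/hour]:", events.mean(), " theory:", lam + mu)
print("Var(events/hour):", events.var(), " theory:", lam + mu)
```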

(d) If a passenger arrives at the bus stop and sees λ people waiting, find his/her expected time to wait until the next bus.

The interdeparture time between buses is exponential and hence memoryless. The fact that there are λ people waiting gives some information about the past of the process, but as the departure process is memoryless, this does not convey any information about its future, and the remaining waiting time is also exponential with rate µ. Thus, the expected waiting time is 1/µ.

(e) Find the PMF for the number of people on a bus.

We are interested in the number of passengers who arrive between two consecutive bus departures. Concentrate on the merged process of passenger arrivals and bus departures, and consider the departure of a bus a "success" and the arrival of a passenger a "failure." We are then interested in the number of failures between two successes of a Bernoulli process, which has a shifted geometric distribution given by

    p_N(n) = [λ / (λ + µ)]^n [µ / (λ + µ)],    n = 0, 1, 2, ...

PROBLEM 6 (19 points) For a series of dependent trials, the probability of success on any given trial is given by (k + 1)/(k + 3), where k is the number of successes in the previous three trials. Define a state description and a set of transition probabilities which allow this process to be described as a Markov chain. Draw the state-transition diagram. Try to use the smallest possible number of states.

Let the outcomes of the previous three trials (1 for success, 0 for failure) be the state. This gives eight states:

    000, 001, 010, 011, 100, 101, 110, 111

From state (x_3, x_2, x_1), where x_1 is the most recent outcome and k = x_1 + x_2 + x_3, the next trial succeeds with probability (k + 1)/(k + 3), moving the chain to state (x_2, x_1, 1); otherwise it moves to state (x_2, x_1, 0).

[The original state-transition diagram is not reproduced here.]
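The state description above translates directly into a transition matrix. A sketch of that construction follows, using the success probability (k + 1)/(k + 3) as reconstructed above; it also verifies that each row of the matrix sums to 1.

```python
import numpy as np
from itertools import product

# Build the 8-state transition matrix for Problem 6: the state is the
# outcome of the previous three trials, success probability (k+1)/(k+3)
# where k is the number of successes among them.
states = list(product([0, 1], repeat=3))  # (x3, x2, x1), x1 most recent
index = {s: i for i, s in enumerate(states)}

P = np.zeros((8, 8))
for s in states:
    k = sum(s)
    p_success = (k + 1) / (k + 3)
    x3, x2, x1 = s
    P[index[s], index[(x2, x1, 1)]] = p_success      # shift in a success
    P[index[s], index[(x2, x1, 0)]] = 1 - p_success  # shift in a failure

assert np.allclose(P.sum(axis=1), 1.0)  # every row is a valid distribution
for s in states:
    print("".join(map(str, s)), np.round(P[index[s]], 3))
```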