UC Berkeley Department of Electrical Engineering and Computer Science. EE 126: Probability and Random Processes. Solutions 9 Spring 2006



Exam format

The second midterm will be held on Wednesday, May 17; CHECK the final exam schedule. You are permitted to bring a calculator, and three sheets of hand-written notes on 8.5 × 11 paper. Questions on the exam can be based on any material from Chapter 1 through to the END of Chapter 7, as discussed in lectures # through #30, covered in homeworks # through #, and all discussion sections.

Review problems

Problem 9.1. Let X and Y be independent random variables that are uniformly distributed on [0, 1].

(a) Find the mean and variance of X − 2Y.

(b) Let A be the event {X ≤ Y}. Find the conditional PDF of X given that A occurred. (A fully labeled sketch will suffice.)

Solution: (a) We have

E[X − 2Y] = E[X] − 2E[Y] = 1/2 − 1 = −1/2

and

Var[X − 2Y] = Var[X] + 4 Var[Y] = 1/12 + 4/12 = 5/12.

(b) We are asked to compute

f_{X | X ≤ Y}(x) = d/dx [ P(X ≤ x | X ≤ Y) ]
               = d/dx [ P(X ≤ x, X ≤ Y) / P(X ≤ Y) ]
               = d/dx [ (x − x²/2) / (1/2) ]
               = 2(1 − x),   for 0 ≤ x ≤ 1.
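A quick Monte Carlo check of both answers in Problem 9.1, using only the standard library (the sample size and seed are arbitrary choices, not part of the original solution):

```python
import random

random.seed(0)
N = 200_000
samples = [(random.random(), random.random()) for _ in range(N)]

# Part (a): sample mean and variance of X - 2Y, to compare with -1/2 and 5/12.
vals = [x - 2 * y for x, y in samples]
mean = sum(vals) / N
var = sum((v - mean) ** 2 for v in vals) / N
print(mean, var)  # close to -0.5 and 5/12 = 0.4167

# Part (b): under A = {X <= Y}, P(X <= 0.25) should equal the CDF of
# f(x) = 2(1 - x) at 0.25, namely 2(0.25) - 0.25^2 = 0.4375.
cond = [x for x, y in samples if x <= y]
frac = sum(1 for x in cond if x <= 0.25) / len(cond)
print(frac)  # close to 0.4375
```

With 200,000 draws the sampling error is a few thousandths, so both estimates land well within rounding distance of the exact values.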

Problem 9.2. Male and female patients arrive at an emergency room according to independent Poisson processes, with each process having a rate of 3 per hour. Let M_{t1,t2} be the number of male arrivals between times t1 and t2, and let F_{t1,t2} be the number of female arrivals between times t1 and t2.

(a) Write down the PMF of M_{3,5} + F_{4,6}.

(b) Calculate the variance of M_{3,5} + M_{4,6}.

(c) What is the expectation of the arrival time of the last patient to arrive before 4 pm?

(d) Starting from a particular time, what is the expected time until there is an arrival of at least one male and at least one female patient?

Solution: (a) Since M_{3,5} and F_{4,6} are independent Poisson random variables, each with parameter λt = 3 · 2 = 6, M_{3,5} + F_{4,6} is Poisson with parameter 12:

P(M_{3,5} + F_{4,6} = k) = e^{−12} 12^k / k!,   k = 0, 1, 2, ....

(b) M_{3,5} + M_{4,6} = M_{3,4} + 2M_{4,5} + M_{5,6} = X + 2Y + Z, where X, Y and Z are independent Poisson random variables, each with parameter 3. So

Var[M_{3,5} + M_{4,6}] = Var[X] + 4 Var[Y] + Var[Z] = 3(1 + 4 + 1) = 18.

(c) We are asked to compute the expected time of the first arrival in the merged process, going backwards in time. The merged process has rate 6, so the expectation of the arrival time of the last patient to arrive before 4 pm is 4 pm − 1/6 hour, i.e., 3:50 pm.

(d) Let X_m and X_f be the interarrival times of the male and female patient processes. X_m and X_f are independent random variables, exponentially distributed with parameter 3. We are asked to compute E[Z] = E[max(X_m, X_f)] = 1/2. The calculation is done in more generality in Problem 9.8(d).

Problem 9.3. Consider a factory that produces X_n ≥ 0 gadgets on day n. The X_n are independent identically distributed random variables with mean 5, variance 9, E[X_n³] = 4, and E[X_n⁴] < ∞. Furthermore, we are told that P(X_n = 0) > 0.

(a) Find an approximation to the probability that the total number of gadgets produced in 100 days is less than 440.

(b) Find (approximately) the largest value of n such that P(X_1 + ··· + X_n ≥ 100 + 5n) ≤ 0.05.

(c) Let N be the first day on which the total number of gadgets produced exceeds 1000. Find (approximately) P(N ≥ 220).

(d) For each definition of Z_n given below, state whether the sequence Z_n converges in probability. (Answer yes or no, only.)

(i) Z_n = (X_1 + ··· + X_n)/n

(ii) Z_n = (X_1 + ··· + X_n − 5n)/√n

(iii) Z_n = (X_1² + ··· + X_n²)/n

(iv) Z_n = X_1 X_2 ··· X_n

Solution: (a) Write S_n = X_1 + ··· + X_n, so S_100 has mean 500 and standard deviation 3√100 = 30. Using the CLT with a continuity correction,

P(S_100 < 440) ≈ Φ((439.5 − 500)/30) = 1 − Φ((500 − 439.5)/30) ≈ 1 − Φ(2.02) ≈ 0.0217.

(b) We have

P(S_n ≥ 100 + 5n) = P( (S_n − 5n)/(3√n) ≥ 99.5/(3√n) ) ≈ 1 − Φ(99.5/(3√n)),

which is at most 0.05 when 99.5/(3√n) ≥ 1.645, i.e., when n ≤ (99.5/(3 · 1.645))² ≈ 406.5. So the largest such value is n = 406.

(c) P(N ≥ 220) = P(S_219 ≤ 1000) = P( (S_219 − 5(219))/(3√219) ≤ (1000.5 − 5(219))/(3√219) ) ≈ Φ(−2.13) ≈ 0.0166.

(d) (i) Yes, to E[X_n] (by the SLLN). (ii) No (by the CLT it converges in distribution to a normal random variable, not in probability). (iii) Yes, to E[X_n²] (the SLLN applied to the sequence X_n²). (iv) Yes, to 0 (since P(X_n = 0) > 0, some X_n equals 0 eventually, with probability 1).

Problem 9.4. Consider the Markov chain specified by the following state transition diagram (diagram not reproduced in this transcription). Let X_n be the value of the state at time n.

(a) For all states i = 0, 1, ..., 6, find µ_i, the expected time to absorption (i.e., entering a recurrent state) starting from state i.

(b) Find lim_{n→∞} P(X_n = 1 | X_0 = 3).

Solution: (a) By definition, µ_i = 0 for i = 0, 1, 2, 5, 6; in fact, states 0, 1, 2, 5, 6 are recurrent. Finally, µ_3 and µ_4 can be obtained by solving the system of equations

µ_3 = 1 + (1/2) µ_4
µ_4 = 1 + (1/4) µ_4

from which µ_3 = 5/3 and µ_4 = 4/3.

(b) Fixing 0, 1, 2 as absorbing states, consider the absorption probability a_i that the class {0, 1, 2} is eventually reached starting from state i. We have

lim_{n→∞} P(X_n = 1 | X_0 = 3) = π_1 a_3,

where π is the steady-state distribution of the recurrent class {0, 1, 2}. Solving the system of equations

π_1 = 2 π_0
π_2 = 2 π_1
π_0 + π_1 + π_2 = 1

we get π_1 = 2/7. And a_3 can be obtained by solving this system of equations:

a_3 = (1/2) a_4 + 1/2
a_4 = (1/5) a_3

from which a_3 = 5/9. Hence,

lim_{n→∞} P(X_n = 1 | X_0 = 3) = (2/7)(5/9) = 10/63.

Problem 9.5. The random variable X is distributed as a binomial random variable with parameters n > 1 and 0 < p < 1. The pair of random variables (Y, Z) takes values on the set {(0, 1), (1, 0), (0, −1), (−1, 0)} with equal probability if X ≤ n/2. Similarly, (Y, Z) takes values on the set {(1, 1), (1, −1), (−1, 1), (−1, −1)} with equal probability if X > n/2.

(a) Are X and Y independent?

(b) Are Y and Z independent?

(c) Conditioned on X being even, are Y and Z independent?

(d) Conditioned on Y = 1, are X and Z independent?

(e) Conditioned on Z = 0, are X and Y independent?

Solution: (a) No. Suppose n and p are such that P(X ≤ n/2) = P(X > n/2) = 1/2 (for instance, n odd and p = 1/2). Then

P(Y = 1) = P(Y = 1 | X ≤ n/2) P(X ≤ n/2) + P(Y = 1 | X > n/2) P(X > n/2) = (1/4)(1/2) + (1/2)(1/2) = 3/8.

But P(Y = 1 | X = n) = 1/2 ≠ 3/8.

(b) No. As in (a),

P(Z = 1) = P(Z = 1 | X ≤ n/2) P(X ≤ n/2) + P(Z = 1 | X > n/2) P(X > n/2) = (1/4)(1/2) + (1/2)(1/2) = 3/8.

But P(Z = 1 | Y = 1) = P(Y = 1, Z = 1)/P(Y = 1) = ((1/2)(1/4))/(3/8) = 1/3 ≠ 3/8.

(c) No. Suppose n is even and such that P(X ≤ n/2 | X even) = P(X > n/2 | X even) = 1/2. Then

P(Y = 1 | X even) = P(Y = 1 | X ≤ n/2, X even) P(X ≤ n/2 | X even) + P(Y = 1 | X > n/2, X even) P(X > n/2 | X even)
                 = P(Y = 1 | X ≤ n/2) P(X ≤ n/2 | X even) + P(Y = 1 | X > n/2) P(X > n/2 | X even)
                 = (1/4)(1/2) + (1/2)(1/2) = 3/8.

But P(Y = 1 | X even, Z = 1) = ((1/4)(1/2)) / ((1/4)(1/2) + (1/2)(1/2)) = 1/3 ≠ 3/8.

(d) No.

P(X > n/2 | Y = 1) = P(Y = 1 | X > n/2) P(X > n/2) / P(Y = 1) = (1/2)(1/2)/(3/8) = 2/3.

However, P(X > n/2 | Y = 1, Z = 0) = 0, since Z = 0 is possible only when X ≤ n/2.

(e) No. P(Y = 1 | Z = 0, X > n/2) = 0; however, P(Y = 1 | Z = 0) > 0.

Problem 9.6. Consider N + 1 independent Poisson arrival processes, such that the i-th process has arrival rate λ_i = iλ, for i = 1, 2, ..., N + 1. Here N is a binomial random variable, independent of all the Poisson arrival processes, with E[N] = µ and var(N) = v.

(a) Assume that N represents the number of successes in n independent Bernoulli trials, each with a probability p of success. Find n and p in terms of µ and v.

(b) Find, in terms of λ, µ and v, the mean of the total number of arrivals from the sum of the processes, in a time interval of length t.

Solution: (a) Since µ = np and v = np(1 − p), we get n = µ²/(µ − v) and p = (µ − v)/µ.

(b) The total number of arrivals from the sum of the processes can be written as Z = Σ_{i=1}^{N+1} Y_i, where Y_i is a Poisson random variable with parameter λit. From the law of total expectation, we

have

E[Z] = E[ E[Z | N] ]
     = Σ_{j≥0} P(N = j) E[ Σ_{i=1}^{j+1} Y_i ]
     = Σ_{j≥0} P(N = j) Σ_{i=1}^{j+1} λit
     = λt Σ_{j≥0} P(N = j) (j + 1)(j + 2)/2
     = (λt/2) E[N² + 3N + 2]
     = (λt/2) (E[N²] + 3 E[N] + 2)
     = (λt/2) (v + µ² + 3µ + 2).

Problem 9.7. Let X_1, X_2, ... be an i.i.d. sequence of normal random variables with mean µ and variance σ². Further, let Y_n = (1/n) Σ_{i=1}^n X_i for n = 1, 2, ....

(a) Using the Chebyshev inequality, give an upper bound for P(|Y_n − E[Y_n]| ≥ ε).

(b) Evaluate P(|Y_n − E[Y_n]| ≥ ε) exactly in terms of Φ, the CDF of the standard normal random variable.

(c) Hence compute P(|Y_n − E[Y_n]| ≥ ε) and its Chebyshev upper bound for ε√n/σ = 0.5, ε√n/σ = 1.0, and ε√n/σ = 2.0.

(d) For m > n, find the linear least squares estimate of Y_m given Y_n = y_n, and its mean squared error.

Solution: (a) We have

var(Y_n) = (1/n²) var( Σ_{i=1}^n X_i ) = σ²/n.

Hence, by the Chebyshev inequality,

P(|Y_n − E[Y_n]| ≥ ε) ≤ var(Y_n)/ε² = σ²/(nε²).

(b) Since Y_n is a linear function of independent normal random variables, it is normal. Hence

P(|Y_n − E[Y_n]| ≥ ε) = P( |Y_n − E[Y_n]|/√var(Y_n) ≥ ε√n/σ ) = 2 (1 − Φ(ε√n/σ)).

(c)

ε√n/σ   P(|Y_n − E[Y_n]| ≥ ε)                      Chebyshev bound
0.5     2(1 − Φ(0.5)) = 2(1 − 0.6915) = 0.6170     4.0
1.0     2(1 − Φ(1.0)) = 2(1 − 0.8413) = 0.3174     1.0
2.0     2(1 − Φ(2.0)) = 2(1 − 0.9772) = 0.0456     0.25

(d) We have Y_m = (1/m)( n Y_n + Σ_{i=n+1}^m X_i ). Therefore, the least squares estimate of Y_m given Y_n = y_n is given by

E[Y_m | Y_n = y_n] = E[ (n Y_n + Σ_{i=n+1}^m X_i)/m | Y_n = y_n ] = (n y_n + (m − n)µ)/m,

which, being a linear function of y_n, must be equal to the linear least squares estimate. Alternatively, we compute

cov(Y_n, Y_m) = cov( Y_n, (n/m) Y_n ) + cov( (1/m) Σ_{i=n+1}^m X_i, Y_n ).

Now, the second term is zero by independence, while

cov( Y_n, (n/m) Y_n ) = (n/m) E[ (Y_n − E[Y_n])² ] = (n/m) var(Y_n) = σ²/m.

Hence the linear least squares estimate of Y_m given Y_n = y_n is given by

E[Y_m] + (cov(Y_n, Y_m)/var(Y_n)) (y_n − E[Y_n]) = µ + ((σ²/m)/(σ²/n)) (y_n − µ) = (n y_n + (m − n)µ)/m.

The mean squared error of the estimate is

E[ ( (n Y_n + Σ_{i=n+1}^m X_i)/m − (n y_n + (m − n)µ)/m )² | Y_n = y_n ] = (1/m²) Σ_{i=n+1}^m var(X_i) = ((m − n)/m²) σ².

Problem 9.8. The BART is broken, so that only two kinds of vehicles go from Berkeley to San Francisco: taxis and buses. The interarrival time of taxis, in minutes, is an independent exponential random variable with parameter λ1, i.e., its PDF is f_IT(t) = λ1 e^{−λ1 t} for t ≥ 0, while the interarrival time of buses, in minutes, is an independent exponential random variable with parameter λ2, i.e., its PDF is f_IB(t) = λ2 e^{−λ2 t} for t ≥ 0. Suppose Joe and Harry arrive at Berkeley at 7:00 am.

(a) What is the average time before they see the first vehicle?

(b) What is the probability that the first vehicle they see is a bus, and what is the probability that the first vehicle they see is a taxi?

In a taxi, the travel time to San Francisco, in minutes, is an independent exponential random variable with parameter µ1, i.e., its PDF is f_DT(t) = µ1 e^{−µ1 t} for t ≥ 0. On the other hand, in a bus, the travel time to San Francisco, in minutes, is an independent exponential random variable with parameter µ2, i.e., its PDF is f_DB(t) = µ2 e^{−µ2 t} for t ≥ 0.

(c) Suppose Joe and Harry arrive at downtown Berkeley at 7:00 am, take the first vehicle that passes, and arrive in San Francisco X minutes later. Find the transform of X.

(d) Suppose that a taxi and a bus arrive simultaneously, and Joe takes the taxi while Harry takes the bus. Let Y be the number of minutes from their departure from Berkeley till they meet again in San Francisco. Find E[Y].

There are, in fact, two different kinds of buses: fast buses and slow buses. Any bus that arrives at downtown Berkeley is a fast bus with probability p, and a slow bus with probability 1 − p. Whether a bus is fast or slow is independent of everything else.

(e) If they stay at Berkeley for l minutes, how many fast buses will they see on average?

(f) If they stay indefinitely, what is the probability that they will see n fast buses before they see m slow buses?

Solution: (a) Vehicles arrive according to a Poisson process with rate λ1 + λ2. Owing to the memorylessness of the Poisson process, the time until the first vehicle after 7:00 am is distributed as an exponential random variable with parameter λ1 + λ2. Therefore, the average time before they see the first vehicle is 1/(λ1 + λ2).

(b) By the properties of merged Poisson processes, we have

P(first vehicle is a taxi) = λ1/(λ1 + λ2),  and  P(first vehicle is a bus) = λ2/(λ1 + λ2).

(c) We have X = W + D, where W is the time until the arrival of the first vehicle, and D is the travel time. Now,

M_W(s) = (λ1 + λ2)/(λ1 + λ2 − s),

since, as established in part (a), W is an exponential random variable with parameter λ1 + λ2. As for D, we have

M_D(s) = E[ e^{sD} ]
       = E[ e^{sD} | first vehicle is a taxi ] λ1/(λ1 + λ2) + E[ e^{sD} | first vehicle is a bus ] λ2/(λ1 + λ2)
       = (λ1/(λ1 + λ2)) M_DT(s) + (λ2/(λ1 + λ2)) M_DB(s)
       = (λ1/(λ1 + λ2)) µ1/(µ1 − s) + (λ2/(λ1 + λ2)) µ2/(µ2 − s).

Therefore,

M_X(s) = M_W(s) M_D(s) = ( (λ1 + λ2)/(λ1 + λ2 − s) ) ( (λ1/(λ1 + λ2)) µ1/(µ1 − s) + (λ2/(λ1 + λ2)) µ2/(µ2 − s) ).

(d) We have Y = max(D_T, D_B). We write Y = Y_1 + Y_2, where Y_1 = min(D_T, D_B) denotes the time till the first of the two arrivals in San Francisco, and Y_2 = Y − Y_1 denotes the time from the first arrival till the second. First, we have

E[Y_1] = 1/(µ1 + µ2).

Then

E[Y_2] = E[Y_2 | D_T < D_B] P(D_T < D_B) + E[Y_2 | D_T ≥ D_B] P(D_T ≥ D_B)
       = (1/µ2) µ1/(µ1 + µ2) + (1/µ1) µ2/(µ1 + µ2).

So

E[Y] = (1/(µ1 + µ2)) (1 + µ1/µ2 + µ2/µ1).

(e) Fast buses arrive according to a Poisson process with rate pλ2. Hence the average number of fast buses they will see is l p λ2.

(f) The probability that exactly i slow buses arrive before the arrival of the n-th fast bus is the probability that the n-th fast bus is the (n + i)-th bus, which, using the Pascal PMF, is given by

C(n − 1 + i, i) p^n (1 − p)^i.

The probability that they see n fast buses before they see m slow buses is the probability that strictly fewer than m slow buses arrive before the arrival of the n-th fast bus, which is therefore given by

Σ_{i=0}^{m−1} C(n − 1 + i, i) p^n (1 − p)^i.

Alternatively, the probability of seeing n fast buses before m slow buses is equivalent to the probability of seeing n or more fast buses among the first n + m − 1 buses, which is given by

Σ_{i=n}^{n+m−1} C(n + m − 1, i) p^i (1 − p)^{n+m−1−i}.
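The two expressions in part (f) can be checked against each other numerically. This Python sketch (the function names are mine, not from the solutions) evaluates both for a few values of n, m, and p:

```python
from math import comb

def p_fast_first(n, m, p):
    # Pascal-PMF form: fewer than m slow buses before the n-th fast bus.
    return sum(comb(n - 1 + i, i) * p**n * (1 - p)**i for i in range(m))

def p_fast_first_binom(n, m, p):
    # Equivalent form: n or more fast buses among the first n + m - 1 buses.
    t = n + m - 1
    return sum(comb(t, i) * p**i * (1 - p)**(t - i) for i in range(n, t + 1))

# The two forms agree for every (n, m, p) tried.
for n, m, p in [(1, 1, 0.3), (2, 3, 0.5), (4, 2, 0.7)]:
    assert abs(p_fast_first(n, m, p) - p_fast_first_binom(n, m, p)) < 1e-12

print(p_fast_first(2, 3, 0.5))  # prints 0.6875
```

For n = 2, m = 3, p = 1/2 both forms give 11/16 = 0.6875, as can also be verified by listing the outcomes of the first four buses.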

Problem 9.9. A hungry mouse is trapped in a cage with three doors. At each turn, the mouse gets to open one of the three doors and eat a piece of cheese behind the door if one is present. Each door is chosen with equal probability on each turn, regardless of whether a piece of cheese was found on the previous turn. If no cheese was found on the previous turn, there is a probability of 3/4 that cheese will be found behind each door on the current turn. If cheese was found on the previous turn and the same door is chosen on the current turn, then there is a probability of 0 that cheese will be found; whilst if cheese was found on the previous turn and a different door is chosen on the current turn, then there is a probability of 1 that cheese will be found.

(a) If you observe the mouse's behavior over 1000 turns, in approximately what fraction of turns do you expect the mouse to eat a piece of cheese?

(b) Suppose no cheese was found on the previous turn. What is the expected number of turns before the mouse eats a piece of cheese?

(c) Suppose no cheese was found on the previous turn. What is the expected number of turns before the mouse eats k pieces of cheese?

(d) Suppose cheese was found on the previous turn. Using the Central Limit Theorem, approximate the probability that the number of turns before the mouse eats 100 pieces of cheese exceeds 152.

(e) You look into the cage and observe the mouse eating a piece of cheese from behind door number 1. What is the probability that, if you observe the mouse three turns later, it will again be eating a piece of cheese from behind door number 1?

Solution: (a) We model the problem with a two-state Markov chain, where C represents the state where cheese is found, and N represents the state where no cheese is found. The transition probabilities are

P(C → C) = 2/3,  P(C → N) = 1/3,  P(N → C) = 3/4,  P(N → N) = 1/4.

Therefore, the steady-state probabilities satisfy

(1/3) π_C = (3/4) π_N  and  π_C + π_N = 1,

whence we conclude that π_N = 4/13 and π_C = 9/13. Thus, over 1000 turns, we expect the mouse to eat a piece of cheese in approximately 9/13 of them.
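The steady-state answer for part (a) can be sanity-checked by simulating the two-state chain directly (the turn count and seed below are arbitrary choices of mine):

```python
import random

random.seed(1)
turns = 400_000
cheese_prev = False  # start as if no cheese was found on the previous turn
eaten = 0
for _ in range(turns):
    if cheese_prev:
        # Same door w.p. 1/3 (no cheese); different door w.p. 2/3 (cheese surely).
        found = random.random() < 2 / 3
    else:
        # No cheese last turn: cheese behind the chosen door w.p. 3/4.
        found = random.random() < 3 / 4
    eaten += found
    cheese_prev = found

print(eaten / turns)  # close to 9/13 = 0.6923
```

Over 400,000 simulated turns the empirical fraction settles within a few thousandths of 9/13.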

(b) We wish to know the mean first passage time t_N to reach state C from state N. We have

t_N = 1 + (1/4) t_N,

which implies that t_N = 4/3.

(c) We wish to know the mean first passage time to reach state C from state N, followed by (k − 1) recurrence times of state C. The mean recurrence time t_C of state C is given by

t_C = 1 + (1/3) t_N = 1 + 4/9 = 13/9.

Hence the expected number of turns until the mouse eats k pieces of cheese is 4/3 + (k − 1) · 13/9.

(d) Let Y_100 be the number of turns before the mouse eats 100 pieces of cheese. Then Y_100 = Σ_{i=1}^{100} R_i, where R_i denotes the number of turns from eating the (i − 1)-th piece of cheese till eating the i-th piece of cheese. The random variable R_i is equal to 1 with probability 2/3 (transition from C to C) and equal to 1 + S with probability 1/3 (transition from C to N), where S is a geometric random variable with parameter 3/4. We have already established that E[R_i] = 13/9. Therefore,

E[R_i²] = (2/3) · 1 + (1/3) E[(1 + S)²]
       = 2/3 + (1/3)(1 + 2 E[S] + E[S²])
       = 2/3 + (1/3)(1 + 2 E[S] + var(S) + (E[S])²)
       = 2/3 + (1/3)(1 + 8/3 + (1/4)/(3/4)² + 16/9)
       = 71/27.

Hence

var(R_i) = 71/27 − (13/9)² = 44/81.

Since it is clear that R_1, R_2, ..., R_100 is an i.i.d. sequence, it follows by the Central Limit Theorem that

P(Y_100 > 152) = P( (Y_100 − E[Y_100]) / √var(Y_100) > (152 − 100 · 13/9) / ((10/9)√44) )
              ≈ 1 − Φ(1.03) = 1 − 0.8485 = 0.1515.

(e) We now model the problem with a four-state Markov chain. The states are C1, C2, C3, and N, where C1, C2, and C3 represent the states where cheese is found behind the first, second, and third doors, respectively, and N represents the state where no cheese is found. The transition probabilities are

p_{Ci,Cj} = 1/3 if j ≠ i, and 0 otherwise, for all i, j = 1, 2, 3,
p_{Ci,N} = 1/3, for all i = 1, 2, 3,
p_{N,Ci} = 1/4, for all i = 1, 2, 3,

and p_{N,N} = 1/4. There are seven three-transition paths from state C1 back to state C1, which are

(i) C1 → C2 → C3 → C1, (ii) C1 → C2 → N → C1, (iii) C1 → C3 → C2 → C1, (iv) C1 → C3 → N → C1, (v) C1 → N → C2 → C1, (vi) C1 → N → C3 → C1, and (vii) C1 → N → N → C1.

Paths (i) and (iii) occur with probability 1/27, paths (ii), (iv), (v), and (vi) occur with probability 1/36, and path (vii) occurs with probability 1/48. Therefore, the probability of going from state C1 to state C1 in three transitions is

2/27 + 4/36 + 1/48 = 89/432.
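Part (e) can be verified with exact arithmetic by pushing a point distribution through three steps of the four-state chain (the state names follow the solution; the dictionary encoding is mine):

```python
from fractions import Fraction as F

states = ["C1", "C2", "C3", "N"]
# Transition probabilities exactly as listed in the solution.
P = {
    ("C1", "C2"): F(1, 3), ("C1", "C3"): F(1, 3), ("C1", "N"): F(1, 3),
    ("C2", "C1"): F(1, 3), ("C2", "C3"): F(1, 3), ("C2", "N"): F(1, 3),
    ("C3", "C1"): F(1, 3), ("C3", "C2"): F(1, 3), ("C3", "N"): F(1, 3),
    ("N", "C1"): F(1, 4), ("N", "C2"): F(1, 4), ("N", "C3"): F(1, 4),
    ("N", "N"): F(1, 4),
}

def step(dist):
    # One transition of the chain: push the distribution through P.
    new = {s: F(0) for s in states}
    for (i, j), pij in P.items():
        new[j] += dist.get(i, F(0)) * pij
    return new

dist = {"C1": F(1)}  # the mouse just ate cheese behind door 1
for _ in range(3):
    dist = step(dist)

print(dist["C1"])  # prints 89/432
```

Summing over all seven paths by hand and taking the (C1, C1) entry of the third matrix power agree, as they must.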