Worked Examples: Random Processes


Example 1
Consider patients coming to a doctor's office at random points in time. Let X_n denote the time (in hours) that the n-th patient has to wait before being admitted to see the doctor.
(a) Describe the random process X_n, n ≥ 1.
(b) Sketch a typical sample path of X_n.

Solution
(a) The random process X_n is a discrete-time, continuous-valued random process. The sample space is S_X = {x : x ≥ 0}. The index parameter set (domain of time) is I = {1, 2, 3, ...}.
(b) A sample path: (figure not reproduced in this transcription).

Example 2
Let {X_n, n = 1, 2, ...} be a sequence of independent random variables with S_{X_n} = {0, 1}, P[X_n = 0] = 2/3, P[X_n = 1] = 1/3. Let Y_n = Σ_{i=1}^n X_i.
(a) Find the first-order pmf of Y_n.
(b) Find m_Y(n), R_Y(n, n+k) and C_Y(n, n+k).

Solution
Note that X_n is an iid random process and, for each fixed n, X_n is a Bernoulli random variable with p = 1/3. We have

  E[X_n] = p = 1/3,
  VAR[X_n] = p(1 - p) = 2/9,
  E[X_n^2] = VAR[X_n] + E[X_n]^2 = 2/9 + 1/9 = 1/3.

(a) For each fixed n, Y_n is a binomial random variable with p = 1/3. We obtain

  P[Y_n = k] = C(n,k) (1/3)^k (2/3)^(n-k),  k = 0, 1, ..., n.

(b) m_Y(n) = E[Y_n] = np = n/3.

  R_Y(n, n+k) = E[Y(n) Y(n+k)] = E[(Σ_{i=1}^n X_i)(Σ_{j=1}^{n+k} X_j)]
              = Σ_{i=1}^n Σ_{j=1}^{n+k} E[X_i X_j]
              = Σ_{i=1}^n Σ_{j≠i} E[X_i] E[X_j] + Σ_{i=1}^n E[X_i^2]
              = [n(n+k) - n](1/9) + n(1/3)
              = n(n+k+2)/9.

  C_Y(n, n+k) = R_Y(n, n+k) - m_Y(n) m_Y(n+k)
              = n(n+k+2)/9 - (n/3)((n+k)/3) = 2n/9.

One can deduce that C_Y(n, m) = 2 min(n, m)/9.

Example 3
The number of failures N(t) which occur in a computer network over the time interval [0, t) can be described by a homogeneous Poisson process {N(t), t ≥ 0}. On average, there is a failure after every 2 hours, i.e. the intensity of the process is λ = 0.5 [h^-1].

(a) What is the probability of at most 1 failure in [0, 8), at least 2 failures in [8, 16), and at most 1 failure in [16, 24) (time unit: hour)?
(b) What is the probability that the third failure occurs after 8 hours?

Solution
(a) The probability

  p = P[N(8) - N(0) ≤ 1, N(16) - N(8) ≥ 2, N(24) - N(16) ≤ 1]

is required. In view of the independence and the homogeneity of the increments of a homogeneous Poisson process, it can be determined as follows:

  p = P[N(8) - N(0) ≤ 1] P[N(16) - N(8) ≥ 2] P[N(24) - N(16) ≤ 1]
    = P[N(8) ≤ 1] P[N(8) ≥ 2] P[N(8) ≤ 1].

Since

  P[N(8) ≤ 1] = P[N(8) = 0] + P[N(8) = 1] = e^{-4} + 4 e^{-4} = 5 e^{-4} ≈ 0.0916

and

  P[N(8) ≥ 2] = 1 - P[N(8) ≤ 1] ≈ 0.9084,

the desired probability is p ≈ 0.0916 × 0.9084 × 0.0916 ≈ 0.0076.

(b) Let T_3 be the time of arrival of the third failure. Then

  P[T_3 > 8] = P[at most 2 failures in the first 8 hours]
             = Σ_{i=0}^{2} e^{-0.5×8} (0.5×8)^i / i!
             = e^{-4} (1 + 4 + 4^2/2!) = 13 e^{-4} ≈ 0.238.

Example 4 (Random telegraph signal)
Let a random signal X(t) have the structure

  X(t) = (-1)^{N(t)} Y,  t ≥ 0,

where {N(t), t ≥ 0} is a homogeneous Poisson process with intensity λ and Y is a binary random variable with P(Y = 1) = P(Y = -1) = 1/2 which is independent of N(t) for all t. Signals of this structure are called random telegraph signals; they are basic modules for generating signals with a more complicated structure. Obviously, X(t) = 1 or X(t) = -1, and Y determines the sign of X(0).

Since E[X^2(t)] = 1 < ∞ for all t ≥ 0, the stochastic process {X(t), t ≥ 0} is a second-order process. Letting I(t) = (-1)^{N(t)}, its trend function is m(t) = E[X(t)] = E[Y] E[I(t)]. Since E[Y] = 0, the trend function is identically zero: m(t) ≡ 0.

It remains to show that the covariance function C(s, t) of this process depends only on t - s. This requires the determination of the probability distribution of I(t). A transition from I(t) = -1 to I(t) = +1 or, conversely, from I(t) = +1 to I(t) = -1 occurs at those time points where Poisson events occur, i.e. where N(t) jumps. Hence

  P(I(t) = 1) = P(even number of jumps in (0, t]) = e^{-λt} Σ_{i=0}^∞ (λt)^{2i}/(2i)! = e^{-λt} cosh λt,
  P(I(t) = -1) = P(odd number of jumps in (0, t]) = e^{-λt} Σ_{i=0}^∞ (λt)^{2i+1}/(2i+1)! = e^{-λt} sinh λt,

so the expected value of I(t) is

  E[I(t)] = 1·P(I(t) = 1) + (-1)·P(I(t) = -1) = e^{-λt} [cosh λt - sinh λt] = e^{-2λt}.

Since

  C(s, t) = COV[X(s), X(t)] = E[X(s) X(t)] = E[Y I(s) Y I(t)] = E[Y^2] E[I(s) I(t)]

and E[Y^2] = 1, we have C(s, t) = E[I(s) I(t)]. Thus, in order to evaluate C(s, t), the joint distribution of the random vector (I(s), I(t)) must be determined. In view of the homogeneity of the increments of {N(t), t ≥ 0}, for s < t,

  p_{1,1} = P(I(s) = 1, I(t) = 1) = P(I(s) = 1) P(I(t) = 1 | I(s) = 1)
          = e^{-λs} cosh λs · P(even number of jumps in (s, t])
          = e^{-λs} cosh λs · e^{-λ(t-s)} cosh λ(t-s)
          = e^{-λt} cosh λs cosh λ(t-s).

Analogously,

  p_{1,-1} = P(I(s) = 1, I(t) = -1) = e^{-λt} cosh λs sinh λ(t-s),
  p_{-1,1} = P(I(s) = -1, I(t) = 1) = e^{-λt} sinh λs sinh λ(t-s),
  p_{-1,-1} = P(I(s) = -1, I(t) = -1) = e^{-λt} sinh λs cosh λ(t-s).

Since E[I(s) I(t)] = p_{1,1} + p_{-1,-1} - p_{1,-1} - p_{-1,1}, we obtain

  C(s, t) = e^{-2λ(t-s)},  s < t.

Note that the order of s and t can be interchanged, so that C(s, t) = e^{-2λ|t-s|}.

Example 5
A random process is defined by X(t) = T + (1 - t), where T is a uniform random variable on (0, 1).
(a) Find the cdf of X(t).
(b) Find m_X(t) and C_X(t_1, t_2).

Solution
(a) Given that X(t) = T + (1 - t), where T is uniformly distributed over (0, 1), we have P[X(t) ≤ x] = P[T ≤ x - (1 - t)], where

  P[T ≤ y] = 0 for y < 0;  y for 0 < y < 1;  1 for y > 1.

Writing y = x - (1 - t), we obtain

  F_X(x) = P[X(t) ≤ x] = P[T ≤ x - (1 - t)]
         = 0 for x < 1 - t;  x - (1 - t) for 1 - t < x < 2 - t;  1 for x > 2 - t.
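The cdf derived in part (a) can be sanity-checked numerically. The sketch below is plain Python; the test point t = 0.4, x = 1.1 is an arbitrary illustrative choice (not from the text), for which 1 - t = 0.6 < x < 2 - t = 1.6, so F_X(x) = x - 0.6 = 0.5. It compares a Monte Carlo estimate of P[X(t) ≤ x] with the closed form.

```python
import random

def exact_cdf(t, x):
    """F_X(x) from part (a): 0 below 1 - t, linear in between, 1 above 2 - t."""
    return min(max(x - (1.0 - t), 0.0), 1.0)

def empirical_cdf(t, x, trials=100_000, seed=1):
    """Monte Carlo estimate of P[X(t) <= x] with X(t) = T + (1 - t), T ~ U(0, 1)."""
    rng = random.Random(seed)
    hits = sum(rng.random() + (1.0 - t) <= x for _ in range(trials))
    return hits / trials

t, x = 0.4, 1.1  # hypothetical test point inside the linear range of the cdf
print(exact_cdf(t, x), empirical_cdf(t, x))  # both close to 0.5
```

The same check works at any t and x; outside the interval (1 - t, 2 - t) both values collapse to 0 or 1.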

(b) Note that E[T] = 1/2 and E[T^2] = 1/3. Differentiating the cdf,

  f_X(x) = d/dx F_X(x) = 1 for 1 - t < x < 2 - t;  0 otherwise.

  m_X(t) = ∫_{1-t}^{2-t} x dx = [x^2/2]_{1-t}^{2-t} = 3/2 - t.

Alternatively, m_X(t) = E[X(t)] = E[T + (1 - t)] = 1/2 + (1 - t) = 3/2 - t.

Define

  R_X(t_1, t_2) = E[{T + (1 - t_1)}{T + (1 - t_2)}]
               = E[T^2] + (2 - t_1 - t_2) E[T] + (1 - t_1)(1 - t_2)
               = 1/3 + (2 - t_1 - t_2)/2 + (1 - t_1)(1 - t_2),

and observe

  C_X(t_1, t_2) = R_X(t_1, t_2) - m_X(t_1) m_X(t_2)
               = 1/3 + (2 - t_1 - t_2)/2 + (1 - t_1)(1 - t_2) - (3/2 - t_1)(3/2 - t_2)
               = 1/12.

Example 6
Customers arrive at a service station (service system, queueing system) according to a homogeneous Poisson process {N(t), t ≥ 0} with intensity λ. The arrival of a customer is therefore a Poisson event. The number of servers in the system is assumed to be so large that an incoming customer will always find an available server. To cope with this situation the service system must be modeled as having an infinite number of servers. The service times of all customers are assumed to be independent random variables which are identically distributed as B. Let G(y) = P(B ≤ y) be the distribution function of B and X(t) the random number of customers in the system, which gives the state of the system at time t, with X(0) = 0. The aim is to determine the state probabilities

  p_i(t) = P(X(t) = i);  i = 0, 1, ...;  t ≥ 0.

According to the total probability rule,

  p_i(t) = Σ_{n=i}^∞ P(X(t) = i | N(t) = n) P(N(t) = n)
         = Σ_{n=i}^∞ P(X(t) = i | N(t) = n) (λt)^n e^{-λt} / n!.

A customer arriving at time x is with probability 1 - G(t - x) still in the system at time t, t > x, i.e. the service has not yet been finished by time t. Given N(t) = n, the arrival times T_1, T_2, ..., T_n of the n customers in the system are uniformly distributed over (0, t]. To calculate the state probabilities, the order of the T_i is not important. Hence they can be assumed to be independent, unordered random variables, uniformly distributed over (0, t]. Thus, the probability that any one of those n customers is still in the system at time t is

  p(t) = (1/t) ∫_0^t [1 - G(t - x)] dx = (1/t) ∫_0^t [1 - G(x)] dx.

Since the service times are independent of each other, we have

  P[X(t) = i | N(t) = n] = C(n,i) [p(t)]^i [1 - p(t)]^{n-i};  i = 0, 1, ..., n.

The desired state probabilities can be obtained as follows:

  p_i(t) = Σ_{n=i}^∞ C(n,i) [p(t)]^i [1 - p(t)]^{n-i} (λt)^n e^{-λt} / n!
         = ([λt p(t)]^i e^{-λt} / i!) Σ_{k=0}^∞ [λt(1 - p(t))]^k / k!
         = ([λt p(t)]^i / i!) e^{-λt} e^{λt[1 - p(t)]}
         = ([λt p(t)]^i / i!) e^{-λt p(t)};  i = 0, 1, ....

Thus, {p_i(t); i = 0, 1, ...} is a Poisson distribution with mean E[X(t)] = λt p(t). The trend function of the stochastic process {X(t), t ≥ 0} is, consequently,

  m(t) = λ ∫_0^t [1 - G(x)] dx.

Let E[Y] = 1/λ be the expected interarrival time between two successive customers. Since E[B] = ∫_0^∞ [1 - G(x)] dx, we obtain

  lim_{t→∞} m(t) = E[B]/E[Y].

Thus, for t → ∞, the trend function and the state probabilities of the stochastic process {X(t), t ≥ 0} become constant. Letting ρ = E[B]/E[Y], the stationary state probabilities of the system become

  p_i = lim_{t→∞} p_i(t) = (ρ^i / i!) e^{-ρ};  i = 0, 1, ....
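The Poisson form of the state distribution can be spot-checked by simulation. The sketch below picks a concrete service-time distribution B (exponential with rate µ) and the arbitrary parameter values λ = 2, µ = 1, t = 3, none of which come from the text; for this B the trend function λt p(t) works out to (λ/µ)(1 - e^{-µt}). Only the Python standard library is used.

```python
import math
import random

def sample_poisson(rng, mean):
    """Sample a Poisson(mean) variate by Knuth's product-of-uniforms method."""
    limit, k, prod = math.exp(-mean), 0, 1.0
    while prod > limit:
        prod *= rng.random()
        k += 1
    return k - 1

def mean_customers(lam, mu, t, runs=20_000, seed=7):
    """Monte Carlo estimate of E[X(t)] for the infinite-server system,
    assuming exponential(mu) service times as a concrete choice of B."""
    rng = random.Random(seed)
    total = 0
    for _ in range(runs):
        n = sample_poisson(rng, lam * t)
        # Given N(t) = n, arrival times are unordered uniforms on (0, t]; the
        # customer arriving at x is still present iff its service time exceeds t - x.
        total += sum(rng.expovariate(mu) > t - rng.uniform(0.0, t)
                     for _ in range(n))
    return total / runs

lam, mu, t = 2.0, 1.0, 3.0
print(mean_customers(lam, mu, t))              # Monte Carlo estimate of E[X(t)]
print((lam / mu) * (1.0 - math.exp(-mu * t)))  # trend function m(t) = lam*t*p(t)
```

Replacing `rng.expovariate(mu)` with any other sampler for B checks the general result, since the derivation above never used the exponential form.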

In particular, if B is exponentially distributed with parameter µ, then

  m(t) = λ ∫_0^t e^{-µx} dx = (λ/µ)(1 - e^{-µt}).

In this case, ρ = λ/µ.

Example 7
Let X(t) be a Poisson process with parameter λ. Find
(a) E[X^2(t)];
(b) E[{X(t) - X(s)}^2], for t > s.

Solution
(a) Note that X(t) is a Poisson random variable, so

  E[X(t)] = VAR[X(t)] = λt,  t ≥ 0,
  E[X^2(t)] = VAR[X(t)] + E[X(t)]^2 = λt + λ^2 t^2.

(b) Since X(t) has stationary increments, X(t) - X(s) and X(t - s) have the same distribution, that is,

  E[X(t) - X(s)] = E[X(t - s)] = λ(t - s),
  VAR[X(t) - X(s)] = VAR[X(t - s)] = λ(t - s).

Hence,

  E[{X(t) - X(s)}^2] = VAR[X(t) - X(s)] + E[X(t) - X(s)]^2
                     = λ(t - s) + λ^2 (t - s)^2,  t > s.

Example 8
Patients arrive at the doctor's office according to a Poisson process with rate λ = 1/10 per minute. The doctor will not see a patient until at least three patients are in the waiting room.
(a) Find the expected waiting time until the first patient is admitted to see the doctor.
(b) What is the probability that nobody is admitted to see the doctor in the first hour?

Solution
(a) Let Z_n be the inter-event time between the arrivals of the n-th and (n-1)-th patients. The Z_n's are iid exponential random variables with mean 1/λ, i.e. E[Z_n] = 1/λ. Let T_n be the arrival time of the n-th patient, T_n = Σ_{i=1}^n Z_i, so

  E[T_n] = Σ_{i=1}^n E[Z_i] = n/λ.

The expected waiting time until the first patient is admitted to see the doctor is

  E[T_3] = 3/(1/10) = 30 minutes.

(b) Let X(t) be the Poisson process with mean λt. Note that

  P[X(t) = k] = (λt)^k e^{-λt} / k!,  k = 0, 1, ....

We have

  P[nobody is admitted to see the doctor in the first hour]
  = P[at most 2 patients arrive in the first 60 minutes]
  = P[X(60) ≤ 2]
  = P[X(60) = 0] + P[X(60) = 1] + P[X(60) = 2]
  = e^{-60/10} + (60/10) e^{-60/10} + ((60/10)^2 / 2!) e^{-60/10}
  = 25 e^{-6} ≈ 0.062.

Example 9
A message requires N time units to be transmitted, where N is a geometric random variable with pmf P_N(j) = (1 - a) a^{j-1}, j = 1, 2, .... A single new message arrives during a time unit with probability p, and no message arrives with probability 1 - p. Let K be the number of new messages that arrive during the transmission of a single message.
(a) Find the pmf of K.
(b) Use conditional expectation to find E[K] and E[K^2].

Solution
(a) Using the law of total probability, we have

  P[K = k] = Σ_{n=1}^∞ P[K = k | N = n] P_N(n).

Note that P[K = k | N = n] = 0 for k > n, since the number of messages arriving during the transmission time must be less than or equal to n. For n ≥ k,

  P[K = k | N = n] = C(n,k) p^k (1 - p)^{n-k},

since this is a binomial experiment with n trials and probability of success p in each trial. Hence

  P[K = k] = Σ_{n=k}^∞ C(n,k) p^k (1 - p)^{n-k} (1 - a) a^{n-1}.

Using the formula

  Σ_{n=k}^∞ C(n,k) β^{n-k} = (1 - β)^{-(k+1)},  0 < β < 1,

we then obtain

  P[K = k] = ((1 - a) / (a [1 - a(1 - p)])) [ap / (1 - a(1 - p))]^k,  k ≥ 1.

(b) We start with the conditional expectation formulas

  E[K] = Σ_{n=1}^∞ E[K | N = n] P_N(n)  and  E[K^2] = Σ_{n=1}^∞ E[K^2 | N = n] P_N(n).

Note that, given N = n, K is a binomial random variable with parameters p and n, so

  E[K | N = n] = np,
  E[K^2 | N = n] = VAR[K | N = n] + E[K | N = n]^2 = np(1 - p) + n^2 p^2.

We obtain

  E[K] = Σ_{n=1}^∞ np (1 - a) a^{n-1}

and

  E[K^2] = Σ_{n=1}^∞ [np(1 - p) + n^2 p^2] (1 - a) a^{n-1}.

Remark

To find the sums, we use the following identities:

  x + 2x^2 + 3x^3 + ... = Σ_{n=1}^∞ n x^n = x d/dx (1/(1 - x)) = x/(1 - x)^2

and, consequently,

  1 + 4x + 9x^2 + ... = Σ_{n=1}^∞ n^2 x^{n-1} = d/dx [x/(1 - x)^2] = (1 + x)/(1 - x)^3.

Setting x = a then gives E[K] = p(1 - a)/(1 - a)^2 = p/(1 - a) and E[K^2] = p(1 - p)/(1 - a) + p^2 (1 + a)/(1 - a)^2.

Example 10
Two gamblers play the following game. A fair coin is flipped; if the outcome is heads, player A pays player B $1, and if the outcome is tails player B pays player A $1. The game is continued until one of the players goes broke. Suppose that initially player A has $1 and player B has $2, so a total of $3 is up for grabs. Let X_n denote the number of dollars held by player A after n trials.
a. Show that X_n is a Markov chain.
b. Sketch the state transition diagram for X_n and give the one-step transition probability matrix P.
c. Use the state transition diagram to help you show that for n even (i.e., n = 2k),

  p_ii(n) = (1/2)^n for i = 1, 2;  p_10(n) = (2/3)[1 - (1/4)^k] = p_23(n).

d. Find the n-step transition probability matrix for n even using part c.
e. Find the limit of P^n as n → ∞.
f. Find the probability that player A eventually wins.

Solution
(a) Note that S = {0, 1, 2, 3}, and we need to verify

  P[X_{n+1} = j | X_n = i, X_{n-1} = x_{n-1}, ..., X_0 = x_0] = P[X_{n+1} = j | X_n = i]  for all i, j in S.

Consider the following three cases: i = 1, 2; i = 0; and i = 3.

When i = 1, 2:

  P[X_{n+1} = j | X_n = i, X_{n-1} = x_{n-1}, ..., X_0 = x_0]
  = P[the outcome is H] if j = i - 1;  P[the outcome is T] if j = i + 1;  0 if j ≠ i - 1, i + 1
  = 1/2 if j = i - 1;  1/2 if j = i + 1;  0 otherwise
  = P[X_{n+1} = j | X_n = i].

When i = 0:

  P[X_{n+1} = j | X_n = 0, X_{n-1} = x_{n-1}, ..., X_0 = x_0]
  = P[X_{n+1} = 0 | A has gone broke] if j = 0;  0 if j ≠ 0
  = 1 if j = 0;  0 otherwise
  = P[X_{n+1} = j | X_n = 0].

When i = 3:

  P[X_{n+1} = j | X_n = 3, X_{n-1} = x_{n-1}, ..., X_0 = x_0]
  = P[X_{n+1} = 3 | A has won] if j = 3;  0 if j ≠ 3
  = 1 if j = 3;  0 otherwise
  = P[X_{n+1} = j | X_n = 3].

In summary, X_n is a Markov chain.

(b) The one-step transition probability matrix is

  P = [  1    0    0    0  ]
      [ 1/2   0   1/2   0  ]
      [  0   1/2   0   1/2 ]
      [  0    0    0    1  ]

(c) Note that for n = 2k, which is even, the event {X_2k = 1 | X_0 = 1} requires the transition 1 → 2 followed by the transition 2 → 1, and this is repeated k = n/2 times, so

  p_11(n) = (p_12 p_21)(p_12 p_21) ... (p_12 p_21)   [k = n/2 pairs]
          = (1/2 · 1/2)^{n/2} = (1/2)^n.

Similarly,

  p_22(n) = (p_21 p_12)(p_21 p_12) ... (p_21 p_12)   [k = n/2 pairs] = (1/2)^n.

State 0 is absorbing: once entered, say at step 2(i - 1) + 1, it is retained in every step during the period [2(i - 1) + 1, n]. Hence

  {X_2k = 0 | X_0 = 1} = ∪_{i=1}^k {X_{2(i-1)} = 1 | X_0 = 1, followed by the one-step transition X_{2(i-1)+1} = 0 | X_{2(i-1)} = 1},

so

  p_10(n) = P[X_2k = 0 | X_0 = 1]
          = Σ_{i=1}^k P[X_{2(i-1)} = 1 | X_0 = 1] P[X_{2(i-1)+1} = 0 | X_{2(i-1)} = 1]
          = Σ_{i=1}^k p_11(2(i-1)) p_10
          = Σ_{i=1}^k (1/4)^{i-1} (1/2)
          = (1/2) (1 - (1/4)^k) / (1 - 1/4)
          = (2/3)[1 - (1/4)^k].

Similarly,

  p_23(n) = P[X_2k = 3 | X_0 = 2]
          = Σ_{i=1}^k P[X_{2(i-1)} = 2 | X_0 = 2] P[X_{2(i-1)+1} = 3 | X_{2(i-1)} = 2]
          = Σ_{i=1}^k p_22(2(i-1)) p_23
          = Σ_{i=1}^k (1/4)^{i-1} (1/2)
          = (2/3)[1 - (1/4)^k] = p_10(n).

(d) We compute the remaining n-step transition probabilities p_ij(n) row by row, for n even.

1st row, i = 0: since p_00(n) = 1 (state 0 always stays at state 0) and Σ_{j=0}^3 p_0j(n) = 1, we get p_01(n) = p_02(n) = p_03(n) = 0.

2nd row, i = 1: p_12(n) = 0 as n is even, and

  p_13(n) = 1 - p_10(n) - p_11(n) - p_12(n)
          = 1 - (2/3)[1 - (1/4)^k] - (1/4)^k = (1/3)[1 - (1/4)^k].

3rd row, i = 2: p_21(n) = 0 as n is even, and

  p_20(n) = 1 - p_22(n) - p_23(n)
          = 1 - (1/4)^k - (2/3)[1 - (1/4)^k] = (1/3)[1 - (1/4)^k].

4th row, i = 3: since p_33(n) = 1 (state 3 always stays at state 3) and Σ_{j=0}^3 p_3j(n) = 1, we get p_30(n) = p_31(n) = p_32(n) = 0.

In summary, for n = 2k,

  P(n) = [         1                    0        0            0          ]
         [ (2/3)[1-(1/4)^k]         (1/4)^k     0    (1/3)[1-(1/4)^k]   ]
         [ (1/3)[1-(1/4)^k]            0     (1/4)^k  (2/3)[1-(1/4)^k]   ]
         [         0                    0        0            1          ]

(e) For n = 2k + 1, that is, n odd, P(n) = P(2k + 1) = P(2k) P, which gives

  P(n) = [                1                    0            0                     0                  ]
         [ (2/3)[1-(1/4)^k] + (1/2)(1/4)^k     0     (1/2)(1/4)^k         (1/3)[1-(1/4)^k]          ]
         [ (1/3)[1-(1/4)^k]            (1/2)(1/4)^k      0       (2/3)[1-(1/4)^k] + (1/2)(1/4)^k    ]
         [                0                    0            0                     1                  ]

Note that (1/4)^k → 0 as k → ∞, so lim_{k→∞} P(2k) = lim_{k→∞} P(2k + 1), and therefore

  lim_{n→∞} P(n) = [  1   0  0   0  ]
                   [ 2/3  0  0  1/3 ]
                   [ 1/3  0  0  2/3 ]
                   [  0   0  0   1  ]

(f) Note that

  lim_{k→∞} p_13(2k) = lim_{k→∞} p_13(2k + 1) = lim_{k→∞} (1/3)[1 - (1/4)^k] = 1/3,

so lim_{n→∞} p_13(n) = 1/3. Hence

  P[player A eventually wins] = lim_{n→∞} P[X_n = 3 | X_0 = 1] = lim_{n→∞} p_13(n) = 1/3.
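Parts (d)-(f) can be verified numerically by raising the one-step matrix P from part (b) to a high power. A plain-Python sketch (no external libraries assumed):

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def mat_pow(A, n):
    """n-th power of a square matrix by repeated multiplication."""
    size = len(A)
    result = [[1.0 if i == j else 0.0 for j in range(size)] for i in range(size)]
    for _ in range(n):
        result = mat_mul(result, A)
    return result

# One-step transition matrix from part (b); states 0, 1, 2, 3 = dollars held by A.
P = [
    [1.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
]

P40 = mat_pow(P, 40)  # n = 2k with k = 20, so (1/4)^k is negligible
print(P40[1][3])      # p_13(40), close to 1/3 = P[A eventually wins]
print(P40[1][0])      # p_10(40), close to 2/3
```

The printed values agree with the closed forms (1/3)[1 - (1/4)^20] and (2/3)[1 - (1/4)^20] from parts (c) and (d).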

e.g. arrival of a customer to a service station or breakdown of a component in some system.

e.g. arrival of a customer to a service station or breakdown of a component in some system. Poisson process Events occur at random instants of time at an average rate of λ events per second. e.g. arrival of a customer to a service station or breakdown of a component in some system. Let N(t) be

More information

The Exponential Distribution

The Exponential Distribution 21 The Exponential Distribution From Discrete-Time to Continuous-Time: In Chapter 6 of the text we will be considering Markov processes in continuous time. In a sense, we already have a very good understanding

More information

IEOR 6711: Stochastic Models, I Fall 2012, Professor Whitt, Final Exam SOLUTIONS

IEOR 6711: Stochastic Models, I Fall 2012, Professor Whitt, Final Exam SOLUTIONS IEOR 6711: Stochastic Models, I Fall 2012, Professor Whitt, Final Exam SOLUTIONS There are four questions, each with several parts. 1. Customers Coming to an Automatic Teller Machine (ATM) (30 points)

More information

Chapter 4 Lecture Notes

Chapter 4 Lecture Notes Chapter 4 Lecture Notes Random Variables October 27, 2015 1 Section 4.1 Random Variables A random variable is typically a real-valued function defined on the sample space of some experiment. For instance,

More information

Math 431 An Introduction to Probability. Final Exam Solutions

Math 431 An Introduction to Probability. Final Exam Solutions Math 43 An Introduction to Probability Final Eam Solutions. A continuous random variable X has cdf a for 0, F () = for 0 <

More information

LECTURE 16. Readings: Section 5.1. Lecture outline. Random processes Definition of the Bernoulli process Basic properties of the Bernoulli process

LECTURE 16. Readings: Section 5.1. Lecture outline. Random processes Definition of the Bernoulli process Basic properties of the Bernoulli process LECTURE 16 Readings: Section 5.1 Lecture outline Random processes Definition of the Bernoulli process Basic properties of the Bernoulli process Number of successes Distribution of interarrival times The

More information

Notes on Continuous Random Variables

Notes on Continuous Random Variables Notes on Continuous Random Variables Continuous random variables are random quantities that are measured on a continuous scale. They can usually take on any value over some interval, which distinguishes

More information

Binomial lattice model for stock prices

Binomial lattice model for stock prices Copyright c 2007 by Karl Sigman Binomial lattice model for stock prices Here we model the price of a stock in discrete time by a Markov chain of the recursive form S n+ S n Y n+, n 0, where the {Y i }

More information

M/M/1 and M/M/m Queueing Systems

M/M/1 and M/M/m Queueing Systems M/M/ and M/M/m Queueing Systems M. Veeraraghavan; March 20, 2004. Preliminaries. Kendall s notation: G/G/n/k queue G: General - can be any distribution. First letter: Arrival process; M: memoryless - exponential

More information

ECE302 Spring 2006 HW4 Solutions February 6, 2006 1

ECE302 Spring 2006 HW4 Solutions February 6, 2006 1 ECE302 Spring 2006 HW4 Solutions February 6, 2006 1 Solutions to HW4 Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in

More information

Overview of Monte Carlo Simulation, Probability Review and Introduction to Matlab

Overview of Monte Carlo Simulation, Probability Review and Introduction to Matlab Monte Carlo Simulation: IEOR E4703 Fall 2004 c 2004 by Martin Haugh Overview of Monte Carlo Simulation, Probability Review and Introduction to Matlab 1 Overview of Monte Carlo Simulation 1.1 Why use simulation?

More information

Practice Problems #4

Practice Problems #4 Practice Problems #4 PRACTICE PROBLEMS FOR HOMEWORK 4 (1) Read section 2.5 of the text. (2) Solve the practice problems below. (3) Open Homework Assignment #4, solve the problems, and submit multiple-choice

More information

1. (First passage/hitting times/gambler s ruin problem:) Suppose that X has a discrete state space and let i be a fixed state. Let

1. (First passage/hitting times/gambler s ruin problem:) Suppose that X has a discrete state space and let i be a fixed state. Let Copyright c 2009 by Karl Sigman 1 Stopping Times 1.1 Stopping Times: Definition Given a stochastic process X = {X n : n 0}, a random time τ is a discrete random variable on the same probability space as

More information

Master s Theory Exam Spring 2006

Master s Theory Exam Spring 2006 Spring 2006 This exam contains 7 questions. You should attempt them all. Each question is divided into parts to help lead you through the material. You should attempt to complete as much of each problem

More information

Lectures on Stochastic Processes. William G. Faris

Lectures on Stochastic Processes. William G. Faris Lectures on Stochastic Processes William G. Faris November 8, 2001 2 Contents 1 Random walk 7 1.1 Symmetric simple random walk................... 7 1.2 Simple random walk......................... 9 1.3

More information

3.2 Roulette and Markov Chains

3.2 Roulette and Markov Chains 238 CHAPTER 3. DISCRETE DYNAMICAL SYSTEMS WITH MANY VARIABLES 3.2 Roulette and Markov Chains In this section we will be discussing an application of systems of recursion equations called Markov Chains.

More information

Homework 4 - KEY. Jeff Brenion. June 16, 2004. Note: Many problems can be solved in more than one way; we present only a single solution here.

Homework 4 - KEY. Jeff Brenion. June 16, 2004. Note: Many problems can be solved in more than one way; we present only a single solution here. Homework 4 - KEY Jeff Brenion June 16, 2004 Note: Many problems can be solved in more than one way; we present only a single solution here. 1 Problem 2-1 Since there can be anywhere from 0 to 4 aces, the

More information

Math 461 Fall 2006 Test 2 Solutions

Math 461 Fall 2006 Test 2 Solutions Math 461 Fall 2006 Test 2 Solutions Total points: 100. Do all questions. Explain all answers. No notes, books, or electronic devices. 1. [105+5 points] Assume X Exponential(λ). Justify the following two

More information

Lecture 13: Martingales

Lecture 13: Martingales Lecture 13: Martingales 1. Definition of a Martingale 1.1 Filtrations 1.2 Definition of a martingale and its basic properties 1.3 Sums of independent random variables and related models 1.4 Products of

More information

6.041/6.431 Spring 2008 Quiz 2 Wednesday, April 16, 7:30-9:30 PM. SOLUTIONS

6.041/6.431 Spring 2008 Quiz 2 Wednesday, April 16, 7:30-9:30 PM. SOLUTIONS 6.4/6.43 Spring 28 Quiz 2 Wednesday, April 6, 7:3-9:3 PM. SOLUTIONS Name: Recitation Instructor: TA: 6.4/6.43: Question Part Score Out of 3 all 36 2 a 4 b 5 c 5 d 8 e 5 f 6 3 a 4 b 6 c 6 d 6 e 6 Total

More information

Exponential Distribution

Exponential Distribution Exponential Distribution Definition: Exponential distribution with parameter λ: { λe λx x 0 f(x) = 0 x < 0 The cdf: F(x) = x Mean E(X) = 1/λ. f(x)dx = Moment generating function: φ(t) = E[e tx ] = { 1

More information

Tenth Problem Assignment

Tenth Problem Assignment EECS 40 Due on April 6, 007 PROBLEM (8 points) Dave is taking a multiple-choice exam. You may assume that the number of questions is infinite. Simultaneously, but independently, his conscious and subconscious

More information

5. Continuous Random Variables

5. Continuous Random Variables 5. Continuous Random Variables Continuous random variables can take any value in an interval. They are used to model physical characteristics such as time, length, position, etc. Examples (i) Let X be

More information

Lecture Notes 1. Brief Review of Basic Probability

Lecture Notes 1. Brief Review of Basic Probability Probability Review Lecture Notes Brief Review of Basic Probability I assume you know basic probability. Chapters -3 are a review. I will assume you have read and understood Chapters -3. Here is a very

More information

Introduction to Probability

Introduction to Probability Introduction to Probability EE 179, Lecture 15, Handout #24 Probability theory gives a mathematical characterization for experiments with random outcomes. coin toss life of lightbulb binary data sequence

More information

Probability and Random Variables. Generation of random variables (r.v.)

Probability and Random Variables. Generation of random variables (r.v.) Probability and Random Variables Method for generating random variables with a specified probability distribution function. Gaussian And Markov Processes Characterization of Stationary Random Process Linearly

More information

Notes on Probability Theory

Notes on Probability Theory Notes on Probability Theory Christopher King Department of Mathematics Northeastern University July 31, 2009 Abstract These notes are intended to give a solid introduction to Probability Theory with a

More information

Wald s Identity. by Jeffery Hein. Dartmouth College, Math 100

Wald s Identity. by Jeffery Hein. Dartmouth College, Math 100 Wald s Identity by Jeffery Hein Dartmouth College, Math 100 1. Introduction Given random variables X 1, X 2, X 3,... with common finite mean and a stopping rule τ which may depend upon the given sequence,

More information

Important Probability Distributions OPRE 6301

Important Probability Distributions OPRE 6301 Important Probability Distributions OPRE 6301 Important Distributions... Certain probability distributions occur with such regularity in real-life applications that they have been given their own names.

More information

Random variables P(X = 3) = P(X = 3) = 1 8, P(X = 1) = P(X = 1) = 3 8.

Random variables P(X = 3) = P(X = 3) = 1 8, P(X = 1) = P(X = 1) = 3 8. Random variables Remark on Notations 1. When X is a number chosen uniformly from a data set, What I call P(X = k) is called Freq[k, X] in the courseware. 2. When X is a random variable, what I call F ()

More information

Practice problems for Homework 11 - Point Estimation

Practice problems for Homework 11 - Point Estimation Practice problems for Homework 11 - Point Estimation 1. (10 marks) Suppose we want to select a random sample of size 5 from the current CS 3341 students. Which of the following strategies is the best:

More information

STAT 315: HOW TO CHOOSE A DISTRIBUTION FOR A RANDOM VARIABLE

STAT 315: HOW TO CHOOSE A DISTRIBUTION FOR A RANDOM VARIABLE STAT 315: HOW TO CHOOSE A DISTRIBUTION FOR A RANDOM VARIABLE TROY BUTLER 1. Random variables and distributions We are often presented with descriptions of problems involving some level of uncertainty about

More information

LECTURE 4. Last time: Lecture outline

LECTURE 4. Last time: Lecture outline LECTURE 4 Last time: Types of convergence Weak Law of Large Numbers Strong Law of Large Numbers Asymptotic Equipartition Property Lecture outline Stochastic processes Markov chains Entropy rate Random

More information

WORKED EXAMPLES 1 TOTAL PROBABILITY AND BAYES THEOREM

WORKED EXAMPLES 1 TOTAL PROBABILITY AND BAYES THEOREM WORKED EXAMPLES 1 TOTAL PROBABILITY AND BAYES THEOREM EXAMPLE 1. A biased coin (with probability of obtaining a Head equal to p > 0) is tossed repeatedly and independently until the first head is observed.

More information

ST 371 (IV): Discrete Random Variables

ST 371 (IV): Discrete Random Variables ST 371 (IV): Discrete Random Variables 1 Random Variables A random variable (rv) is a function that is defined on the sample space of the experiment and that assigns a numerical variable to each possible

More information

ECE302 Spring 2006 HW3 Solutions February 2, 2006 1

ECE302 Spring 2006 HW3 Solutions February 2, 2006 1 ECE302 Spring 2006 HW3 Solutions February 2, 2006 1 Solutions to HW3 Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in

More information

Practice Problems for Homework #8. Markov Chains. Read Sections 7.1-7.3. Solve the practice problems below.

Practice Problems for Homework #8. Markov Chains. Read Sections 7.1-7.3. Solve the practice problems below. Practice Problems for Homework #8. Markov Chains. Read Sections 7.1-7.3 Solve the practice problems below. Open Homework Assignment #8 and solve the problems. 1. (10 marks) A computer system can operate

More information

Probability density function : An arbitrary continuous random variable X is similarly described by its probability density function f x = f X

Probability density function : An arbitrary continuous random variable X is similarly described by its probability density function f x = f X Week 6 notes : Continuous random variables and their probability densities WEEK 6 page 1 uniform, normal, gamma, exponential,chi-squared distributions, normal approx'n to the binomial Uniform [,1] random

More information

Queueing Systems. Ivo Adan and Jacques Resing

Queueing Systems. Ivo Adan and Jacques Resing Queueing Systems Ivo Adan and Jacques Resing Department of Mathematics and Computing Science Eindhoven University of Technology P.O. Box 513, 5600 MB Eindhoven, The Netherlands March 26, 2015 Contents

More information

The sample space for a pair of die rolls is the set. The sample space for a random number between 0 and 1 is the interval [0, 1].

The sample space for a pair of die rolls is the set. The sample space for a random number between 0 and 1 is the interval [0, 1]. Probability Theory Probability Spaces and Events Consider a random experiment with several possible outcomes. For example, we might roll a pair of dice, flip a coin three times, or choose a random real

More information

Joint Exam 1/P Sample Exam 1

Joint Exam 1/P Sample Exam 1 Joint Exam 1/P Sample Exam 1 Take this practice exam under strict exam conditions: Set a timer for 3 hours; Do not stop the timer for restroom breaks; Do not look at your notes. If you believe a question

More information

1 Gambler s Ruin Problem

1 Gambler s Ruin Problem Coyright c 2009 by Karl Sigman 1 Gambler s Ruin Problem Let N 2 be an integer and let 1 i N 1. Consider a gambler who starts with an initial fortune of $i and then on each successive gamble either wins

More information

IEOR 6711: Stochastic Models I Fall 2012, Professor Whitt, Tuesday, September 11 Normal Approximations and the Central Limit Theorem

IEOR 6711: Stochastic Models I Fall 2012, Professor Whitt, Tuesday, September 11 Normal Approximations and the Central Limit Theorem IEOR 6711: Stochastic Models I Fall 2012, Professor Whitt, Tuesday, September 11 Normal Approximations and the Central Limit Theorem Time on my hands: Coin tosses. Problem Formulation: Suppose that I have

More information

Fourth Problem Assignment

Fourth Problem Assignment EECS 401 Due on Feb 2, 2007 PROBLEM 1 (25 points) Joe and Helen each know that the a priori probability that her mother will be home on any given night is 0.6. However, Helen can determine her mother s

More information

FEGYVERNEKI SÁNDOR, PROBABILITY THEORY AND MATHEmATICAL

FEGYVERNEKI SÁNDOR, PROBABILITY THEORY AND MATHEmATICAL FEGYVERNEKI SÁNDOR, PROBABILITY THEORY AND MATHEmATICAL STATIsTICs 4 IV. RANDOm VECTORs 1. JOINTLY DIsTRIBUTED RANDOm VARIABLEs If are two rom variables defined on the same sample space we define the joint

More information

For a partition B 1,..., B n, where B i B j = for i. A = (A B 1 ) (A B 2 ),..., (A B n ) and thus. P (A) = P (A B i ) = P (A B i )P (B i )

For a partition B 1,..., B n, where B i B j = for i. A = (A B 1 ) (A B 2 ),..., (A B n ) and thus. P (A) = P (A B i ) = P (A B i )P (B i ) Probability Review 15.075 Cynthia Rudin A probability space, defined by Kolmogorov (1903-1987) consists of: A set of outcomes S, e.g., for the roll of a die, S = {1, 2, 3, 4, 5, 6}, 1 1 2 1 6 for the roll

More information

Solved Problems. Chapter 14. 14.1 Probability review

Solved Problems. Chapter 14. 14.1 Probability review Chapter 4 Solved Problems 4. Probability review Problem 4.. Let X and Y be two N 0 -valued random variables such that X = Y + Z, where Z is a Bernoulli random variable with parameter p (0, ), independent

More information

On the mathematical theory of splitting and Russian roulette

On the mathematical theory of splitting and Russian roulette On the mathematical theory of splitting and Russian roulette techniques St.Petersburg State University, Russia 1. Introduction Splitting is an universal and potentially very powerful technique for increasing

More information

Section 5.1 Continuous Random Variables: Introduction

Section 5.1 Continuous Random Variables: Introduction Section 5. Continuous Random Variables: Introduction Not all random variables are discrete. For example:. Waiting times for anything (train, arrival of customer, production of mrna molecule from gene,

More information

Homework set 4 - Solutions

Homework set 4 - Solutions Homework set 4 - Solutions Math 495 Renato Feres Problems R for continuous time Markov chains The sequence of random variables of a Markov chain may represent the states of a random system recorded at

More information

Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. Part 3: Discrete Uniform Distribution Binomial Distribution

Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. Part 3: Discrete Uniform Distribution Binomial Distribution Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS Part 3: Discrete Uniform Distribution Binomial Distribution Sections 3-5, 3-6 Special discrete random variable distributions we will cover

More information

Statistics 100A Homework 8 Solutions

Statistics 100A Homework 8 Solutions Part : Chapter 7 Statistics A Homework 8 Solutions Ryan Rosario. A player throws a fair die and simultaneously flips a fair coin. If the coin lands heads, then she wins twice, and if tails, the one-half

More information

Summary of Formulas and Concepts. Descriptive Statistics (Ch. 1-4)

Summary of Formulas and Concepts. Descriptive Statistics (Ch. 1-4) Summary of Formulas and Concepts Descriptive Statistics (Ch. 1-4) Definitions Population: The complete set of numerical information on a particular quantity in which an investigator is interested. We assume

More information

The Joint Distribution of Server State and Queue Length of M/M/1/1 Retrial Queue with Abandonment and Feedback

The Joint Distribution of Server State and Queue Length of M/M/1/1 Retrial Queue with Abandonment and Feedback The Joint Distribution of Server State and Queue Length of M/M/1/1 Retrial Queue with Abandonment and Feedback Hamada Alshaer Université Pierre et Marie Curie - Lip 6 7515 Paris, France Hamada.alshaer@lip6.fr

More information

Random variables, probability distributions, binomial random variable

Random variables, probability distributions, binomial random variable Week 4 lecture notes. WEEK 4 page 1 Random variables, probability distributions, binomial random variable Eample 1 : Consider the eperiment of flipping a fair coin three times. The number of tails that

More information

Sums of Independent Random Variables

Sums of Independent Random Variables Chapter 7 Sums of Independent Random Variables 7.1 Sums of Discrete Random Variables In this chapter we turn to the important question of determining the distribution of a sum of independent random variables

More information

What is Statistics? Lecture 1. Introduction and probability review. Idea of parametric inference

What is Statistics? Lecture 1. Introduction and probability review. Idea of parametric inference 0. 1. Introduction and probability review 1.1. What is Statistics? What is Statistics? Lecture 1. Introduction and probability review There are many definitions: I will use A set of principle and procedures

More information

SOME ASPECTS OF GAMBLING WITH THE KELLY CRITERION. School of Mathematical Sciences. Monash University, Clayton, Victoria, Australia 3168

SOME ASPECTS OF GAMBLING WITH THE KELLY CRITERION. School of Mathematical Sciences. Monash University, Clayton, Victoria, Australia 3168 SOME ASPECTS OF GAMBLING WITH THE KELLY CRITERION Ravi PHATARFOD School of Mathematical Sciences Monash University, Clayton, Victoria, Australia 3168 In this paper we consider the problem of gambling with

More information

2WB05 Simulation Lecture 8: Generating random variables

2WB05 Simulation Lecture 8: Generating random variables 2WB05 Simulation Lecture 8: Generating random variables Marko Boon http://www.win.tue.nl/courses/2wb05 January 7, 2013 Outline 2/36 1. How do we generate random variables? 2. Fitting distributions Generating

More information

UNIT 2 QUEUING THEORY

UNIT 2 QUEUING THEORY UNIT 2 QUEUING THEORY LESSON 24 Learning Objective: Apply formulae to find solution that will predict the behaviour of the single server model II. Apply formulae to find solution that will predict the

More information

Math 370/408, Spring 2008 Prof. A.J. Hildebrand. Actuarial Exam Practice Problem Set 2 Solutions

Math 370/408, Spring 2008 Prof. A.J. Hildebrand. Actuarial Exam Practice Problem Set 2 Solutions Math 70/408, Spring 2008 Prof. A.J. Hildebrand Actuarial Exam Practice Problem Set 2 Solutions About this problem set: These are problems from Course /P actuarial exams that I have collected over the years,

More information

Queuing Theory. Long Term Averages. Assumptions. Interesting Values. Queuing Model

Queuing Theory. Long Term Averages. Assumptions. Interesting Values. Queuing Model Queuing Theory Queuing Theory Queuing theory is the mathematics of waiting lines. It is extremely useful in predicting and evaluating system performance. Queuing theory has been used for operations research.

More information

Random access protocols for channel access. Markov chains and their stability. Laurent Massoulié.

Random access protocols for channel access. Markov chains and their stability. Laurent Massoulié. Random access protocols for channel access Markov chains and their stability laurent.massoulie@inria.fr Aloha: the first random access protocol for channel access [Abramson, Hawaii 70] Goal: allow machines

More information

Statistics 100A Homework 3 Solutions

Statistics 100A Homework 3 Solutions Chapter Statistics 00A Homework Solutions Ryan Rosario. Two balls are chosen randomly from an urn containing 8 white, black, and orange balls. Suppose that we win $ for each black ball selected and we

More information

9.2 Summation Notation

9.2 Summation Notation 9. Summation Notation 66 9. Summation Notation In the previous section, we introduced sequences and now we shall present notation and theorems concerning the sum of terms of a sequence. We begin with a

More information

THE CENTRAL LIMIT THEOREM TORONTO

THE CENTRAL LIMIT THEOREM TORONTO THE CENTRAL LIMIT THEOREM DANIEL RÜDT UNIVERSITY OF TORONTO MARCH, 2010 Contents 1 Introduction 1 2 Mathematical Background 3 3 The Central Limit Theorem 4 4 Examples 4 4.1 Roulette......................................

More information

Probability Generating Functions

Probability Generating Functions page 39 Chapter 3 Probability Generating Functions 3 Preamble: Generating Functions Generating functions are widely used in mathematics, and play an important role in probability theory Consider a sequence

More information

REPEATED TRIALS. The probability of winning those k chosen times and losing the other times is then p k q n k.

REPEATED TRIALS. The probability of winning those k chosen times and losing the other times is then p k q n k. REPEATED TRIALS Suppose you toss a fair coin one time. Let E be the event that the coin lands heads. We know from basic counting that p(e) = 1 since n(e) = 1 and 2 n(s) = 2. Now suppose we play a game

More information

Chapter 5. Random variables

Chapter 5. Random variables Random variables random variable numerical variable whose value is the outcome of some probabilistic experiment; we use uppercase letters, like X, to denote such a variable and lowercase letters, like

More information

Lecture 7: Continuous Random Variables

Lecture 7: Continuous Random Variables Lecture 7: Continuous Random Variables 21 September 2005 1 Our First Continuous Random Variable The back of the lecture hall is roughly 10 meters across. Suppose it were exactly 10 meters, and consider

More information

Lecture 6: Discrete & Continuous Probability and Random Variables

Lecture 6: Discrete & Continuous Probability and Random Variables Lecture 6: Discrete & Continuous Probability and Random Variables D. Alex Hughes Math Camp September 17, 2015 D. Alex Hughes (Math Camp) Lecture 6: Discrete & Continuous Probability and Random September

More information

ISyE 6761 Fall 2012 Homework #2 Solutions

ISyE 6761 Fall 2012 Homework #2 Solutions 1 1. The joint p.m.f. of X and Y is (a) Find E[X Y ] for 1, 2, 3. (b) Find E[E[X Y ]]. (c) Are X and Y independent? ISE 6761 Fall 212 Homework #2 Solutions f(x, ) x 1 x 2 x 3 1 1/9 1/3 1/9 2 1/9 1/18 3

More information

2. Discrete random variables

2. Discrete random variables 2. Discrete random variables Statistics and probability: 2-1 If the chance outcome of the experiment is a number, it is called a random variable. Discrete random variable: the possible outcomes can be

More information

Stochastic Processes and Queueing Theory used in Cloud Computer Performance Simulations

Stochastic Processes and Queueing Theory used in Cloud Computer Performance Simulations 56 Stochastic Processes and Queueing Theory used in Cloud Computer Performance Simulations Stochastic Processes and Queueing Theory used in Cloud Computer Performance Simulations Florin-Cătălin ENACHE

More information

Optimal Hiring of Cloud Servers A. Stephen McGough, Isi Mitrani. EPEW 2014, Florence

Optimal Hiring of Cloud Servers A. Stephen McGough, Isi Mitrani. EPEW 2014, Florence Optimal Hiring of Cloud Servers A. Stephen McGough, Isi Mitrani EPEW 2014, Florence Scenario How many cloud instances should be hired? Requests Host hiring servers The number of active servers is controlled

More information

Ex. 2.1 (Davide Basilio Bartolini)

Ex. 2.1 (Davide Basilio Bartolini) ECE 54: Elements of Information Theory, Fall 00 Homework Solutions Ex.. (Davide Basilio Bartolini) Text Coin Flips. A fair coin is flipped until the first head occurs. Let X denote the number of flips

More information

A Tutorial on Probability Theory

A Tutorial on Probability Theory Paola Sebastiani Department of Mathematics and Statistics University of Massachusetts at Amherst Corresponding Author: Paola Sebastiani. Department of Mathematics and Statistics, University of Massachusetts,

More information

Math 120 Final Exam Practice Problems, Form: A

Math 120 Final Exam Practice Problems, Form: A Math 120 Final Exam Practice Problems, Form: A Name: While every attempt was made to be complete in the types of problems given below, we make no guarantees about the completeness of the problems. Specifically,

More information

Math/Stats 342: Solutions to Homework

Math/Stats 342: Solutions to Homework Math/Stats 342: Solutions to Homework Steven Miller (sjm1@williams.edu) November 17, 2011 Abstract Below are solutions / sketches of solutions to the homework problems from Math/Stats 342: Probability

More information

Markov Chains. Chapter 4. 4.1 Stochastic Processes

Markov Chains. Chapter 4. 4.1 Stochastic Processes Chapter 4 Markov Chains 4 Stochastic Processes Often, we need to estimate probabilities using a collection of random variables For example, an actuary may be interested in estimating the probability that

More information

CITY UNIVERSITY LONDON. BEng Degree in Computer Systems Engineering Part II BSc Degree in Computer Systems Engineering Part III PART 2 EXAMINATION

CITY UNIVERSITY LONDON. BEng Degree in Computer Systems Engineering Part II BSc Degree in Computer Systems Engineering Part III PART 2 EXAMINATION No: CITY UNIVERSITY LONDON BEng Degree in Computer Systems Engineering Part II BSc Degree in Computer Systems Engineering Part III PART 2 EXAMINATION ENGINEERING MATHEMATICS 2 (resit) EX2005 Date: August

More information

Stats on the TI 83 and TI 84 Calculator

Stats on the TI 83 and TI 84 Calculator Stats on the TI 83 and TI 84 Calculator Entering the sample values STAT button Left bracket { Right bracket } Store (STO) List L1 Comma Enter Example: Sample data are {5, 10, 15, 20} 1. Press 2 ND and

More information

0 x = 0.30 x = 1.10 x = 3.05 x = 4.15 x = 6 0.4 x = 12. f(x) =

0 x = 0.30 x = 1.10 x = 3.05 x = 4.15 x = 6 0.4 x = 12. f(x) = . A mail-order computer business has si telephone lines. Let X denote the number of lines in use at a specified time. Suppose the pmf of X is as given in the accompanying table. 0 2 3 4 5 6 p(.0.5.20.25.20.06.04

More information

Stochastic Processes and Advanced Mathematical Finance. Duration of the Gambler s Ruin

Stochastic Processes and Advanced Mathematical Finance. Duration of the Gambler s Ruin Steven R. Dunbar Department of Mathematics 203 Avery Hall University of Nebraska-Lincoln Lincoln, NE 68588-0130 http://www.math.unl.edu Voice: 402-472-3731 Fax: 402-472-8466 Stochastic Processes and Advanced

More information

Math 370, Spring 2008 Prof. A.J. Hildebrand. Practice Test 2 Solutions

Math 370, Spring 2008 Prof. A.J. Hildebrand. Practice Test 2 Solutions Math 370, Spring 008 Prof. A.J. Hildebrand Practice Test Solutions About this test. This is a practice test made up of a random collection of 5 problems from past Course /P actuarial exams. Most of the

More information

EXAM 3, FALL 003 Please note: On a one-time basis, the CAS is releasing annotated solutions to Fall 003 Examination 3 as a study aid to candidates. It is anticipated that for future sittings, only the

More information

Math 151. Rumbos Spring 2014 1. Solutions to Assignment #22

Math 151. Rumbos Spring 2014 1. Solutions to Assignment #22 Math 151. Rumbos Spring 2014 1 Solutions to Assignment #22 1. An experiment consists of rolling a die 81 times and computing the average of the numbers on the top face of the die. Estimate the probability

More information

Question: What is the probability that a five-card poker hand contains a flush, that is, five cards of the same suit?

Question: What is the probability that a five-card poker hand contains a flush, that is, five cards of the same suit? ECS20 Discrete Mathematics Quarter: Spring 2007 Instructor: John Steinberger Assistant: Sophie Engle (prepared by Sophie Engle) Homework 8 Hints Due Wednesday June 6 th 2007 Section 6.1 #16 What is the

More information

Hydrodynamic Limits of Randomized Load Balancing Networks

Hydrodynamic Limits of Randomized Load Balancing Networks Hydrodynamic Limits of Randomized Load Balancing Networks Kavita Ramanan and Mohammadreza Aghajani Brown University Stochastic Networks and Stochastic Geometry a conference in honour of François Baccelli

More information

Math 202-0 Quizzes Winter 2009

Math 202-0 Quizzes Winter 2009 Quiz : Basic Probability Ten Scrabble tiles are placed in a bag Four of the tiles have the letter printed on them, and there are two tiles each with the letters B, C and D on them (a) Suppose one tile

More information

Representation of functions as power series

Representation of functions as power series Representation of functions as power series Dr. Philippe B. Laval Kennesaw State University November 9, 008 Abstract This document is a summary of the theory and techniques used to represent functions

More information

Discrete Math in Computer Science Homework 7 Solutions (Max Points: 80)

Discrete Math in Computer Science Homework 7 Solutions (Max Points: 80) Discrete Math in Computer Science Homework 7 Solutions (Max Points: 80) CS 30, Winter 2016 by Prasad Jayanti 1. (10 points) Here is the famous Monty Hall Puzzle. Suppose you are on a game show, and you

More information

Simple Markovian Queueing Systems

Simple Markovian Queueing Systems Chapter 4 Simple Markovian Queueing Systems Poisson arrivals and exponential service make queueing models Markovian that are easy to analyze and get usable results. Historically, these are also the models

More information

UNIT I: RANDOM VARIABLES PART- A -TWO MARKS

UNIT I: RANDOM VARIABLES PART- A -TWO MARKS UNIT I: RANDOM VARIABLES PART- A -TWO MARKS 1. Given the probability density function of a continuous random variable X as follows f(x) = 6x (1-x) 0

More information

Math 425 (Fall 08) Solutions Midterm 2 November 6, 2008

Math 425 (Fall 08) Solutions Midterm 2 November 6, 2008 Math 425 (Fall 8) Solutions Midterm 2 November 6, 28 (5 pts) Compute E[X] and Var[X] for i) X a random variable that takes the values, 2, 3 with probabilities.2,.5,.3; ii) X a random variable with the

More information

MATH 4330/5330, Fourier Analysis Section 11, The Discrete Fourier Transform

MATH 4330/5330, Fourier Analysis Section 11, The Discrete Fourier Transform MATH 433/533, Fourier Analysis Section 11, The Discrete Fourier Transform Now, instead of considering functions defined on a continuous domain, like the interval [, 1) or the whole real line R, we wish

More information

Principle of Data Reduction

Principle of Data Reduction Chapter 6 Principle of Data Reduction 6.1 Introduction An experimenter uses the information in a sample X 1,..., X n to make inferences about an unknown parameter θ. If the sample size n is large, then

More information

STT315 Chapter 4 Random Variables & Probability Distributions KM. Chapter 4.5, 6, 8 Probability Distributions for Continuous Random Variables

STT315 Chapter 4 Random Variables & Probability Distributions KM. Chapter 4.5, 6, 8 Probability Distributions for Continuous Random Variables Chapter 4.5, 6, 8 Probability Distributions for Continuous Random Variables Discrete vs. continuous random variables Examples of continuous distributions o Uniform o Exponential o Normal Recall: A random

More information

6.263/16.37: Lectures 5 & 6 Introduction to Queueing Theory

6.263/16.37: Lectures 5 & 6 Introduction to Queueing Theory 6.263/16.37: Lectures 5 & 6 Introduction to Queueing Theory Massachusetts Institute of Technology Slide 1 Packet Switched Networks Messages broken into Packets that are routed To their destination PS PS

More information

Generating Random Numbers Variance Reduction Quasi-Monte Carlo. Simulation Methods. Leonid Kogan. MIT, Sloan. 15.450, Fall 2010

Generating Random Numbers Variance Reduction Quasi-Monte Carlo. Simulation Methods. Leonid Kogan. MIT, Sloan. 15.450, Fall 2010 Simulation Methods Leonid Kogan MIT, Sloan 15.450, Fall 2010 c Leonid Kogan ( MIT, Sloan ) Simulation Methods 15.450, Fall 2010 1 / 35 Outline 1 Generating Random Numbers 2 Variance Reduction 3 Quasi-Monte

More information