Worked Examples: Random Processes

Example 1
Consider patients coming to a doctor's office at random points in time. Let $X_n$ denote the time (in hours) that the $n$th patient has to wait before being admitted to see the doctor.
(a) Describe the random process $X_n$, $n \ge 1$.
(b) Sketch a typical sample path of $X_n$.

Solution
(a) The random process $X_n$ is a discrete-time, continuous-valued random process. The sample space is $S_X = \{x : x \ge 0\}$ and the index parameter set (domain of time) is $I = \{1, 2, 3, \ldots\}$.
(b) A typical sample path: [figure not reproduced in this transcription]

Example 2
Let $\{X_n, n = 1, 2, \ldots\}$ be a sequence of independent random variables with $S_{X_n} = \{0, 1\}$, $P[X_n = 0] = \frac{2}{3}$ and $P[X_n = 1] = \frac{1}{3}$. Let $Y_n = \sum_{i=1}^{n} X_i$.
(a) Find the first-order pmf of $Y_n$.
(b) Find $m_Y(n)$, $R_Y(n, n+k)$ and $C_Y(n, n+k)$.

Solution
Note that $X_n$ is an iid random process and, for each fixed $n$, $X_n$ is a Bernoulli random variable with $p = \frac{1}{3}$. We have
$$E[X_n] = p = \frac{1}{3}, \qquad \mathrm{VAR}[X_n] = p(1-p) = \frac{2}{9}, \qquad E[X_n^2] = \mathrm{VAR}[X_n] + E[X_n]^2 = \frac{1}{3}.$$
(a) For each fixed $n$, $Y_n$ is a binomial random variable with parameters $n$ and $p = \frac{1}{3}$. We obtain
$$P[Y_n = k] = \binom{n}{k}\left(\frac{1}{3}\right)^k \left(\frac{2}{3}\right)^{n-k}, \qquad k = 0, 1, \ldots, n.$$
(b) First, $m_Y(n) = E[Y_n] = np = \frac{n}{3}$. Next,
$$R_Y(n, n+k) = E[Y_n Y_{n+k}] = E\left[\sum_{i=1}^{n} X_i \sum_{j=1}^{n+k} X_j\right] = \sum_{i=1}^{n}\sum_{\substack{j=1 \\ j \neq i}}^{n+k} E[X_i]E[X_j] + \sum_{i=1}^{n} E[X_i^2] = \frac{n(n+k) - n}{9} + \frac{n}{3} = \frac{n(n+k+2)}{9},$$
$$C_Y(n, n+k) = R_Y(n, n+k) - m_Y(n)\, m_Y(n+k) = \frac{n(n+k+2)}{9} - \frac{n}{3}\cdot\frac{n+k}{3} = \frac{2n}{9}.$$
One can deduce that $C_Y(n, m) = \frac{2\min(n, m)}{9}$.
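These closed forms are easy to sanity-check by simulation. The following is a minimal Monte Carlo sketch, not part of the original solution; the pair $(n, k) = (5, 3)$, the seed and the sample size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
trials, n, k = 200_000, 5, 3

# Build Y_n and Y_{n+k} from the same underlying Bernoulli(1/3) sequence.
X = rng.random((trials, n + k)) < 1 / 3   # X_i = 1 with probability 1/3
Y_n = X[:, :n].sum(axis=1)                # Y_n = X_1 + ... + X_n
Y_nk = X.sum(axis=1)                      # Y_{n+k} = X_1 + ... + X_{n+k}

cov_est = np.mean(Y_n * Y_nk) - Y_n.mean() * Y_nk.mean()
print(f"estimated C_Y({n},{n + k}) = {cov_est:.4f}, theory 2n/9 = {2 * n / 9:.4f}")
```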

Example 3
The number of failures $N(t)$ which occur in a computer network over the time interval $[0, t)$ can be described by a homogeneous Poisson process $\{N(t), t \ge 0\}$. On average there is a failure after every 4 hours, i.e. the intensity of the process is $\lambda = 0.25\,[\mathrm{h}^{-1}]$.
(a) What is the probability of at most 1 failure in $[0, 8)$, at least 2 failures in $[8, 16)$, and at most 1 failure in $[16, 24)$ (time unit: hour)?
(b) What is the probability that the third failure occurs after 8 hours?

Solution
(a) The probability
$$p = P[N(8) - N(0) \le 1,\; N(16) - N(8) \ge 2,\; N(24) - N(16) \le 1]$$
is required. In view of the independence and the homogeneity of the increments of a homogeneous Poisson process, it can be determined as follows:
$$p = P[N(8) - N(0) \le 1]\, P[N(16) - N(8) \ge 2]\, P[N(24) - N(16) \le 1] = P[N(8) \le 1]\, P[N(8) \ge 2]\, P[N(8) \le 1].$$
Since $\lambda \cdot 8 = 2$,
$$P[N(8) \le 1] = P[N(8) = 0] + P[N(8) = 1] = e^{-2} + 2e^{-2} = 3e^{-2} \approx 0.406$$
and
$$P[N(8) \ge 2] = 1 - P[N(8) \le 1] \approx 0.594,$$
the desired probability is $p \approx 0.406 \times 0.594 \times 0.406 \approx 0.098$.
(b) Let $T_3$ be the time of arrival of the third failure. Then
$$P[T_3 > 8] = P[\text{at most two failures in the first 8 hours}] = \sum_{i=0}^{2} e^{-0.25 \cdot 8}\,\frac{(0.25 \cdot 8)^i}{i!} = e^{-2}\left(1 + 2 + \frac{2^2}{2!}\right) = 5e^{-2} \approx 0.677.$$
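Both answers are exact Poisson computations, so they can be reproduced with a few lines of arithmetic. A small sketch (the helper name poisson_cdf is mine, not from the original):

```python
from math import exp, factorial

lam, t = 0.25, 8.0
mu = lam * t  # mean number of failures in an 8-hour window (= 2)

def poisson_cdf(k, mu):
    """P[N <= k] for N ~ Poisson(mu)."""
    return sum(exp(-mu) * mu**i / factorial(i) for i in range(k + 1))

p_at_most_1 = poisson_cdf(1, mu)     # 3 e^{-2} ~ 0.406
p_at_least_2 = 1 - p_at_most_1       # ~ 0.594
print("part (a):", p_at_most_1 * p_at_least_2 * p_at_most_1)  # ~ 0.098
print("part (b):", poisson_cdf(2, mu))                        # 5 e^{-2} ~ 0.677
```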

Example 4 (random telegraph signal)
Let a random signal $X(t)$ have the structure
$$X(t) = (-1)^{N(t)}\, Y, \qquad t \ge 0,$$
where $\{N(t), t \ge 0\}$ is a homogeneous Poisson process with intensity $\lambda$ and $Y$ is a binary random variable with $P(Y = 1) = P(Y = -1) = \frac{1}{2}$ which is independent of $N(t)$ for all $t$. Signals of this structure are called random telegraph signals, and they are basic modules for generating signals with a more complicated structure. Obviously, $X(t) = 1$ or $X(t) = -1$, and $Y$ determines the sign of $X(0)$.

Since $E[X^2(t)] = 1 < \infty$ for all $t \ge 0$, the stochastic process $\{X(t), t \ge 0\}$ is a second-order process. Letting $I(t) = (-1)^{N(t)}$, its trend function is $m(t) = E[X(t)] = E[Y]\,E[I(t)]$. Since $E[Y] = 0$, the trend function is identically zero: $m(t) \equiv 0$.

It remains to show that the covariance function $C(s, t)$ of this process depends only on $|t - s|$. This requires the probability distribution of $I(t)$. A transition from $I(t) = -1$ to $I(t) = +1$ or, conversely, from $I(t) = +1$ to $I(t) = -1$ occurs at those time points where Poisson events occur, i.e. where $N(t)$ jumps. Hence
$$P(I(t) = 1) = P(\text{even number of jumps in } (0, t]) = e^{-\lambda t}\sum_{i=0}^{\infty}\frac{(\lambda t)^{2i}}{(2i)!} = e^{-\lambda t}\cosh \lambda t,$$
$$P(I(t) = -1) = P(\text{odd number of jumps in } (0, t]) = e^{-\lambda t}\sum_{i=0}^{\infty}\frac{(\lambda t)^{2i+1}}{(2i+1)!} = e^{-\lambda t}\sinh \lambda t,$$
so the expected value of $I(t)$ is
$$E[I(t)] = 1 \cdot P(I(t) = 1) + (-1) \cdot P(I(t) = -1) = e^{-\lambda t}\,[\cosh \lambda t - \sinh \lambda t] = e^{-2\lambda t}.$$
Since
$$C(s, t) = \mathrm{COV}[X(s), X(t)] = E[X(s)X(t)] = E[Y I(s)\, Y I(t)] = E[Y^2]\,E[I(s)I(t)]$$
and $E[Y^2] = 1$, we have $C(s, t) = E[I(s)I(t)]$. Thus, in order to evaluate $C(s, t)$, the joint distribution of the random vector $(I(s), I(t))$ must be determined.

In view of the homogeneity of the increments of $\{N(t), t \ge 0\}$, for $s < t$,
$$p_{1,1} = P(I(s) = 1, I(t) = 1) = P(I(s) = 1)\, P(I(t) = 1 \mid I(s) = 1) = e^{-\lambda s}\cosh \lambda s \cdot P(\text{even number of jumps in } (s, t])$$
$$= e^{-\lambda s}\cosh \lambda s \cdot e^{-\lambda(t-s)}\cosh \lambda(t-s) = e^{-\lambda t}\cosh \lambda s \,\cosh \lambda(t-s).$$
Analogously,
$$p_{1,-1} = P(I(s) = 1, I(t) = -1) = e^{-\lambda t}\cosh \lambda s \,\sinh \lambda(t-s),$$
$$p_{-1,1} = P(I(s) = -1, I(t) = 1) = e^{-\lambda t}\sinh \lambda s \,\sinh \lambda(t-s),$$
$$p_{-1,-1} = P(I(s) = -1, I(t) = -1) = e^{-\lambda t}\sinh \lambda s \,\cosh \lambda(t-s).$$
Since
$$E[I(s)I(t)] = p_{1,1} + p_{-1,-1} - p_{1,-1} - p_{-1,1} = e^{-\lambda t}(\cosh \lambda s + \sinh \lambda s)(\cosh \lambda(t-s) - \sinh \lambda(t-s)),$$
we obtain
$$C(s, t) = e^{-\lambda t}\, e^{\lambda s}\, e^{-\lambda(t-s)} = e^{-2\lambda(t-s)}, \qquad s < t.$$
Note that the order of $s$ and $t$ can be interchanged, so that in general
$$C(s, t) = e^{-2\lambda |t-s|}.$$
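The covariance formula can be checked numerically. Below is a minimal simulation sketch, not part of the original text; $\lambda$, the two time points and the sample size are arbitrary choices, and the Poisson increments are sampled directly rather than as a full sample path.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, s, t, trials = 0.7, 1.0, 2.5, 500_000

# X(u) = (-1)^{N(u)} Y with N a Poisson process and Y = +/-1 equally likely.
Y = rng.choice([-1, 1], size=trials)
N_s = rng.poisson(lam * s, size=trials)              # jumps in (0, s]
N_t = N_s + rng.poisson(lam * (t - s), size=trials)  # independent increment on (s, t]
X_s = (-1.0) ** N_s * Y
X_t = (-1.0) ** N_t * Y

print("C(s,t) estimate:", np.mean(X_s * X_t))  # m(t) = 0, so E[X(s)X(t)] = C(s,t)
print("theory exp(-2*lam*(t-s)):", np.exp(-2 * lam * (t - s)))
```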

Example 5
A random process is defined by $X(t) = T + (1 - t)$, where $T$ is a uniform random variable in $(0, 1)$.
(a) Find the cdf of $X(t)$.
(b) Find $m_X(t)$ and $C_X(t_1, t_2)$.

Solution
(a) Given that $X(t) = T + (1 - t)$, where $T$ is uniformly distributed over $(0, 1)$, we have
$$P[X(t) \le x] = P[T \le x - (1 - t)], \qquad \text{where } P[T \le y] = \begin{cases} 0, & y < 0 \\ y, & 0 < y < 1 \\ 1, & y > 1. \end{cases}$$
Writing $y = x - (1 - t)$, we obtain
$$F_X(x) = P[X(t) \le x] = \begin{cases} 0, & x < 1 - t \\ x - (1 - t), & 1 - t < x < 2 - t \\ 1, & x > 2 - t, \end{cases} \qquad f_X(x) = \frac{d}{dx}F_X(x) = \begin{cases} 1, & 1 - t < x < 2 - t \\ 0, & \text{otherwise.} \end{cases}$$
(b) Note that $E[T] = \frac{1}{2}$ and $E[T^2] = \frac{1}{3}$. Then
$$m_X(t) = \int_{1-t}^{2-t} x\, dx = \frac{x^2}{2}\Big|_{1-t}^{2-t} = \frac{3}{2} - t.$$
Alternatively, $m_X(t) = E[X(t)] = E[T + (1 - t)] = \frac{1}{2} + 1 - t = \frac{3}{2} - t$.
Define
$$R_X(t_1, t_2) = E[\{T + (1 - t_1)\}\{T + (1 - t_2)\}] = E[T^2] + (2 - t_1 - t_2)E[T] + (1 - t_1)(1 - t_2)$$
and observe
$$C_X(t_1, t_2) = R_X(t_1, t_2) - m_X(t_1)m_X(t_2) = \frac{1}{3} + \frac{2 - t_1 - t_2}{2} + (1 - t_1)(1 - t_2) - \left(\frac{3}{2} - t_1\right)\left(\frac{3}{2} - t_2\right) = \frac{1}{12}.$$
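The constant answer is no accident: $X(t)$ is $T$ plus a deterministic shift, so $C_X(t_1, t_2) = \mathrm{VAR}[T] = \frac{1}{12}$ for all $t_1, t_2$. A short Monte Carlo check of my own (sample times and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
T = rng.random(500_000)   # T ~ Uniform(0, 1)
t1, t2 = 0.3, 1.7         # arbitrary sample times

X1 = T + (1 - t1)         # X(t1) for each realization of T
X2 = T + (1 - t2)
cov = np.mean(X1 * X2) - X1.mean() * X2.mean()
print("estimate:", cov, " theory:", 1 / 12)
```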

Example 6
Customers arrive at a service station (service system, queueing system) according to a homogeneous Poisson process $\{N(t), t \ge 0\}$ with intensity $\lambda$; the arrival of a customer is therefore a Poisson event. The number of servers in the system is assumed to be so large that an incoming customer will always find an available server, so the service system must be modeled as having an infinite number of servers. The service times of all customers are assumed to be independent random variables, identically distributed as $B$. Let $G(y) = P(B \le y)$ be the distribution function of $B$, and let $X(t)$ be the random number of customers in the system, which gives the state of the system at time $t$, with $X(0) = 0$. The aim is to determine the state probabilities
$$p_i(t) = P(X(t) = i); \qquad i = 0, 1, \ldots; \; t \ge 0.$$
According to the total probability rule,
$$p_i(t) = \sum_{n=i}^{\infty} P(X(t) = i \mid N(t) = n)\, P(N(t) = n) = \sum_{n=i}^{\infty} P(X(t) = i \mid N(t) = n)\, \frac{(\lambda t)^n e^{-\lambda t}}{n!}.$$

A customer arriving at time $x$ is, with probability $1 - G(t - x)$, still in the system at time $t$, $t > x$, i.e. the service has not yet been finished by time $t$. Given $N(t) = n$, the arrival times $T_1, T_2, \ldots, T_n$ of the $n$ customers in the system are uniformly distributed over $[0, t]$; to calculate the state probabilities the order of the $T_i$ is not important, so they can be treated as independent, unordered random variables, each uniform over $[0, t]$. Thus the probability that any one of those $n$ customers is still in the system at time $t$ is
$$p(t) = \frac{1}{t}\int_0^t [1 - G(t - x)]\, dx = \frac{1}{t}\int_0^t [1 - G(x)]\, dx.$$
Since the service times are independent of each other, we have
$$P[X(t) = i \mid N(t) = n] = \binom{n}{i}\,[p(t)]^i\,[1 - p(t)]^{n-i}; \qquad i = 0, 1, \ldots, n.$$
The desired state probabilities can be obtained as follows:
$$p_i(t) = \sum_{n=i}^{\infty}\binom{n}{i}[p(t)]^i[1 - p(t)]^{n-i}\,\frac{(\lambda t)^n e^{-\lambda t}}{n!} = \frac{[\lambda t\, p(t)]^i e^{-\lambda t}}{i!}\sum_{k=0}^{\infty}\frac{[\lambda t(1 - p(t))]^k}{k!} = \frac{[\lambda t\, p(t)]^i}{i!}\, e^{-\lambda t\, p(t)}; \qquad i = 0, 1, \ldots.$$
Thus $\{p_i(t); i = 0, 1, \ldots\}$ is a Poisson distribution with mean $E[X(t)] = \lambda t\, p(t)$. The trend function of the stochastic process $\{X(t), t \ge 0\}$ is, consequently,
$$m(t) = \lambda \int_0^t [1 - G(x)]\, dx.$$
Let $E[Y] = 1/\lambda$ be the expected interarrival time between two successive customers. Since $E[B] = \int_0^{\infty}[1 - G(x)]\, dx$, we obtain
$$\lim_{t \to \infty} m(t) = \frac{E[B]}{E[Y]}.$$
Thus, for $t \to \infty$, the trend function and the state probabilities of the stochastic process $\{X(t), t \ge 0\}$ become constant. Letting $\rho = E[B]/E[Y]$, the stationary state probabilities of the system become
$$p_i = \lim_{t \to \infty} p_i(t) = \frac{\rho^i}{i!}\, e^{-\rho}; \qquad i = 0, 1, \ldots.$$
In particular, if $B$ is exponentially distributed with parameter $\mu$, then
$$m(t) = \lambda\int_0^t e^{-\mu x}\, dx = \frac{\lambda}{\mu}\left(1 - e^{-\mu t}\right),$$
and in this case $\rho = \lambda/\mu$.
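The key step above, conditioning on $N(t) = n$ and treating the arrival times as unordered uniforms, is easy to mirror in a simulation. The sketch below is my illustration, not part of the original; it uses exponential service times and arbitrary values of $\lambda$, $\mu$ and $t$, and compares the empirical mean of $X(t)$ with the trend function $m(t)$.

```python
import numpy as np

rng = np.random.default_rng(3)
lam, mu, t, trials = 2.0, 1.0, 4.0, 20_000

counts = np.zeros(trials, dtype=int)
n = rng.poisson(lam * t, size=trials)            # arrivals in [0, t] per run
for i in range(trials):
    arrivals = rng.random(n[i]) * t              # unordered uniform arrival times
    services = rng.exponential(1 / mu, size=n[i])
    counts[i] = np.sum(arrivals + services > t)  # customers still in system at t

m_theory = lam / mu * (1 - np.exp(-mu * t))      # trend function m(t)
print("mean of X(t):", counts.mean(), " theory:", m_theory)
```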

Example 7
Let $X(t)$ be a Poisson process with parameter $\lambda$. Find
(a) $E[X^2(t)]$;
(b) $E[\{X(t) - X(s)\}^2]$, for $t > s$.

Solution
(a) Note that $X(t)$ is a Poisson random variable, so
$$E[X(t)] = \mathrm{VAR}[X(t)] = \lambda t, \qquad t \ge 0,$$
and hence
$$E[X^2(t)] = \mathrm{VAR}[X(t)] + E[X(t)]^2 = \lambda t + \lambda^2 t^2.$$
(b) Since $X(t)$ has stationary increments, $X(t) - X(s)$ and $X(t - s)$ have the same distribution, that is,
$$E[X(t) - X(s)] = E[X(t - s)] = \lambda(t - s), \qquad \mathrm{VAR}[X(t) - X(s)] = \mathrm{VAR}[X(t - s)] = \lambda(t - s).$$
Hence,
$$E[\{X(t) - X(s)\}^2] = \mathrm{VAR}[X(t) - X(s)] + E[X(t) - X(s)]^2 = \lambda(t - s) + \lambda^2(t - s)^2, \qquad t > s.$$
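Both moments can be confirmed with a quick simulation of independent Poisson increments (a sketch of mine; the parameter values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
lam, s, t, trials = 1.5, 2.0, 5.0, 400_000

N_s = rng.poisson(lam * s, size=trials)
incr = rng.poisson(lam * (t - s), size=trials)  # X(t) - X(s), independent of N_s
N_t = N_s + incr

print("E[X^2(t)]:", np.mean(N_t.astype(float) ** 2),
      " theory:", lam * t + (lam * t) ** 2)
print("E[(X(t)-X(s))^2]:", np.mean(incr.astype(float) ** 2),
      " theory:", lam * (t - s) + (lam * (t - s)) ** 2)
```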

Example 8
Patients arrive at the doctor's office according to a Poisson process with rate $\lambda = \frac{1}{10}$ per minute. The doctor will not see a patient until at least three patients are in the waiting room.
(a) Find the expected waiting time until the first patient is admitted to see the doctor.
(b) What is the probability that nobody is admitted to see the doctor in the first hour?

Solution
(a) Let $Z_n$ be the inter-event time between the arrivals of the $(n-1)$th and $n$th patients. The $Z_n$'s are iid exponential random variables with mean $\frac{1}{\lambda}$, i.e. $E[Z_n] = \frac{1}{\lambda}$. Let $T_n$ be the arrival time of the $n$th patient, $T_n = \sum_{i=1}^{n} Z_i$, so that
$$E[T_n] = E\left[\sum_{i=1}^{n} Z_i\right] = \sum_{i=1}^{n} E[Z_i] = \frac{n}{\lambda}.$$
The first patient is admitted when the third patient arrives, so the expected waiting time until the first patient is admitted to see the doctor is
$$E[T_3] = \frac{3}{1/10} = 30 \text{ minutes}.$$
(b) Let $X(t)$ be the Poisson process with mean $\lambda t$, so that
$$P[X(t) = k] = \frac{(\lambda t)^k e^{-\lambda t}}{k!}, \qquad k = 0, 1, \ldots.$$
We have
$$P[\text{nobody is admitted to see the doctor in the first hour}] = P[\text{at most 2 patients arrive in the first 60 minutes}]$$
$$= P[X(60) \le 2] = P[X(60) = 0] + P[X(60) = 1] + P[X(60) = 2]$$
$$= e^{-60/10} + \frac{60}{10}\,e^{-60/10} + \frac{(60/10)^2}{2!}\, e^{-60/10} = e^{-6}(1 + 6 + 18) = 25e^{-6} \approx 0.062.$$
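A short simulation confirms both answers (my sketch; seed and sample size arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
lam, trials = 1 / 10, 300_000

# (a) T_3 is the sum of three Exponential(mean 1/lam) interarrival times.
T3 = rng.exponential(1 / lam, size=(trials, 3)).sum(axis=1)
print("E[T3] estimate:", T3.mean(), " theory: 30 minutes")

# (b) Nobody admitted in the first hour <=> fewer than 3 arrivals by t = 60.
arrivals = rng.poisson(lam * 60, size=trials)
print("P[X(60) <= 2] estimate:", np.mean(arrivals <= 2),
      " theory:", 25 * np.exp(-6.0))
```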

Example 9
A message requires $N$ time units to be transmitted, where $N$ is a geometric random variable with pmf $P_N(j) = (1 - a)a^{j-1}$, $j = 1, 2, \ldots$. A single new message arrives during a time unit with probability $p$, and no message arrives with probability $1 - p$. Let $K$ be the number of new messages that arrive during the transmission of a single message.
(a) Find the pmf of $K$.
(b) Use conditional expectation to find $E[K]$ and $E[K^2]$.

Solution
(a) Using the law of total probability, we have
$$P[K = k] = \sum_{n=1}^{\infty} P[K = k \mid N = n]\, P_N(n).$$
Note that $P[K = k \mid N = n] = 0$ for $k > n$, since the number of messages arriving during the transmission time must be less than or equal to $n$. For $n \ge k$,
$$P[K = k \mid N = n] = \binom{n}{k} p^k (1 - p)^{n-k},$$
since this is a binomial experiment with $n$ trials and probability of success $p$ in each trial. Using the formula
$$\sum_{n=k}^{\infty}\binom{n}{k}\beta^{n-k} = (1 - \beta)^{-(k+1)}, \qquad 0 < \beta < 1,$$
we then obtain, for $k \ge 1$,
$$P[K = k] = \sum_{n=k}^{\infty}\binom{n}{k}p^k(1-p)^{n-k}(1-a)a^{n-1} = \frac{(1-a)\,a^{k-1} p^k}{[1 - a(1-p)]^{k+1}} = \frac{1-a}{a[1 - a(1-p)]}\left[\frac{ap}{1 - a(1-p)}\right]^k.$$
(b) We start with the conditional expectation formulas:
$$E[K] = \sum_{n=1}^{\infty} E[K \mid N = n]\, P_N(n) \qquad \text{and} \qquad E[K^2] = \sum_{n=1}^{\infty} E[K^2 \mid N = n]\, P_N(n).$$
Given $N = n$, $K$ is a binomial random variable with parameters $n$ and $p$, so
$$E[K \mid N = n] = np, \qquad E[K^2 \mid N = n] = \mathrm{VAR}[K \mid N = n] + E[K \mid N = n]^2 = np(1-p) + n^2 p^2.$$
We obtain
$$E[K] = \sum_{n=1}^{\infty} np(1-a)a^{n-1} \qquad \text{and} \qquad E[K^2] = \sum_{n=1}^{\infty}\left[np(1-p) + n^2 p^2\right](1-a)a^{n-1}.$$

Remark
To evaluate the sums, use the identities
$$x + x^2 + x^3 + \cdots = \frac{x}{1-x}, \qquad 1 + 2x + 3x^2 + \cdots = \frac{d}{dx}\left(\frac{x}{1-x}\right) = \frac{1}{(1-x)^2},$$
and, consequently,
$$1 + 2^2 x + 3^2 x^2 + \cdots = \frac{d}{dx}\left[\frac{x}{(1-x)^2}\right].$$
In particular, $E[K] = p\sum_{n=1}^{\infty} n(1-a)a^{n-1} = p\,E[N] = \frac{p}{1-a}$.
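The pmf and the mean can be checked by sampling $N$ first and then $K \mid N$ (a sketch of mine; $a$, $p$, the test value $k = 2$ and the sample size are arbitrary). Note that numpy's geometric sampler uses the support $\{1, 2, \ldots\}$, so success probability $1 - a$ reproduces $P_N(j) = (1-a)a^{j-1}$.

```python
import numpy as np

rng = np.random.default_rng(6)
a, p, trials = 0.6, 0.3, 400_000

N = rng.geometric(1 - a, size=trials)  # P(N = j) = (1-a) a^{j-1}, j = 1, 2, ...
K = rng.binomial(N, p)                 # K | N = n ~ Binomial(n, p)

k = 2
c = 1 - a * (1 - p)
pk_theory = (1 - a) / (a * c) * (a * p / c) ** k
print(f"P[K={k}] estimate:", np.mean(K == k), " theory:", pk_theory)
print("E[K] estimate:", K.mean(), " theory p/(1-a):", p / (1 - a))
```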

Example 10
Two gamblers play the following game. A fair coin is flipped; if the outcome is heads, player A pays player B \$1, and if the outcome is tails, player B pays player A \$1. The game is continued until one of the players goes broke. Suppose that initially player A has \$1 and player B has \$2, so a total of \$3 is up for grabs. Let $X_n$ denote the number of dollars held by player A after $n$ trials.
(a) Show that $X_n$ is a Markov chain.
(b) Sketch the state transition diagram for $X_n$ and give the one-step transition probability matrix $P$.
(c) Use the state transition diagram to help you show that for $n$ even (i.e., $n = 2k$),
$$p_{ii}(n) = \left(\frac{1}{2}\right)^n \text{ for } i = 1, 2; \qquad p_{10}(n) = \frac{2}{3}\left[1 - \left(\frac{1}{4}\right)^k\right] = p_{23}(n).$$
(d) Find the $n$-step transition probability matrix for $n$ even using part (c).
(e) Find the limit of $P^n$ as $n \to \infty$.
(f) Find the probability that player A eventually wins.

Solution
(a) Note that the state space is $S = \{0, 1, 2, 3\}$, and we need to verify that
$$P[X_{n+1} = j \mid X_n = i, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0] = P[X_{n+1} = j \mid X_n = i] \qquad \text{for all } i, j \in S.$$
Consider the following three cases: $i = 1, 2$; $i = 0$; and $i = 3$. When $i = 1, 2$,
$$P[X_{n+1} = j \mid X_n = i, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0] = \begin{cases} P[\text{the outcome is } H] = \frac{1}{2}, & j = i - 1 \\ P[\text{the outcome is } T] = \frac{1}{2}, & j = i + 1 \\ 0, & j \ne i - 1, i + 1 \end{cases} = P[X_{n+1} = j \mid X_n = i].$$

When $i = 0$,
$$P[X_{n+1} = j \mid X_n = 0, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0] = \begin{cases} P[X_{n+1} = 0 \mid \text{A has gone broke}] = 1, & j = 0 \\ 0, & j \ne 0 \end{cases} = P[X_{n+1} = j \mid X_n = 0].$$
When $i = 3$,
$$P[X_{n+1} = j \mid X_n = 3, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0] = \begin{cases} P[X_{n+1} = 3 \mid \text{A has won}] = 1, & j = 3 \\ 0, & j \ne 3 \end{cases} = P[X_{n+1} = j \mid X_n = 3].$$
In summary, $X_n$ is a Markov chain.
(b) The transition probability matrix is
$$P = \begin{pmatrix} 1 & 0 & 0 & 0 \\ \frac{1}{2} & 0 & \frac{1}{2} & 0 \\ 0 & \frac{1}{2} & 0 & \frac{1}{2} \\ 0 & 0 & 0 & 1 \end{pmatrix}.$$
(c) Note that for $n = 2k$, which is even, the event $\{X_{2k} = 1 \mid X_0 = 1\}$ consists of the transition $1 \to 2$ followed by the transition $2 \to 1$, repeated $k = \frac{n}{2}$ times (states 0 and 3 are absorbing, so any visit to them would prevent a return to state 1). Hence
$$p_{11}(n) = \underbrace{(p_{12}\,p_{21})(p_{12}\,p_{21})\cdots(p_{12}\,p_{21})}_{k = n/2 \text{ pairs}} = \left(\frac{1}{2}\cdot\frac{1}{2}\right)^{n/2} = \left(\frac{1}{2}\right)^n.$$

Similarly,
$$p_{22}(n) = \underbrace{(p_{21}\,p_{12})\cdots(p_{21}\,p_{12})}_{k = n/2 \text{ pairs}} = \left(\frac{1}{2}\right)^n.$$
Once state 0 is entered it can only stay at state 0, and starting from $X_0 = 1$ it can only be entered from state 1 at some odd step $2(i-1) + 1$, so
$$\{X_{2k} = 0 \mid X_0 = 1\} = \bigcup_{i=1}^{k}\left\{X_{2(i-1)} = 1 \text{ followed by the one-step transition } X_{2(i-1)+1} = 0\right\},$$
and therefore
$$p_{10}(n) = P[X_{2k} = 0 \mid X_0 = 1] = \sum_{i=1}^{k} p_{11}(2(i-1))\, p_{10} = \sum_{i=1}^{k}\left(\frac{1}{4}\right)^{i-1}\cdot\frac{1}{2} = \frac{1}{2}\cdot\frac{1 - (1/4)^k}{1 - 1/4} = \frac{2}{3}\left[1 - \left(\frac{1}{4}\right)^k\right].$$
Similarly,
$$p_{23}(n) = P[X_{2k} = 3 \mid X_0 = 2] = \sum_{i=1}^{k} p_{22}(2(i-1))\, p_{23} = \sum_{i=1}^{k}\left(\frac{1}{4}\right)^{i-1}\cdot\frac{1}{2} = p_{10}(n).$$

(d) We compute the remaining $n$-step transition probabilities $p_{ij}(n)$ row by row, for $n$ even.
1st row, $i = 0$: since $p_{00}(n) = 1$ (state 0 always stays at state 0) and $\sum_{j=0}^{3} p_{0j}(n) = 1$, we get $p_{01}(n) = p_{02}(n) = p_{03}(n) = 0$.
2nd row, $i = 1$: $p_{12}(n) = 0$ as $n$ is even, and
$$p_{13}(n) = 1 - \sum_{j \ne 3} p_{1j}(n) = 1 - \left\{\frac{2}{3}\left[1 - \left(\frac{1}{4}\right)^k\right] + \left(\frac{1}{4}\right)^k + 0\right\} = \frac{1}{3}\left[1 - \left(\frac{1}{4}\right)^k\right].$$
3rd row, $i = 2$: $p_{21}(n) = 0$ as $n$ is even, and
$$p_{20}(n) = 1 - \sum_{j \ge 1} p_{2j}(n) = 1 - \left\{0 + \left(\frac{1}{4}\right)^k + \frac{2}{3}\left[1 - \left(\frac{1}{4}\right)^k\right]\right\} = \frac{1}{3}\left[1 - \left(\frac{1}{4}\right)^k\right].$$
4th row, $i = 3$: since $p_{33}(n) = 1$ (state 3 always stays at state 3) and $\sum_{j=0}^{3} p_{3j}(n) = 1$, we get $p_{30}(n) = p_{31}(n) = p_{32}(n) = 0$.

In summary, for $n = 2k$,
$$P(n) = \begin{pmatrix} 1 & 0 & 0 & 0 \\ \frac{2}{3}\left[1 - \left(\frac{1}{4}\right)^k\right] & \left(\frac{1}{4}\right)^k & 0 & \frac{1}{3}\left[1 - \left(\frac{1}{4}\right)^k\right] \\ \frac{1}{3}\left[1 - \left(\frac{1}{4}\right)^k\right] & 0 & \left(\frac{1}{4}\right)^k & \frac{2}{3}\left[1 - \left(\frac{1}{4}\right)^k\right] \\ 0 & 0 & 0 & 1 \end{pmatrix}.$$
(e) For $n = 2k + 1$, that is, $n$ odd, $P(n) = P(2k + 1) = P(2k)\,P$, which differs from $P(2k)$ only in the terms carrying the factor $(1/4)^k$. Since $(1/4)^k \to 0$ as $k \to \infty$,
$$\lim_{k \to \infty} P(2k) = \lim_{k \to \infty} P(2k + 1),$$
so
$$\lim_{n \to \infty} P(n) = \begin{pmatrix} 1 & 0 & 0 & 0 \\ \frac{2}{3} & 0 & 0 & \frac{1}{3} \\ \frac{1}{3} & 0 & 0 & \frac{2}{3} \\ 0 & 0 & 0 & 1 \end{pmatrix}.$$
(f) Note that
$$\lim_{k \to \infty} p_{13}(2k) = \lim_{k \to \infty} p_{13}(2k + 1) = \lim_{k \to \infty} \frac{1}{3}\left[1 - \left(\frac{1}{4}\right)^k\right] = \frac{1}{3},$$
so $\lim_{n \to \infty} p_{13}(n) = \frac{1}{3}$. Hence
$$P[\text{player A eventually wins}] = \lim_{n \to \infty} P[X_n = 3 \mid X_0 = 1] = \lim_{n \to \infty} p_{13}(n) = \frac{1}{3}.$$
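Parts (d) through (f) are easy to confirm numerically by raising $P$ to a power. The sketch below is mine, with $n = 20$ an arbitrary even choice; it compares $P^n$ with the closed form from part (d) and reads off the absorption probability.

```python
import numpy as np

P = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.0, 0.0, 0.0, 1.0]])

n = 20                        # a moderately large even n
Pn = np.linalg.matrix_power(P, n)
k = n // 2
q = 0.25 ** k                 # residual mass (1/4)^k on the transient states
closed = np.array([[1, 0, 0, 0],
                   [2/3 * (1 - q), q, 0, 1/3 * (1 - q)],
                   [1/3 * (1 - q), 0, q, 2/3 * (1 - q)],
                   [0, 0, 0, 1]])

print("max |P^n - closed form|:", np.max(np.abs(Pn - closed)))  # ~ 0
print("P[A eventually wins] ~", Pn[1, 3])                       # -> 1/3
```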
