Introduction to Probability Theory for Graduate Economics. Fall 2008
Yiğit Sağlam*
December 1, 2008

* PhD Candidate in Economics, Department of Economics, University of Iowa, W210 Pappajohn Business Building, Iowa City, IA 52242, yigit-saglam@uiowa.edu, Phone: 1(319) , Fax: 1(319)

CHAPTER 5 - STOCHASTIC PROCESSES

1 Stochastic Processes

A stochastic process, or sometimes a random process, is the counterpart in probability theory to a deterministic process (or a deterministic system). Instead of dealing with only one possible way the process might evolve over time (as is the case, for example, for solutions of an ordinary differential equation), in a stochastic or random process there is some indeterminacy in the future evolution, described by probability distributions. This means that even if the initial condition (or starting point) is known, there are many paths the process might follow, some more probable than others.

In the simplest possible case (discrete time), a stochastic process amounts to a sequence of random variables known as a time series. Another basic type of stochastic process is a random field, whose domain is a region of space; in other words, a random function whose arguments are drawn from a range of continuously changing values.

One approach to stochastic processes treats them as functions of one or several deterministic arguments (inputs, in most cases regarded as time) whose values (outputs) are random variables: non-deterministic quantities which have certain probability distributions. The random variables corresponding to various times (or points, in the case of random fields) may be completely different; the main requirement is that these different random quantities all take values in the same space. Although the random values of a stochastic
process at different times may be independent random variables, in most commonly considered situations they exhibit complicated statistical correlations. Familiar examples of processes modeled as stochastic time series include stock market and exchange rate fluctuations, and random movements such as Brownian motion or random walks.

2 The Poisson Process

2.1 Definition and Properties of a Poisson Process

Definition A stochastic process {N(t), t ≥ 0} is called a counting process if N(t) represents the total number of events that have occurred up to time t. A counting process N(t) must satisfy the following properties:

- N(t) ≥ 0 and N(t) ∈ N, for all t ≥ 0,
- If s < t, then N(s) ≤ N(t),
- For s < t, N(t) − N(s) equals the number of events that have occurred in the interval (s, t).

Definition A counting process possesses independent increments if the numbers of events that occur in disjoint time intervals are independent. For example, the number of events that have occurred by time 10, i.e., N(10), must be independent of the number of events that occur between times 10 and 15, i.e., N(15) − N(10).

Definition A counting process possesses stationary increments if the distribution of the number of events that occur in any time interval depends only on the length of the interval. In other words, the number of events in the interval (t_1 + s, t_2 + s), i.e., N(t_2 + s) − N(t_1 + s), has the same distribution as the number of events in the interval (t_1, t_2), i.e., N(t_2) − N(t_1), for all t_1 < t_2 and s ≥ 0.
Definition The counting process {N(t), t ≥ 0} is called a Poisson process having rate λ, λ > 0, if

1. N(0) = 0,
2. The process has independent increments,
3. The number of events in any interval of length t is Poisson distributed with mean λt. That is, for all s, t ≥ 0:

   Pr(N(t + s) − N(s) = n) = exp(−λt) (λt)^n / n!,  n = 0, 1, 2, 3, ...

It is noteworthy that it follows from condition (3) that a Poisson process has stationary increments, and also that E[N(t)] = λt, which explains why λ is called the rate of the process.

Definition The function f(·) is said to be o(h) if

   lim_{h→0} f(h)/h = 0.

Definition The counting process {N(t), t ≥ 0} is called a Poisson process having rate λ, λ > 0, if

1. N(0) = 0,
2. The process has stationary and independent increments,
3. Pr(N(h) = 1) = λh + o(h),
4. Pr(N(h) ≥ 2) = o(h).

Theorem The two definitions of the Poisson process given above are equivalent.

Proof See Ross (1993).
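The notes' code later on is in MATLAB; as a language-agnostic illustration, the following Python sketch simulates a rate-λ Poisson process by summing exponential interarrival times (a fact established in Section 2.2 below) and checks that the mean count on [0, t] is close to λt. The function name `poisson_count` and the parameter values are our own, not from the notes.

```python
import random

def poisson_count(lam, t, rng):
    """Count events of a rate-lam Poisson process on [0, t] by summing
    exponential interarrival times until the clock passes t."""
    n, clock = 0, 0.0
    while True:
        clock += rng.expovariate(lam)  # next interarrival time
        if clock > t:
            return n
        n += 1

rng = random.Random(0)
lam, t, trials = 2.0, 3.0, 20000
counts = [poisson_count(lam, t, rng) for _ in range(trials)]
mean_count = sum(counts) / trials
print(round(mean_count, 1))  # should be close to lam * t = 6.0
```

With 20,000 replications the sample mean settles within a few hundredths of the theoretical value E[N(t)] = λt.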
2.2 Interarrival and Waiting Time Distributions

Definition Consider a Poisson process, and let T_1 denote the time of the first event. Further, for n > 1, let T_n denote the elapsed time between the (n−1)st and nth events. The sequence {T_n, n = 1, 2, 3, ...} is called the sequence of interarrival times. For example, if T_1 = 5 and T_2 = 10, then the first event of the Poisson process occurs at time 5, and the second at time 15.

Proposition T_n, n = 1, 2, 3, ..., are independent and identically distributed exponential random variables with mean 1/λ.

Remarks The assumption of stationary and independent increments is basically equivalent to asserting that, at any point in time, the process probabilistically restarts itself. In other words, the process from any point on is independent of all that has previously occurred (by independent increments), and also has the same distribution as the original process (by stationary increments). So the process has no memory, and exponential interarrival times are to be expected.

Definition Another quantity of interest is S_n, the arrival time of the nth event, also called the waiting time until the nth event. Mathematically,

   S_n = Σ_{i=1}^{n} T_i,  n ≥ 1.

Finally, it is noteworthy that S_n has a gamma distribution with parameters n and λ.

2.3 Properties of the Exponential Distribution

Definition A random variable X is memoryless if

   Pr(X > t) = Pr(X > t + s | X > s),  for all t, s ≥ 0.

In words, the probability that the first occurrence happens after time t equals the probability that it happens after time t_0 + t, given that it has not yet occurred by time t_0. Whenever it is appropriate, the memoryless property of the exponential distribution is useful in economics, as one does not have to keep track of the whole history to compute the probability distribution of a variable in the current state.
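The claim that S_n is Gamma(n, λ) can be illustrated numerically: in the Python sketch below (our own toy parameters, not from the notes), the waiting time until the nth event is simulated as a sum of n exponential interarrival draws, and its sample mean is compared with the gamma mean n/λ.

```python
import random

rng = random.Random(1)
lam, n, trials = 1.5, 4, 20000

# S_n = T_1 + ... + T_n with T_i i.i.d. Exponential(lam);
# S_n is Gamma(n, lam) with mean n/lam.
waits = [sum(rng.expovariate(lam) for _ in range(n)) for _ in range(trials)]
mean_Sn = sum(waits) / trials
print(round(mean_Sn, 2))  # should be close to n/lam = 2.67
```

The simulated mean waiting time agrees with n/λ up to sampling error of order 1/sqrt(trials).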
Remark The exponential distribution is not only memoryless, but it is also the unique distribution possessing this property. To see this, suppose that X is memoryless, and let F̄(x) = Pr(X > x). Then it follows that

   F̄(t + s) = F̄(t) F̄(s).

That is, F̄(x) satisfies the functional equation g(t + s) = g(t) g(s). However, the only right-continuous solution of this functional equation is g(x) = exp(−λx), and since a survival function is always right continuous, it implies that

   F̄(x) = exp(−λx)  ⟹  F(x) = Pr(X ≤ x) = 1 − exp(−λx),

which shows that X is exponentially distributed.

Definition The memoryless property is further illustrated by the failure (hazard) rate function of the exponential distribution:

   r(t) = f(t) / (1 − F(t)).

One can interpret r(t) dt as the probability that X will not survive for an additional time dt, given that X has survived until t. Since the memoryless property is associated with the exponential distribution, the hazard rate becomes constant:

   r(t) = f(t) / (1 − F(t)) = λ exp(−λt) / exp(−λt) = λ.

Thus, the hazard rate of the exponential distribution equals the reciprocal of its mean, 1/λ.
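Both properties above can be checked directly from the survival function, with no simulation needed. The following Python sketch (parameter values are our own) verifies the memoryless identity and the constant hazard rate for an Exponential(λ) random variable.

```python
import math

lam, t, s = 0.7, 1.3, 2.1
surv = lambda x: math.exp(-lam * x)   # survival function: P(X > x)

# memorylessness: P(X > t + s | X > s) = P(X > t)
lhs = surv(t + s) / surv(s)
rhs = surv(t)
print(abs(lhs - rhs) < 1e-12)  # True

# constant hazard rate: r(t) = f(t) / (1 - F(t)) = lam
f = lambda x: lam * math.exp(-lam * x)  # density of Exponential(lam)
r = f(t) / surv(t)
print(abs(r - lam) < 1e-12)  # True
```

Both checks hold up to floating-point rounding, for any choice of λ > 0 and t, s ≥ 0.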
3 Discrete Time Discrete Space Markov Chains

3.1 Definitions

First we consider a discrete time Markov chain. In this setup, a stochastic process {X_n, n = 0, 1, 2, ...} takes on a finite or countable number of possible values. If X_n = i, then the process is said to be in state i at time n. We suppose that whenever the process is in state i, there is a fixed probability P_ij that it will next be in state j; i.e.,

   Pr(X_{n+1} = j | X_n = i, X_{n−1} = i_{n−1}, ..., X_1 = i_1, X_0 = i_0) = P_ij

for all states i_0, i_1, ..., i_{n−1}, i, j and all n ≥ 0. Such a stochastic process is called a Markov chain. One can interpret this equation in the following way: the conditional distribution of any future state X_{n+1}, given the past states X_0, X_1, ..., X_{n−1} and the present state X_n, is independent of the past states and depends only on the present state.

Since the P_ij are probabilities, and since the process must make a transition into some state, we have:

- Every transition probability is nonnegative: P_ij ≥ 0, for all i, j ≥ 0,
- Each row of probabilities must sum to 1: Σ_{j=0}^{∞} P_ij = 1, for i = 0, 1, 2, ....
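The two conditions above say that each row of the transition matrix is a probability distribution over next states, which is exactly what makes simulation by row-wise inverse-transform sampling possible. The Python sketch below (the two-state matrix and function name are our own toy example, not from the notes) validates a transition matrix and simulates a path from it.

```python
import random

P = [[0.9, 0.1],
     [0.4, 0.6]]  # hypothetical two-state transition matrix

# each row must be a probability distribution
assert all(p >= 0 for row in P for p in row)
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)

def next_state(i, P, rng):
    """Sample X_{n+1} given X_n = i by inverting the row CDF."""
    u, acc = rng.random(), 0.0
    for j, p in enumerate(P[i]):
        acc += p
        if u <= acc:
            return j
    return len(P[i]) - 1  # guard against rounding at the top of the CDF

rng = random.Random(2)
path = [0]
for _ in range(10000):
    path.append(next_state(path[-1], P, rng))
frac0 = path.count(0) / len(path)
print(round(frac0, 2))  # long-run fraction of time in state 0, approx 0.8
```

For this matrix the stationary probability of state 0 is 0.8, and the empirical occupation frequency approaches it, previewing the limiting results of Section 3.2.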
Definition Chapman-Kolmogorov Equations We now define the n-step transition probabilities P^n_ij to be the probability that a process in state i will be in state j after n additional transitions; i.e.,

   P^n_ij = Pr(X_{n+m} = j | X_m = i),  n ≥ 0, i, j ≥ 0.

It is noteworthy that the matrix P of one-step probabilities is called the transition matrix:

       | P_00  P_01  P_02  ... |
   P = | P_10  P_11  P_12  ... |
       |  ...   ...   ...      |
       | P_i0  P_i1  P_i2  ... |
       |  ...   ...   ...      |

The Chapman-Kolmogorov equations provide a method for computing these n-step transition probabilities:

   P^{n+m}_ij = Σ_{k=0}^{∞} P^n_ik P^m_kj,  for all n, m ≥ 0 and all i, j,

and are most easily understood by noting that P^n_ik P^m_kj represents the probability that, starting in i, the process will go to state j in n + m transitions through a path which takes it into state k at the nth transition. Hence:

   P^{n+m}_ij = Pr(X_{n+m} = j | X_0 = i)
             = Σ_{k=0}^{∞} Pr(X_{n+m} = j, X_n = k | X_0 = i)
             = Σ_{k=0}^{∞} Pr(X_{n+m} = j | X_n = k, X_0 = i) Pr(X_n = k | X_0 = i)
             = Σ_{k=0}^{∞} P^n_ik P^m_kj.

In matrix notation, the above equation implies P^(n) = P^n, where P^(n) denotes the matrix of n-step transition probabilities P^n_ij.
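In the finite-state case the Chapman-Kolmogorov identity is just the associativity of matrix multiplication, P^{n+m} = P^n P^m, which the following Python sketch checks on a small hypothetical matrix (helpers `matmul` and `matpow` are our own).

```python
def matmul(A, B):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matpow(P, n):
    """n-th matrix power of P (n >= 0), starting from the identity."""
    R = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        R = matmul(R, P)
    return R

P = [[0.5, 0.5],
     [0.2, 0.8]]
lhs = matpow(P, 5)                        # P^(n+m) with n = 2, m = 3
rhs = matmul(matpow(P, 2), matpow(P, 3))  # P^n * P^m
ok = all(abs(lhs[i][j] - rhs[i][j]) < 1e-12
         for i in range(2) for j in range(2))
print(ok)  # True
```

The same check works for any n, m and any stochastic matrix, up to floating-point rounding.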
3.2 Limiting Probabilities

Definition A Markov chain is called irreducible if, for every pair of states i and j, there exists some n ≥ 0 such that P^n_ij > 0.

Definition If state i is recurrent, then starting in state i, the process will reenter state i again and again, in fact infinitely often. So, state i is recurrent if

   Σ_{n=1}^{∞} P^n_ii = ∞.

Definition A recurrent state i is said to be positive recurrent if, starting in state i, the expected time until the process returns to state i is finite. In a finite-state Markov chain, all recurrent states are positive recurrent.

Definition If state i is transient, then every time the process enters state i, there is a positive probability that it will never again enter that state. Thus, if state i is transient, then starting in state i, the number of time periods that the process spends in state i has a geometric distribution with finite mean. So, state i is transient if

   Σ_{n=1}^{∞} P^n_ii < ∞.

Definition State i has period d if P^n_ii = 0 whenever n is not divisible by d, and d is the largest integer with this property. For example, starting in state i, it may be possible for the process to enter state i only at times 2, 4, 6, ..., in which case state i has period 2. A state with period 1 is called aperiodic.

Definition Positive recurrent, aperiodic states are called ergodic.

Theorem For an irreducible ergodic Markov chain, lim_{n→∞} P^n_ij exists and is independent of i. Furthermore, letting

   π_j = lim_{n→∞} P^n_ij,  j ≥ 0,

π_j is the unique nonnegative solution of

   π_j = Σ_{i=0}^{∞} π_i P_ij,  j ≥ 0,
   Σ_{j=0}^{∞} π_j = 1.
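The theorem can be seen numerically: for an ergodic chain, raising P to a high power makes every row converge to the same vector π, regardless of the starting state. The Python sketch below uses a hypothetical two-state matrix (our own example) whose exact stationary distribution is (2/7, 5/7).

```python
def matmul(A, B):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# an irreducible, aperiodic (hence ergodic) two-state chain
P = [[0.5, 0.5],
     [0.2, 0.8]]

Pn = P
for _ in range(59):  # compute P^60
    Pn = matmul(Pn, P)

# every row of P^n converges to the same pi, independent of the start
print([round(x, 4) for x in Pn[0]])  # -> [0.2857, 0.7143]
print([round(x, 4) for x in Pn[1]])  # -> [0.2857, 0.7143]

# the limit pi solves pi = pi * P
pi = Pn[0]
stationary = all(abs(sum(pi[i] * P[i][j] for i in range(2)) - pi[j]) < 1e-9
                 for j in range(2))
print(stationary)  # True
```

Convergence here is geometric at the rate of the second eigenvalue (0.3), so 60 iterations reach machine precision.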
Remarks It can be shown that π_j, the limiting probability that the process will be in state j at time n, also equals the long-run proportion of time that the process spends in state j. These long-run proportions are often called stationary probabilities. The vector Π = (π_0, π_1, π_2, ...) is also referred to as the invariant distribution. This is because if the initial state is chosen according to the probabilities π_j, j ≥ 0, then the probability of being in state j at any time n is also equal to π_j:

   Pr(X_0 = j) = π_j  ⟹  Pr(X_n = j) = π_j,  for all n, j ≥ 0.

Moreover, the proportion of time spent in state j equals the inverse of the mean time between visits to j. This result is a special case of the strong law for renewal processes.

Definition Time Reversibility The idea of a reversible Markov chain comes from the ability to invert a conditional probability using Bayes' rule:

   Pr(X_n = i | X_{n+1} = j) = Pr(X_n = i, X_{n+1} = j) / Pr(X_{n+1} = j)
                             = Pr(X_n = i) Pr(X_{n+1} = j | X_n = i) / Pr(X_{n+1} = j).

It now appears that time has been reversed. Thus, a Markov chain is said to be reversible if there is a Π such that

   π_i P_ij = π_j P_ji,  i, j ≥ 0.

This condition is also known as the detailed balance condition. Summing over i gives

   Σ_{i=0}^{∞} π_i P_ij = π_j,  j ≥ 0.

So, for reversible Markov chains, Π is always a stationary distribution.
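The last implication (detailed balance implies stationarity) is easy to check numerically. The Python sketch below uses a three-state birth-death chain as a hypothetical example (such chains are always reversible; the matrix and candidate π are our own, not from the notes).

```python
# a three-state birth-death chain (toy example)
P = [[0.50, 0.50, 0.00],
     [0.25, 0.50, 0.25],
     [0.00, 0.50, 0.50]]
pi = [0.25, 0.50, 0.25]  # candidate invariant distribution

# detailed balance: pi_i P_ij = pi_j P_ji for all i, j
balanced = all(abs(pi[i] * P[i][j] - pi[j] * P[j][i]) < 1e-12
               for i in range(3) for j in range(3))
print(balanced)  # True: the chain is time reversible

# summing detailed balance over i gives stationarity: sum_i pi_i P_ij = pi_j
stationary = all(abs(sum(pi[i] * P[i][j] for i in range(3)) - pi[j]) < 1e-12
                 for j in range(3))
print(stationary)  # True
```

Detailed balance is a stronger, pairwise condition; the stationarity equations only constrain the column sums, which is why every reversible Π is stationary but not conversely.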
3.3 Computing an Invariant Distribution and Simulating Markov Chains

Below is a MATLAB code to compute the invariant distribution given a transition matrix for a discrete time Markov chain:

Input 1: MATLAB Code to Find Invariant Distribution:

function [INVD] = invdist(TM)
% invdist.m - computes the invariant distribution of the transition matrix TM.
%% CODE
N = size(TM,2);
B = TM - eye(N);            % pi*(TM - I) = 0
B(1:N,N) = ones(N,1);       % replace the last column to impose sum(pi) = 1
o = zeros(1,N); o(N) = 1;
INVD = o*inv(B);            % Invariant Distribution

Below is a MATLAB code to simulate a stochastic process which follows a discrete time Markov chain:

Input 2: MATLAB Code to Simulate Markov Chains:

function [theta,ug] = simulationmc(G,TM,T)
% simulationmc.m - simulates a discrete time Markov chain.
%% Arguments:
% G  = grid points for the Markov chain;
% TM = transition matrix for the Markov chain;
% T  = number of periods to simulate;
%% CODE:
CDF = cumsum(TM,2);         % cumulative distribution function of each row.
i = 1;                      % index of the initial state.
ind = zeros(1,T);
ug = zeros(1,T);
for t = 1:T                 % simulation for T periods.
    ug(t) = rand(1);        % pseudo-random draws from the uniform dist.
    j = find(CDF(i,:) >= ug(t), 1, 'first');
    ind(t) = j;
    i = j;
end
theta = G(ind);             % simulated realizations of the Markov chain.
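For readers not working in MATLAB, the same trick used by invdist.m (replace one column of P − I by ones so that the normalization Σπ = 1 is imposed, then solve the linear system) can be sketched in Python with a small Gaussian-elimination solver. All function names here are our own.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def invariant_distribution(P):
    """Solve pi (P - I) = 0 with sum(pi) = 1, as in invdist.m: replace the
    last column of (P - I) by ones and solve pi B = e_N, i.e. B' pi' = e_N."""
    N = len(P)
    B = [[P[i][j] - (1.0 if i == j else 0.0) for j in range(N)]
         for i in range(N)]
    for i in range(N):
        B[i][N - 1] = 1.0
    Bt = [[B[i][j] for i in range(N)] for j in range(N)]  # transpose
    e = [0.0] * N
    e[N - 1] = 1.0
    return solve(Bt, e)

P = [[0.5, 0.5],
     [0.2, 0.8]]
pi = invariant_distribution(P)
print([round(x, 4) for x in pi])  # -> [0.2857, 0.7143]
```

For this hypothetical two-state matrix the exact answer is (2/7, 5/7), which the solver recovers to machine precision.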
3.4 Simple Example

Grid Points Suppose uncertainty is caused by a stochastic shock which follows a known discrete space Markov chain. First, let G denote the grid points for the shock:

>> G = 1:5;

Thus, the grid points are the integers from 1 to 5.

Transition Matrix One can also create a transition matrix for the shock in MATLAB in the following way:

>> rand('state',12356);   % Seed the uniform, pseudo-random number generator.
TM = rand(length(G));
for i=1:length(G);
    TM(i,:) = TM(i,:)./sum(TM(i,:));   % normalize each row to sum to one.
end
TM =

One can interpret the transition matrix in the following way: the (1,1) entry is the probability of staying in state 1, given that the current state is already 1. Similarly, the (4,3) entry is the probability of going to state 3, given that the current state is 4.
Cumulative Distribution Function Now, one can easily compute the cumulative distribution function (CDF) of each row in the following way:

>> CDF = cumsum(TM,2);
CDF =

One can interpret the cumulative distribution function in the following way: the (1,2) entry is the probability that tomorrow's state is less than or equal to 2, given that the current state is 1. Similarly, the (5,3) entry is the probability that tomorrow's state is less than or equal to 3, given that the current state is 5.

Invariant Distribution Now, one can easily compute the invariant distribution (INVD) using the transition matrix in the following way:

>> INVD = invdist(TM);
INVD =

One can interpret the invariant distribution in the following way: the first entry is the long-run probability of being in state 1, regardless of the current state. Similarly, the third entry is the long-run probability of being in state 3, regardless of the current state.
Simulating from the Transition Matrix One can simulate the shock from the Markov chain using the transition matrix for a fixed number of periods (T) in the following way:

>> T = 10;
>> [theta,ug] = simulationmc(G,TM,T);
theta =
ug =

In this setup, theta and ug denote the shock value and the respective probability draw in a given period. One can interpret the simulation results in the following way: starting from state 1 (fixed inside the code), the state in period 1 is theta = 5. This is because, looking at the first row of the CDF, the period-1 probability draw ug falls in the range (0.8751, 1]; thus, the shock value equals 5 in the first period. In the second period, the shock value is theta = 4, as the probability draw ug falls in the corresponding range (0.4369, ] of the fifth row of the CDF.

It is noteworthy that the simulated path theta may change from one trial to another. This is because the pseudo-random number generator is not seeded to a fixed state inside the code simulationmc.m.

Example Example.m in the zip file for Chapter 5 contains the steps carried out above.
4 Continuous Time Continuous Space Stochastic Processes

Since most of the definitions and theorems for discrete time Markov chains extend to continuous time Markov chains, we will instead focus on a particular application of continuous state space processes in economics.

4.1 Approximating a First-Order Autoregressive AR(1) Process with Finite Space Markov Chains

Procedure Suppose that a stochastic shock follows the AR(1) process:

   s' = µ + λ s + ɛ,  with var(ɛ) = σ²_ɛ,  where |λ| < 1.   (1)

Here ɛ is white noise; i.e., ɛ is distributed with mean zero and variance σ²_ɛ. To discretize the AR(1) process, one must assume the process stays within a bounded interval to be able to solve the problem. We use the approach laid out in Tauchen (1986). Specifically, Tauchen (1986) considers an AR(1) process like the one in (1), where µ = 0. Denote by F the distribution function of the standardized innovation ɛ/σ_ɛ. We are going to approximate the continuous process in (1) by a discrete process with N values, where the shock s can take on the values s_1 < ... < s_N. To determine the values s_i, Tauchen recommends the following:

   s_N = m σ_s = m (σ²_ɛ / (1 − λ²))^{1/2},
   s_1 = −s_N,
   s_i = s_1 + (i − 1)(s_N − s_1)/(N − 1)   (i.e., use equidistant spacing).

From this, one can calculate a transition matrix. Define p_ij ≡ Prob(s(t) = s_j | s(t−1) = s_i); this is the element in row i and column j of the transition matrix, so that Σ_j p_ij = 1. For j ∈ [2, N−1], p_ij can be computed as

   p_ij = F((s_j − λ s_i + w/2)/σ_ɛ) − F((s_j − λ s_i − w/2)/σ_ɛ),

where w = s_k − s_{k−1} (note that, given we are using equidistant points, this value is the same for all k). These expressions can be thought of as the probability that λ s_i + ɛ ∈ [s_j − w/2, s_j + w/2].
At the boundaries, the probability of transitioning from state i into state 1 is

   p_i1 = F((s_1 − λ s_i + w/2)/σ_ɛ),

and the probability of going from state i to state N is

   p_iN = 1 − F((s_N − λ s_i − w/2)/σ_ɛ).

This discrete approximation to the conditional distribution of s(t) given s(t−1) converges, in the sense of probability distributions, to the true conditional distribution of the stochastic process articulated in (1).

One practical concern with the above approach is how to deal with negative values of the shock. In the application at hand, a negative shock would mean the firm's technology produces negative output, which does not make much economic sense. To prevent this situation, we transform the shocks by letting s̃ = exp(s), which ensures all values of the shock are positive. Another benefit of this transformation is that the grid becomes finer at the lower end and coarser for high shock values.

Example Example2.m in the zip file for Chapter 5 provides a simple implementation of the algorithm. Please refer to the markovprob.m file below for a way to implement the algorithm.

5 References

Ross, Sheldon M. Introduction to Probability Models. Fifth Edition. San Diego: Academic Press, 1993.

Silos, Pedro. "Assessing Markov-Chain Approximations: A Minimal Econometric Approach." Journal of Economic Dynamics and Control, 30 (2006).

Tauchen, George. "Finite State Markov-Chain Approximations to Univariate and Vector Autoregressions." Economics Letters, 20 (1986).
Implementation Tauchen Algorithm.

Input 3: MATLAB Code for Tauchen Algorithm:

function [Gl, TM, CDF, INVD, sz] = markovprob(mue, p, s, N, m)
% markovprob.m - discretizes an AR(1) process (Tauchen, 1986).
%% Arguments:
% mue = intercept of AR(1) process;
% p   = slope coeff. of AR(1) process;
% s   = std. dev. of residuals in AR(1) process;
% N   = # of grid points for the z variable;
% m   = density of the grid for z variable;
%% CODE:
sz = s / ((1-p^2)^(1/2));          % Std. Dev. of z.
zmin = -m * sz + mue/(1-p);
zmax =  m * sz + mue/(1-p);
z = linspace(zmin,zmax,N);         % Grid Points
%% Transition Matrix:
TM = zeros(N,N);                   % Transition Matrix
w = z(N) - z(N-1);
for j = 1:N;
    TM(j,1) = cdf('norm',(z(1)+w/2-mue-p*z(j))/s,0,1);
    TM(j,N) = 1 - cdf('norm',(z(N)-w/2-mue-p*z(j))/s,0,1);
    for k = 2:N-1;
        TM(j,k) = cdf('norm',(z(k)+w/2-mue-p*z(j))/s,0,1) - ...
                  cdf('norm',(z(k)-w/2-mue-p*z(j))/s,0,1);
    end
end
%% Cumulative Distribution Function:
CDF = cumsum(TM,2);
%% Invariant Distribution:
INVD = invdist(TM)';               % Invariant Distribution
%% Grids:
Gl = exp(z');
fprintf('We treat z as the log of a lognormal shock in the AR(1) process,\n');
fprintf('so the grid is finer at the lower end and coarser at the upper end.\n');
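For completeness, the same discretization can be sketched in Python using only the standard library; the function names `tauchen` and `norm_cdf` and the parameter values are our own, not from the notes, and F is taken to be the standard normal CDF as in Tauchen (1986).

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def tauchen(mu, lam, sigma, N, m):
    """Discretize s' = mu + lam*s + eps, eps ~ N(0, sigma^2), |lam| < 1,
    on N equidistant grid points spanning m unconditional std. devs."""
    sigma_s = sigma / math.sqrt(1.0 - lam ** 2)  # unconditional std. dev.
    mean_s = mu / (1.0 - lam)                    # unconditional mean
    s = [mean_s - m * sigma_s + i * (2 * m * sigma_s) / (N - 1)
         for i in range(N)]
    w = s[1] - s[0]                              # grid spacing
    P = []
    for i in range(N):
        cond_mean = mu + lam * s[i]              # E[s' | s = s_i]
        row = [norm_cdf((s[0] - cond_mean + w / 2) / sigma)]
        for j in range(1, N - 1):
            row.append(norm_cdf((s[j] - cond_mean + w / 2) / sigma)
                       - norm_cdf((s[j] - cond_mean - w / 2) / sigma))
        row.append(1.0 - norm_cdf((s[N - 1] - cond_mean - w / 2) / sigma))
        P.append(row)
    return s, P

s, P = tauchen(mu=0.0, lam=0.9, sigma=0.1, N=5, m=3)
print(all(abs(sum(row) - 1.0) < 1e-10 for row in P))  # rows are distributions
```

Because the interior cells telescope and the two boundary cells absorb the tails, each row sums to one by construction, so the output is a valid transition matrix.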
More informationProbability and Random Variables. Generation of random variables (r.v.)
Probability and Random Variables Method for generating random variables with a specified probability distribution function. Gaussian And Markov Processes Characterization of Stationary Random Process Linearly
More informationHedging Options In The Incomplete Market With Stochastic Volatility. Rituparna Sen Sunday, Nov 15
Hedging Options In The Incomplete Market With Stochastic Volatility Rituparna Sen Sunday, Nov 15 1. Motivation This is a pure jump model and hence avoids the theoretical drawbacks of continuous path models.
More information6 Scalar, Stochastic, Discrete Dynamic Systems
47 6 Scalar, Stochastic, Discrete Dynamic Systems Consider modeling a population of sand-hill cranes in year n by the first-order, deterministic recurrence equation y(n + 1) = Ry(n) where R = 1 + r = 1
More informationMATH10212 Linear Algebra. Systems of Linear Equations. Definition. An n-dimensional vector is a row or a column of n numbers (or letters): a 1.
MATH10212 Linear Algebra Textbook: D. Poole, Linear Algebra: A Modern Introduction. Thompson, 2006. ISBN 0-534-40596-7. Systems of Linear Equations Definition. An n-dimensional vector is a row or a column
More informationContinued Fractions and the Euclidean Algorithm
Continued Fractions and the Euclidean Algorithm Lecture notes prepared for MATH 326, Spring 997 Department of Mathematics and Statistics University at Albany William F Hammond Table of Contents Introduction
More informationGenerating Random Variables and Stochastic Processes
Monte Carlo Simulation: IEOR E4703 Fall 2004 c 2004 by Martin Haugh Generating Random Variables and Stochastic Processes 1 Generating U(0,1) Random Variables The ability to generate U(0, 1) random variables
More informationGenerating Random Variables and Stochastic Processes
Monte Carlo Simulation: IEOR E4703 c 2010 by Martin Haugh Generating Random Variables and Stochastic Processes In these lecture notes we describe the principal methods that are used to generate random
More informationMaximum Likelihood Estimation
Math 541: Statistical Theory II Lecturer: Songfeng Zheng Maximum Likelihood Estimation 1 Maximum Likelihood Estimation Maximum likelihood is a relatively simple method of constructing an estimator for
More informationPrinciple of Data Reduction
Chapter 6 Principle of Data Reduction 6.1 Introduction An experimenter uses the information in a sample X 1,..., X n to make inferences about an unknown parameter θ. If the sample size n is large, then
More informationMATH 4330/5330, Fourier Analysis Section 11, The Discrete Fourier Transform
MATH 433/533, Fourier Analysis Section 11, The Discrete Fourier Transform Now, instead of considering functions defined on a continuous domain, like the interval [, 1) or the whole real line R, we wish
More informationProbability Calculator
Chapter 95 Introduction Most statisticians have a set of probability tables that they refer to in doing their statistical wor. This procedure provides you with a set of electronic statistical tables that
More informationLogNormal stock-price models in Exams MFE/3 and C/4
Making sense of... LogNormal stock-price models in Exams MFE/3 and C/4 James W. Daniel Austin Actuarial Seminars http://www.actuarialseminars.com June 26, 2008 c Copyright 2007 by James W. Daniel; reproduction
More informationEXAM 3, FALL 003 Please note: On a one-time basis, the CAS is releasing annotated solutions to Fall 003 Examination 3 as a study aid to candidates. It is anticipated that for future sittings, only the
More informationQUEUING THEORY. 1. Introduction
QUEUING THEORY RYAN BERRY Abstract. This paper defines the building blocks of and derives basic queuing systems. It begins with a review of some probability theory and then defines processes used to analyze
More informationTenth Problem Assignment
EECS 40 Due on April 6, 007 PROBLEM (8 points) Dave is taking a multiple-choice exam. You may assume that the number of questions is infinite. Simultaneously, but independently, his conscious and subconscious
More information10.2 Series and Convergence
10.2 Series and Convergence Write sums using sigma notation Find the partial sums of series and determine convergence or divergence of infinite series Find the N th partial sums of geometric series and
More informationMarkov Chain Monte Carlo Simulation Made Simple
Markov Chain Monte Carlo Simulation Made Simple Alastair Smith Department of Politics New York University April2,2003 1 Markov Chain Monte Carlo (MCMC) simualtion is a powerful technique to perform numerical
More informationSECOND DERIVATIVE TEST FOR CONSTRAINED EXTREMA
SECOND DERIVATIVE TEST FOR CONSTRAINED EXTREMA This handout presents the second derivative test for a local extrema of a Lagrange multiplier problem. The Section 1 presents a geometric motivation for the
More informationLecture 2 ESTIMATING THE SURVIVAL FUNCTION. One-sample nonparametric methods
Lecture 2 ESTIMATING THE SURVIVAL FUNCTION One-sample nonparametric methods There are commonly three methods for estimating a survivorship function S(t) = P (T > t) without resorting to parametric models:
More informationGenerating Random Numbers Variance Reduction Quasi-Monte Carlo. Simulation Methods. Leonid Kogan. MIT, Sloan. 15.450, Fall 2010
Simulation Methods Leonid Kogan MIT, Sloan 15.450, Fall 2010 c Leonid Kogan ( MIT, Sloan ) Simulation Methods 15.450, Fall 2010 1 / 35 Outline 1 Generating Random Numbers 2 Variance Reduction 3 Quasi-Monte
More informationSection 1.3 P 1 = 1 2. = 1 4 2 8. P n = 1 P 3 = Continuing in this fashion, it should seem reasonable that, for any n = 1, 2, 3,..., = 1 2 4.
Difference Equations to Differential Equations Section. The Sum of a Sequence This section considers the problem of adding together the terms of a sequence. Of course, this is a problem only if more than
More informationLecture L3 - Vectors, Matrices and Coordinate Transformations
S. Widnall 16.07 Dynamics Fall 2009 Lecture notes based on J. Peraire Version 2.0 Lecture L3 - Vectors, Matrices and Coordinate Transformations By using vectors and defining appropriate operations between
More informationCHAPTER 6: Continuous Uniform Distribution: 6.1. Definition: The density function of the continuous random variable X on the interval [A, B] is.
Some Continuous Probability Distributions CHAPTER 6: Continuous Uniform Distribution: 6. Definition: The density function of the continuous random variable X on the interval [A, B] is B A A x B f(x; A,
More informationTwo Topics in Parametric Integration Applied to Stochastic Simulation in Industrial Engineering
Two Topics in Parametric Integration Applied to Stochastic Simulation in Industrial Engineering Department of Industrial Engineering and Management Sciences Northwestern University September 15th, 2014
More informationHydrodynamic Limits of Randomized Load Balancing Networks
Hydrodynamic Limits of Randomized Load Balancing Networks Kavita Ramanan and Mohammadreza Aghajani Brown University Stochastic Networks and Stochastic Geometry a conference in honour of François Baccelli
More informationUnified Lecture # 4 Vectors
Fall 2005 Unified Lecture # 4 Vectors These notes were written by J. Peraire as a review of vectors for Dynamics 16.07. They have been adapted for Unified Engineering by R. Radovitzky. References [1] Feynmann,
More informationCITY UNIVERSITY LONDON. BEng Degree in Computer Systems Engineering Part II BSc Degree in Computer Systems Engineering Part III PART 2 EXAMINATION
No: CITY UNIVERSITY LONDON BEng Degree in Computer Systems Engineering Part II BSc Degree in Computer Systems Engineering Part III PART 2 EXAMINATION ENGINEERING MATHEMATICS 2 (resit) EX2005 Date: August
More information3. Regression & Exponential Smoothing
3. Regression & Exponential Smoothing 3.1 Forecasting a Single Time Series Two main approaches are traditionally used to model a single time series z 1, z 2,..., z n 1. Models the observation z t as a
More informationINDIRECT INFERENCE (prepared for: The New Palgrave Dictionary of Economics, Second Edition)
INDIRECT INFERENCE (prepared for: The New Palgrave Dictionary of Economics, Second Edition) Abstract Indirect inference is a simulation-based method for estimating the parameters of economic models. Its
More informationFor a partition B 1,..., B n, where B i B j = for i. A = (A B 1 ) (A B 2 ),..., (A B n ) and thus. P (A) = P (A B i ) = P (A B i )P (B i )
Probability Review 15.075 Cynthia Rudin A probability space, defined by Kolmogorov (1903-1987) consists of: A set of outcomes S, e.g., for the roll of a die, S = {1, 2, 3, 4, 5, 6}, 1 1 2 1 6 for the roll
More information2WB05 Simulation Lecture 8: Generating random variables
2WB05 Simulation Lecture 8: Generating random variables Marko Boon http://www.win.tue.nl/courses/2wb05 January 7, 2013 Outline 2/36 1. How do we generate random variables? 2. Fitting distributions Generating
More informationSection 6.1 Joint Distribution Functions
Section 6.1 Joint Distribution Functions We often care about more than one random variable at a time. DEFINITION: For any two random variables X and Y the joint cumulative probability distribution function
More informationProbability Theory. Florian Herzog. A random variable is neither random nor variable. Gian-Carlo Rota, M.I.T..
Probability Theory A random variable is neither random nor variable. Gian-Carlo Rota, M.I.T.. Florian Herzog 2013 Probability space Probability space A probability space W is a unique triple W = {Ω, F,
More informationCS 522 Computational Tools and Methods in Finance Robert Jarrow Lecture 1: Equity Options
CS 5 Computational Tools and Methods in Finance Robert Jarrow Lecture 1: Equity Options 1. Definitions Equity. The common stock of a corporation. Traded on organized exchanges (NYSE, AMEX, NASDAQ). A common
More informationTesting against a Change from Short to Long Memory
Testing against a Change from Short to Long Memory Uwe Hassler and Jan Scheithauer Goethe-University Frankfurt This version: December 9, 2007 Abstract This paper studies some well-known tests for the null
More informationa 11 x 1 + a 12 x 2 + + a 1n x n = b 1 a 21 x 1 + a 22 x 2 + + a 2n x n = b 2.
Chapter 1 LINEAR EQUATIONS 1.1 Introduction to linear equations A linear equation in n unknowns x 1, x,, x n is an equation of the form a 1 x 1 + a x + + a n x n = b, where a 1, a,..., a n, b are given
More informationTime Series Analysis
Time Series Analysis Autoregressive, MA and ARMA processes Andrés M. Alonso Carolina García-Martos Universidad Carlos III de Madrid Universidad Politécnica de Madrid June July, 212 Alonso and García-Martos
More informationMath 431 An Introduction to Probability. Final Exam Solutions
Math 43 An Introduction to Probability Final Eam Solutions. A continuous random variable X has cdf a for 0, F () = for 0 <
More informationFinite Differences Schemes for Pricing of European and American Options
Finite Differences Schemes for Pricing of European and American Options Margarida Mirador Fernandes IST Technical University of Lisbon Lisbon, Portugal November 009 Abstract Starting with the Black-Scholes
More informationBig Data Technology Motivating NoSQL Databases: Computing Page Importance Metrics at Crawl Time
Big Data Technology Motivating NoSQL Databases: Computing Page Importance Metrics at Crawl Time Edward Bortnikov & Ronny Lempel Yahoo! Labs, Haifa Class Outline Link-based page importance measures Why
More informationGeneral Sampling Methods
General Sampling Methods Reference: Glasserman, 2.2 and 2.3 Claudio Pacati academic year 2016 17 1 Inverse Transform Method Assume U U(0, 1) and let F be the cumulative distribution function of a distribution
More information1.5 / 1 -- Communication Networks II (Görg) -- www.comnets.uni-bremen.de. 1.5 Transforms
.5 / -- Communication Networks II (Görg) -- www.comnets.uni-bremen.de.5 Transforms Using different summation and integral transformations pmf, pdf and cdf/ccdf can be transformed in such a way, that even
More informationMonte Carlo and Empirical Methods for Stochastic Inference (MASM11/FMS091)
Monte Carlo and Empirical Methods for Stochastic Inference (MASM11/FMS091) Magnus Wiktorsson Centre for Mathematical Sciences Lund University, Sweden Lecture 5 Sequential Monte Carlo methods I February
More informationInstitute of Actuaries of India Subject CT3 Probability and Mathematical Statistics
Institute of Actuaries of India Subject CT3 Probability and Mathematical Statistics For 2015 Examinations Aim The aim of the Probability and Mathematical Statistics subject is to provide a grounding in
More informationTesting against a Change from Short to Long Memory
Testing against a Change from Short to Long Memory Uwe Hassler and Jan Scheithauer Goethe-University Frankfurt This version: January 2, 2008 Abstract This paper studies some well-known tests for the null
More informationPerformance Analysis of a Telephone System with both Patient and Impatient Customers
Performance Analysis of a Telephone System with both Patient and Impatient Customers Yiqiang Quennel Zhao Department of Mathematics and Statistics University of Winnipeg Winnipeg, Manitoba Canada R3B 2E9
More informationMarkov random fields and Gibbs measures
Chapter Markov random fields and Gibbs measures 1. Conditional independence Suppose X i is a random element of (X i, B i ), for i = 1, 2, 3, with all X i defined on the same probability space (.F, P).
More informationSystems of Linear Equations
Systems of Linear Equations Beifang Chen Systems of linear equations Linear systems A linear equation in variables x, x,, x n is an equation of the form a x + a x + + a n x n = b, where a, a,, a n and
More informationSome Research Problems in Uncertainty Theory
Journal of Uncertain Systems Vol.3, No.1, pp.3-10, 2009 Online at: www.jus.org.uk Some Research Problems in Uncertainty Theory aoding Liu Uncertainty Theory Laboratory, Department of Mathematical Sciences
More informationAdaptive Search with Stochastic Acceptance Probabilities for Global Optimization
Adaptive Search with Stochastic Acceptance Probabilities for Global Optimization Archis Ghate a and Robert L. Smith b a Industrial Engineering, University of Washington, Box 352650, Seattle, Washington,
More informationSTAT 315: HOW TO CHOOSE A DISTRIBUTION FOR A RANDOM VARIABLE
STAT 315: HOW TO CHOOSE A DISTRIBUTION FOR A RANDOM VARIABLE TROY BUTLER 1. Random variables and distributions We are often presented with descriptions of problems involving some level of uncertainty about
More informationCentre for Central Banking Studies
Centre for Central Banking Studies Technical Handbook No. 4 Applied Bayesian econometrics for central bankers Andrew Blake and Haroon Mumtaz CCBS Technical Handbook No. 4 Applied Bayesian econometrics
More information