Applications of Matrix Theory: Markov Chains


Chapter 2  Applications of Matrix Theory: Markov Chains

2.1  Introduction to Markov chains

All that is required of probability theory is the simple notion that the probability of an outcome is the frequency with which that outcome will occur. Thus, in the toss of a fair die, we would attach probability 1/6 to the occurrence of each face.

Problem 2.1.1  Ace Taxi divides a city into two zones: A and B. If a driver picks up a client in A, he will drop them off in A with probability .25 and drop them off in B with probability .75. If a driver picks up a client in zone B, he will drop them off in zone A with chance .4 and drop them off in zone B with chance .6. The company rules dictate that a driver must remain in a given zone for the next client. Given that the driver starts in zone A, compute the probabilities that the driver will be in zones A and B after two clients have been served. Given that the driver starts in zone B, compute the probabilities that the driver will be in zones A and B after two clients have been served.

Solution  We can arrange the information in a transition diagram; see Figure 2.1. The transition diagram carries all of the probabilistic information concerning the movement of the taxis.

[Figure 2.1: A transition diagram, with states A and B and transition probabilities A→A: .25, A→B: .75, B→A: .4, B→B: .6]

We can make tree diagrams to aid us in calculating the probabilities. Starting from A, we can move to A or to B. From each of those nodes, the same two alternatives are still available; see Figure 2.2.

[Figure 2.2: A tree diagram rooted at A, with branch probabilities A→A→A: .25 × .25 = .0625, A→A→B: .25 × .75 = .1875, A→B→A: .75 × .4 = .3, A→B→B: .75 × .6 = .45]

Each branch of the tree corresponds to a distinct outcome of the experiment. The probability of that outcome is given by multiplying the probabilities along the branches. Given that the taxi starts in zone A, the taxi will be in zone A after two steps if either of the outcomes AAA or ABA occurs. Accordingly, the chance that the taxi is in zone A after two steps is .0625 + .3 = .3625. Likewise, given that the taxi begins in zone A, the taxi will be in zone B after two steps with chance .1875 + .45 = .6375. Notice that these alternatives add to 1. A similar calculation would show that, given that the taxi starts in zone B, the taxi will be in zone A after two steps with chance .34 and in zone B after two steps with chance .66. We will leave it to the reader to provide the necessary tree diagram for this calculation.

Definition 2.1.1  Let S = {s_1, s_2, ..., s_n} be a set. Each element of S is called a state, and S is called the state space. A Markov chain on S is a random process in discrete time whose value at any time is an element of S. The probability that the chain makes a transition from state s_i to state s_j depends only on s_i and s_j, and not on any of the preceding states. In other words, the future depends only upon the present state of the chain and not on the past states of the chain.

Example 2.1.1  In the solution to Problem 2.1.1, the state space was S = {A, B}. The probabilities of transition were codified in the transition diagram; see Figure 2.1.

Our next theorem demonstrates how matrix theory can be utilized to compute probabilities.

Definition 2.1.2  Let the states of a Markov chain be labeled 1, 2, ..., n. Given two states i and j, let

    p_ij = the probability of making a transition from state i to state j.

The transition matrix P for the Markov chain is the matrix whose (i, j) entry is p_ij, that is, P = ((p_ij)).

Example 2.1.2  Recall the Markov chain from Problem 2.1.1. Let state A be labeled 1 and state B be labeled 2. The transition matrix is

    P = [ .25  .75 ]
        [ .4   .6  ]

Observe that each of the row sums of the transition matrix in Example 2.1.2 is 1. This is true in general.

Theorem 2.1.1  If P is a transition matrix for a Markov chain, then each row of P sums to 1.

Proof  Let the state space be {1, 2, ..., n}. Consider state i in the transition diagram for this Markov chain. Since the Markov chain must make a transition to one of the n states with probability 1, the sum total of all of the transition probabilities out of state i must add to 1. This implies that

    p_i1 + p_i2 + ... + p_in = 1.

But this concludes our proof, since this is nothing more than the sum of the elements of the ith row of P.
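The tree-diagram arithmetic of Problem 2.1.1 and the row-sum property of Theorem 2.1.1 are easy to check numerically. Here is a minimal sketch using NumPy (the tool is our own choice; the text itself works by hand):

```python
import numpy as np

# Transition matrix for the Ace Taxi chain (state 1 = zone A, state 2 = zone B).
P = np.array([[0.25, 0.75],
              [0.40, 0.60]])

# Theorem 2.1.1: each row of a transition matrix sums to 1.
print(P.sum(axis=1))   # [1. 1.]

# Multiplying P by itself reproduces the two-step tree calculation.
P2 = P @ P
print(P2)
# Row 1 (start in A): approximately [.3625 .6375]
# Row 2 (start in B): approximately [.34   .66  ]
```

The matrix product sums over the intermediate zone exactly as the tree diagram does, which is the content of Theorem 2.1.2 below.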

Table 2.1: U-Haul Statistics

    Pick-up   Delivery   Percent
       1         1         30
       1         2         20
       1         3         50
       2         1         40
       2         2         20
       2         3         40
       3         1         50
       3         2         20
       3         3         30

Problem 2.1.2  A U-Haul franchise services three cities, labeled 1, 2, and 3. Table 2.1 shows the percentage of the time that a van is picked up in city i and delivered to city j. The movements of an individual moving van can be modeled by a Markov chain. (1) Construct the transition diagram for this Markov chain. (2) Find the transition matrix for this Markov chain. (3) If a van starts in state 1, what is the chance that it is in state 3 after 2 steps?

Solution  The transition diagram is given in Figure 2.3. The transition matrix is

    P = [ .3  .2  .5 ]
        [ .4  .2  .4 ]
        [ .5  .2  .3 ]

[Figure 2.3: A transition diagram on the states 1, 2, and 3, with the probabilities from P]

Finally, we can calculate the probability of starting in state 1 and being in state 3 after two steps by a tree diagram. The relevant branches are

    1 → 1 → 3 : chance .15
    1 → 2 → 3 : chance .08
    1 → 3 → 3 : chance .15

Thus the probability in question is .15 + .08 + .15 = .38.

Multi-step transitions

Definition 2.1.3  Let P be a transition matrix for a Markov chain on a state space labeled S = {1, 2, ..., n}. (1) For each integer k ≥ 1, let p_ij(k) denote the probability of a transition from i to j in k steps. (2) For each integer k ≥ 1, let P(k) = ((p_ij(k))) denote the matrix of k-step probabilities; P(k) is called the k-step transition matrix.

Theorem 2.1.2  Let P be the transition matrix for a Markov chain. Then the k-step transition matrix P(k) is given by P(k) = P^k.

Proof  Let us assume that the state space is S = {1, 2, ..., n}. We will proceed by induction on k. It is clear that P(1) = P^1, which proves the base case. Let us assume that P(k) = P^k for a positive integer k. Let i, j ∈ S. To compute p_ij(k + 1), we will partition this event according to the state of the Markov chain at time k. We can organize our work in a tree diagram, as in Figure 2.4.

[Figure 2.4: A multi-step tree: from state i, the chain reaches intermediate state m in k steps with probability p_im(k), and then moves to j with probability p_mj, for m = 1, 2, ..., n]

Thus

    [P(k + 1)]_ij = p_ij(k + 1)
                  = p_i1(k) p_1j + p_i2(k) p_2j + ... + p_in(k) p_nj
                  = [P(k) P]_ij,

which proves our claim.

Example 2.1.3  Consider the transition matrix

    P = [ .3  .7 ]
        [ .6  .4 ]

Note that

    P^2 = [ .51  .49 ]        and        P^8 = [ .46157  .53843 ]
          [ .42  .58 ]                         [ .46157  .53843 ]

We may conclude that: (1) the probability of making a transition from state 1 to state 2 in 2 steps is approximately .49; (2) the probability of making a transition from state 1 to state 2 in 8 steps is approximately .5384.

Example 2.1.4  Recall the transition matrix from Problem 2.1.2:

    P = [ .3  .2  .5 ]
        [ .4  .2  .4 ]
        [ .5  .2  .3 ]

A simple calculation reveals that

    P(2) = P^2 = [ .42  .20  .38 ]
                 [ .4   .2   .4  ]
                 [ .38  .2   .42 ]

This shows that p_13(2) = .38, which is the same answer we arrived at above.

Exercises

1. A Markov chain has transition matrix

       P = [ .2  .3  .5 ]
           [ .3  .1  .6 ]
           [ .7  .1  .2 ]

   Assume that the chain begins in state 2. Find the probability that after two steps the chain will be in state 3. Calculate this in two ways: first, by a tree diagram; second, by calculating P(2).

2. A Markov chain has transition matrix

       P = [ 0   1  ]
           [ .3  .7 ]

   Given that the Markov chain begins in state 1, what is the probability that the chain will be in state 1 after 8 steps?

3. In Greenville it is either raining or shining. If it is raining today, it will be raining tomorrow with chance .4 and shining with chance .6; if it is shining today, it will be raining tomorrow with chance .2 and shining with chance .8. Given that it is raining today, what are the respective chances for rain and shine in 6 days? In 20 days?

4. Each morning I eat one of three breakfast cereals: Cheerios, Special K, or Corn Flakes. I will stick with the cereal I ate on the previous morning 80% of the time, but when I do switch, I choose from among the other two with equal chance. Given that I am eating Cheerios today, what is the chance that I will be eating Corn Flakes for breakfast in 10 days? What am I most likely eating for breakfast in 10 days?

5. It is well known that the son of a Clemson graduate will go to Clemson with chance .9 and to Furman with chance .1. Likewise, the son of a Furman man will go to Clemson with chance .3 and to Furman with chance .7. Given that a man is a Furman graduate, what is the chance that his great-great grandson will attend Furman?

6. On any given day my mood can be in one of three states: happy, sad, or morose. If I am happy on a given day, then on the next day I will be happy 80% of the time, sad 15% of the time, and morose 5% of the time. If I am sad on a given day, then on the next day I will be happy 50% of the time, sad 20% of the time, and morose 30% of the time. If I am morose on a given day, then on the next day I will be happy 40% of the time, sad 40% of the time, and morose 20% of the time. Given that I am sad today, find the probability that I will be in each of the three moods in 5 days.

7. Let

       P = [ 1   0   0  ]
           [ .3  .4  .3 ]
           [ .4  .5  .1 ]

   What is the influence of the 1 in row 1? Formulate a conjecture by examining the behavior of P(n) as n increases.

8. Find n-step transition matrices for

       P = [ 1  0 ]        and        Q = [ 0  1 ]
           [ 0  1 ]                       [ 1  0 ]

   Interpret the associated Markov processes in each case.

9. The diffusion of a gas in a chamber can be modeled by a Markov chain. We will examine a very simple case of this model. Imagine that we have two buckets and three balls. We will distinguish one of the buckets by coloring it red, and we will begin our experiment with all of the balls in the red bucket. Each hour one of the balls is chosen at random, removed from its bucket, and placed in the other. Let the state of the Markov chain be the number of balls in the red bucket. Find the state space and transition matrix for this model. After 5 hours, what is the most likely state of the chain?

10. A mother has many children. The children want to know if they can stay out and play. The mother tells A either yes or no, then A tells B, then B tells C, and so on. Unfortunately these children are not good listeners: 10% of the time they hear the opposite of what they were told. What is the chance that the 10th child hears what his mother said?
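By Theorem 2.1.2, every k-step probability asked for in these exercises is an entry of a matrix power, so the hand calculations can be checked with a few lines of code. A sketch using NumPy and the matrix of Example 2.1.3 (the helper name `k_step` is our own):

```python
import numpy as np
from numpy.linalg import matrix_power

def k_step(P, k):
    """Return the k-step transition matrix P(k) = P^k (Theorem 2.1.2)."""
    return matrix_power(np.asarray(P), k)

# The matrix of Example 2.1.3.
P = [[0.3, 0.7],
     [0.6, 0.4]]

print(k_step(P, 2))   # approximately [[.51 .49], [.42 .58]]
print(k_step(P, 8))   # both rows approach [.46157 .53843]
```

Replacing P by the matrix of an exercise and reading off the appropriate entry reproduces each of the answers below.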

Answers

1. p_23(2) = .33.

2. p_11(8) ≈ .23082.

3. After 6 days, it will be raining with chance .25 and shining with chance .75. The same statistics hold after 20 days.

4. .32392; I am most likely eating Cheerios, but the margin is slim.

5. .132.

6. Happy: .69461; sad: .18940; morose: .11599.

8. P(n) = P^n = P; and Q(n) = Q^n = P if n is even and Q if n is odd.

9. The state space is S = {0, 1, 2, 3}, according to the number of balls that can be in the red bucket. The transition matrix is

       [ 0    1    0    0   ]
       [ 1/3  0    2/3  0   ]
       [ 0    2/3  0    1/3 ]
       [ 0    0    1    0   ]

   After 5 hours, it is most likely that the red bucket contains 2 balls; this occurs with chance 61/81 ≈ .75. (Note that after an odd number of hours the red bucket must contain an even number of balls, since the count changes by one each hour and starts at 3.)

10. There are two states: let yes be labeled 1 and no be labeled 2. Let the state of the chain at time n be the message that the nth child hears. The transition matrix is

        P = [ .9  .1 ]
            [ .1  .9 ]

    The 10th child has chance .55369 of hearing what his mother said.
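Answers such as these can be confirmed with `numpy.linalg.matrix_power`. A brief sketch for answers 1 and 10 (the variable names are our own):

```python
import numpy as np
from numpy.linalg import matrix_power

# Exercise 1: three-state chain, started in state 2.
P1 = np.array([[0.2, 0.3, 0.5],
               [0.3, 0.1, 0.6],
               [0.7, 0.1, 0.2]])
print(matrix_power(P1, 2)[1, 2])    # p_23(2) = .33

# Exercise 10: the message passes through 10 noisy transmissions
# (mother -> 1st child, 1st -> 2nd, ..., 9th -> 10th child).
P10 = np.array([[0.9, 0.1],
                [0.1, 0.9]])
print(matrix_power(P10, 10)[0, 0])  # approximately .55369
```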