Practice Problems for Homework #8. Markov Chains. Read Sections 7.1-7.3. Solve the practice problems below.




Open Homework Assignment #8 and solve the problems.

1. (10 marks) A computer system can operate in two different modes. Every hour, it either remains in the same mode or switches to a different mode according to the transition probability matrix

P = [ 0.4  0.6
      0.6  0.4 ]

a) Compute the 2-step transition probability matrix.

b) If the system is in Mode I at 5:30 pm, what is the probability that it will be in Mode I at 8:30 pm on the same day?

2. (10 marks) The pattern of sunny and rainy days on the planet Rainbow is a homogeneous Markov chain with two states. Every sunny day is followed by another sunny day with probability 0.8. Every rainy day is followed by another rainy day with probability 0.6.

a) Today is sunny on Rainbow. What is the chance of rain the day after tomorrow?

b) Compute the probability that April 1 next year is rainy on Rainbow.

3. (10 marks) A computer device can be either in a busy mode (state 1), processing a task, or in an idle mode (state 2), when there are no tasks to process. In a busy mode, it finishes the task and enters an idle mode in any given minute with probability 0.2; thus, with probability 0.8 it stays in a busy mode for another minute. In an idle mode, it receives a new task in any given minute with probability 0.1 and enters a busy mode; thus, it stays in an idle mode for another minute with probability 0.9. The initial state is idle. Let X_n be the state of the device after n minutes.

a) Find the distribution of X_2.

b) Find the steady-state distribution of X_n.

4. (10 marks) A system has three possible states, 0, 1, and 2. Every hour it makes a transition to a different state, determined by a coin flip. For example, from state 0 it makes a transition to state 1 or to state 2 with probabilities 0.5 and 0.5.

a) Find the transition probability matrix.

b) Find the three-step transition probability matrix.

c) Find the steady-state distribution of the Markov chain.

5. (10 marks) A certain machine is in one of four possible states: 0 = working without a problem; 1 = working but in need of minor repair; 2 = working but in need of major repair; 3 = out of order. The corresponding transition probability matrix is

P = [ 0.80  0.14  0.04  0.02
      0     0.60  0.30  0.10
      0     0     0.65  0.35
      0.90  0     0     0.10 ]

a) Verify that it is a regular Markov chain.

b) Find the steady-state distribution.

6. (10 marks) Consider a game of ladder climbing. There are 5 levels in the game; level 1 is the lowest (bottom) and level 5 is the highest (top). A player starts at the bottom. Each time, a fair coin is tossed. If it turns up heads, the player moves up one level. If tails, the player moves down to the very bottom. Once at the top level, the player moves to the very bottom if a tail turns up and stays at the top if a head turns up.

a) Find the transition probability matrix.

b) Find the two-step transition probability matrix.

c) Find the steady-state distribution of the Markov chain.

7. (10 marks) (Sec 7.2, page 346, #2) We observe the state of a system at discrete points in time. The system is observed only when it changes state. Define X_n as the state of the system after the nth state change, so that X_n = 0 if the system is running, X_n = 1 if the system is under repair, and X_n = 2 if the system is idle. Assume that the matrix P is

P = [ 0  0.5  0.5
      1  0    0
      1  0    0 ]

Compute the matrix P^n for all possible n.

8. (10 marks) (Sec 7.3, page 355, #1) Consider a system with two components. We observe the state of the system every hour. A component operating at time n has probability p of failing before the next observation at time n + 1. A component that was in a failed condition at time n has probability r of being repaired by time n + 1, independent of how long the component has been in a failed state. The component failures and repairs are mutually independent events. Let X_n be the number of components in operation at time n. The process {X_n, n = 0, 1, ...} is a discrete-time homogeneous Markov chain with state space I = {0, 1, 2}.

a) Determine its transition probability matrix, and draw the state diagram.

b) Obtain the steady-state probability vector, if it exists.

Solutions:

1. a) P(2) = P·P = [ 0.52  0.48
                     0.48  0.52 ]

b) There are 3 transitions between 5:30 and 8:30, so we need to compute p(3)_11. The 3-step transition probability matrix is

P(3) = P(2)·P = [ 0.496  ...
                  ...    ... ]

(there is no need to compute the entire matrix). Hence, p(3)_11 = 0.496.

2. a) Let sunny be state 1 and rainy be state 2. The transition probability matrix is

P = [ 0.8  0.2
      0.4  0.6 ]

We need to find the 2-step transition probability p(2)_12. The 2-step transition probability matrix is

P² = [ ...  0.28
       ...  ...  ]

Only the element p(2)_12 = (0.8)(0.2) + (0.2)(0.6) = 0.28 needs to be computed. The probability that the day after tomorrow is rainy is 0.28.

b) April 1 next year is so many transitions away that we can use the steady-state distribution. To find it, we solve the system πP = π together with π1 + π2 = 1:

0.8π1 + 0.4π2 = π1
0.2π1 + 0.6π2 = π2
π1 + π2 = 1

The first equation gives π1 = 2π2, and the second equation gives the same relation again (as expected, one equation always follows from the others). Substituting π1 = 2π2 into the last equation, we get 2π2 + π2 = 1. From here, π2 = 1/3 and π1 = 2/3. Hence, the probability that April 1 next year is rainy is π2 = 1/3.
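The matrix powers and the steady state in Solutions 1 and 2 are easy to double-check numerically. The sketch below uses plain Python with no libraries; the helper names matmul and matpow are our own, not from the textbook.

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def matpow(P, n):
    """n-th power of a square matrix (n >= 1) by repeated multiplication."""
    R = P
    for _ in range(n - 1):
        R = matmul(R, P)
    return R

# Problem 1: two operating modes.
P1 = [[0.4, 0.6],
      [0.6, 0.4]]
print(matpow(P1, 2))          # 2-step matrix, approx [[0.52, 0.48], [0.48, 0.52]]
print(matpow(P1, 3)[0][0])    # p(3)_11, approx 0.496

# Problem 2: sunny (state 1) / rainy (state 2) on planet Rainbow.
P2 = [[0.8, 0.2],
      [0.4, 0.6]]
print(matpow(P2, 2)[0][1])    # p(2)_12, approx 0.28

# Steady state: iterate the chain from "sunny" until it converges.
pi = [1.0, 0.0]
for _ in range(100):
    pi = matmul([pi], P2)[0]
print(pi)                     # approaches (2/3, 1/3)
```

Iterating the distribution is a quick numerical alternative to solving πP = π by hand; for a regular chain both give the same answer.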

3. The transition probability matrix is

P = [ 0.8  0.2
      0.1  0.9 ]

Also, X_0 = 2 (idle mode) with probability 1, i.e., the initial distribution is P0 = (0, 1).

a) P2 = P0·P² = (0, 1) [ 0.8  0.2 ]² = (0.17, 0.83).
                       [ 0.1  0.9 ]

Thus X_2, the state after 2 transitions, is busy with probability 0.17 and idle with probability 0.83.

b) To find the steady-state distribution, we solve the system πP = π together with π1 + π2 = 1:

0.8π1 + 0.1π2 = π1
0.2π1 + 0.9π2 = π2
π1 + π2 = 1

The first equation gives π2 = 2π1, and the second equation gives the same relation again. Substituting π2 = 2π1 into the last equation, we get π1 + 2π1 = 1. From here, π1 = 1/3 and π2 = 2/3. In the steady state, P{X_n = busy} = 1/3 and P{X_n = idle} = 2/3.

4. a) Every hour the system moves to each of the two other states with probability 0.5, so

P = [ 0    0.5  0.5
      0.5  0    0.5
      0.5  0.5  0   ]
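Solution 3 can be verified the same way by propagating the initial distribution through the chain; this is an illustrative plain-Python sketch (the helper name step is our own), not part of the required solution.

```python
def step(pi, P):
    """One step of the chain: row vector pi times transition matrix P."""
    return [sum(pi[i] * P[i][j] for i in range(len(pi)))
            for j in range(len(P[0]))]

P = [[0.8, 0.2],   # busy -> (busy, idle)
     [0.1, 0.9]]   # idle -> (busy, idle)

pi = [0.0, 1.0]    # initial state: idle with probability 1
for _ in range(2):
    pi = step(pi, P)
print(pi)          # distribution of X_2, approx (0.17, 0.83)

# Iterating further approaches the steady state (1/3, 2/3).
for _ in range(200):
    pi = step(pi, P)
print(pi)
```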

b) P³ = [ 0.250  0.375  0.375
          0.375  0.250  0.375
          0.375  0.375  0.250 ]

c) Solve the system πP = π along with the normalizing condition π1 + π2 + π3 = 1:

0.5π2 + 0.5π3 = π1
0.5π1 + 0.5π3 = π2
0.5π1 + 0.5π2 = π3
π1 + π2 + π3 = 1

The first three equations say π2 + π3 = 2π1, π1 + π3 = 2π2, and π1 + π2 = 2π3; using the normalizing condition, these become 1 - π1 = 2π1, 1 - π2 = 2π2, and 1 - π3 = 2π3, so π1 = π2 = π3 = 1/3. The steady-state probability distribution is π = (1/3, 1/3, 1/3).

5. a) Compute

P³ = [ 0.5678  0.2097  0.1501  0.0724
       0.2295  0.2286  0.3554  0.1866
       0.4882  0.0441  0.2872  0.1804
       0.6732  0.1890  0.0936  0.0442 ]

It has no zeros, so this Markov chain is regular.

No desire to compute this matrix? You can also prove regularity by looking at the state transition diagram. Possible transitions are 1→1, 1→2, 1→3, 1→4; 2→2, 2→3, 2→4; 3→3, 3→4; 4→1, 4→4. So, we can follow the full circle 1→2→3→4→1 and reach any state from any other state in at most 3 transitions. If we reach the needed state too early, we just make extra steps from that state back to itself along its self-loop. So, in 3 steps, all transitions are possible, and the Markov chain is regular.

b) Solve the system πP = π along with the normalizing condition Σ πi = 1. The steady-state probability vector is π = (0.50, 0.18, 0.21, 0.11).
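Both parts of Solution 5 can be checked numerically: P³ should have no zero entries, and a high power of P should have every row close to the steady-state vector. A plain-Python sketch (the helper name matmul is our own):

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# Transition matrix for the four-state machine of Problem 5.
P = [[0.80, 0.14, 0.04, 0.02],
     [0.00, 0.60, 0.30, 0.10],
     [0.00, 0.00, 0.65, 0.35],
     [0.90, 0.00, 0.00, 0.10]]

P3 = matmul(matmul(P, P), P)
print(all(x > 0 for row in P3 for x in row))   # True: no zeros, chain is regular

# High power of P: every row converges to the steady-state vector.
Pn = P
for _ in range(200):
    Pn = matmul(Pn, P)
print([round(x, 2) for x in Pn[0]])            # approx [0.5, 0.18, 0.21, 0.11]
```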

6. a) P = [ 0.5  0.5  0    0    0
            0.5  0    0.5  0    0
            0.5  0    0    0.5  0
            0.5  0    0    0    0.5
            0.5  0    0    0    0.5 ]

b) P² = [ 0.5  0.25  0.25  0     0
          0.5  0.25  0     0.25  0
          0.5  0.25  0     0     0.25
          0.5  0.25  0     0     0.25
          0.5  0.25  0     0     0.25 ]

c) Solve the system πP = π along with the normalizing condition Σ πi = 1:

0.5(π1 + π2 + π3 + π4 + π5) = π1   →   π1 = (0.5)(1) = 1/2
0.5π1 = π2                          →   π2 = (0.5)π1 = 1/4
0.5π2 = π3                          →   π3 = (0.5)π2 = 1/8
0.5π3 = π4                          →   π4 = (0.5)π3 = 1/16
0.5π4 + 0.5π5 = π5                  →   π5 = π4 = 1/16

π = (1/2, 1/4, 1/8, 1/16, 1/16) is the steady-state probability distribution vector.

7. Possible transitions are:

0 → 1 (with probability 0.5)
0 → 2 (with probability 0.5)
1 → 0 (with probability 1)
2 → 0 (with probability 1)

This means: from states 1 and 2, the system surely goes to state 0, and from state 0, it is equally likely to go to either 1 or 2. Thus, starting from 0, after any odd number of steps the system is at 1 or 2 with probabilities 0.5 and 0.5, and after any even number of steps it is back at 0 with probability 1. Therefore, for even n ≥ 2 we have

P^n = [ 1  0    0
        0  0.5  0.5
        0  0.5  0.5 ]

and for odd n we have

P^n = P = [ 0  0.5  0.5
            1  0    0
            1  0    0 ]

8. a) The transition probability matrix is

P = [ (1-r)²   2r(1-r)           r²
      p(1-r)   pr + (1-p)(1-r)   r(1-p)
      p²       2p(1-p)           (1-p)² ]

where the rows and columns correspond to the states 0, 1, and 2.

b) To compute the steady-state probabilities, one can of course solve the linear system. By doing this, one can see that the probabilities are

π0 = [p/(p+r)]²,  π1 = 2[r/(p+r)][p/(p+r)],  π2 = [r/(p+r)]².

One may also get this answer intuitively. Each component spends an average time of 1/p functioning between repairs and an average time of 1/r on each repair. Thus, the proportion of time it is functioning is (1/p)/(1/p + 1/r) = r/(p + r). That is, in the long run each component, independently of the other, is functioning with probability r/(p + r) and not functioning with probability p/(p + r). Hence, the limiting distribution of X_n is Binomial with parameters 2 and r/(p + r). Check this! Take the vector of these Binomial probabilities, substitute it into our linear system, and see that it is satisfied.