Answers to Exercises in Chapter 5 - Markov Processes


M. J. Roberts

5-1. Find the state transition matrix P for the Markov chain below.

[Markov chain diagram; neither the diagram's transition probabilities nor the entries of the solution matrix P survived this transcription.]

5-2. In a discrete-time Markov chain there are two states, 1 and 2. When the system is in state 1 it stays in that state with probability .4. When the system is in state 2 it transitions to state 1 with probability .8. Graph the Markov chain and find the state transition matrix P.

[Markov chain graph: a self-loop of .4 at state 1, an edge of .6 from state 1 to state 2, an edge of .8 from state 2 to state 1, and a self-loop of .2 at state 2.]

P = | .4  .6 |
    | .8  .2 |

5-3. An instrumentation data-acquisition system acquires a block of data from each of two redundant temperature sensors every 5 ms. The sensors are numbered 1 and 2. Each 5 ms period the computer scans the data from one of the two sensors for the presence of data spikes caused by impulse electromagnetic interference (EMI). If impulse EMI is present, the computer rejects the data from that sensor and examines the data from the other sensor. If the other sensor also has impulse EMI, the computer waits through the next two data-acquisition periods without taking data and then starts acquiring data again from the first sensor. Graph this system using the probabilities of impulse EMI below and write the state transition matrix P.

[Table of impulse-EMI probabilities by sensor; the table layout did not survive this transcription (fragments: .5, .8, .95, .5, .9, .8).]

[Solution: Markov chain graph (fragments of its edge labels: .95, .5, .9, .8) and the state transition matrix P; neither survived this transcription intact.]
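A quick way to check a small answer like the matrix in 5-2 is to type it in and confirm it is a valid (row-stochastic) transition matrix. A minimal Python sketch; the 0.4 and 0.8 come from the problem statement, and the remaining entries follow from each row summing to one:

```python
import numpy as np

# Exercise 5-2: state 1 stays put with probability 0.4 (so it leaves with 0.6);
# state 2 jumps to state 1 with probability 0.8 (so it stays with 0.2).
P = np.array([[0.4, 0.6],
              [0.8, 0.2]])

# Every row of a transition matrix must sum to 1 and contain no negative entries.
assert np.allclose(P.sum(axis=1), 1.0) and (P >= 0).all()
print(P)
```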

5-4. Find the n-step transition matrix P^n for the Markov chain of Exercise 5-2.

P^n = (1/1.4) | .8 + .6(-.4)^n   .6 - .6(-.4)^n |
              | .8 - .8(-.4)^n   .6 + .8(-.4)^n |

5-5. A marksman is shooting at a target. Every time he hits the target his confidence goes up, and his probability of hitting the target the next time is .9. Every time he misses the target his confidence falls, and he hits the target the next time with probability .6. Over a long time, what is his average success rate in hitting the target?

.857

5-6. Find the stationary probability vector for the Markov chain of Exercise 5-1.

[The numeric stationary vector did not survive this transcription (fragments: .797, .9, .6574, .9).]

5-7. A random-walk Markov chain has K positions. The state transitions are

         1-p,  i = j = 0
         p,    j = i+1,  i = 0, 1, ..., K-1
P_ij =   p,    i = j = K
         1-p,  j = i-1,  i = 1, 2, ..., K
         0,    otherwise

Sketch the Markov chain and find the stationary probability vector.

[Markov chain sketch: states 0, 1, ..., K in a line, with probability p on each rightward edge, 1-p on each leftward edge, a self-loop of 1-p at state 0 and a self-loop of p at state K.]

Done as Example -7 in the Markov chain notes for ECE 54 (below).

pi_i = (1 - p/(1-p)) (p/(1-p))^i / (1 - (p/(1-p))^(K+1)),   i = 0, 1, ..., K
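These results are easy to sanity-check numerically. The Python sketch below uses the Exercise 5-2 matrix to verify the n-step expression given for 5-4, and the two-state marksman chain of 5-5 to confirm the .857 long-run hit rate; the eigenvector routine is a generic way of getting a stationary vector and is not specific to this text:

```python
import numpy as np

def stationary(P):
    """Left eigenvector of P for eigenvalue 1, normalized to sum to 1."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    return pi / pi.sum()

# Exercise 5-2 chain.
P = np.array([[0.4, 0.6],
              [0.8, 0.2]])

# Exercise 5-4: compare the matrix power with the closed form given above.
n = 6
closed = (1 / 1.4) * np.array(
    [[0.8 + 0.6 * (-0.4) ** n, 0.6 - 0.6 * (-0.4) ** n],
     [0.8 - 0.8 * (-0.4) ** n, 0.6 + 0.8 * (-0.4) ** n]])
assert np.allclose(np.linalg.matrix_power(P, n), closed)

# Exercise 5-5: marksman chain with states (hit, miss); long-run hit rate is 6/7.
M = np.array([[0.9, 0.1],
              [0.6, 0.4]])
print(stationary(M)[0])   # ~0.857
```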

5-8. At an airport security checkpoint there are a metal detector, an explosive residue detector, and a very long line of people waiting to get checked through. (Assume the line is so long that we can consider it infinitely long for this exercise. Also assume that a person may not enter the metal detector until there is no one waiting to enter the explosive detector.) Every traveler must go through these two checks, metal detector first, followed by the explosive detector. Assume that the number of seconds required for the metal detector is geometrically distributed with an expected time of [missing] seconds and that the number of seconds required for the explosive detector is also geometrically distributed with an expected time of [missing] seconds. Graph the Markov chain describing the states of the metal detector and explosive detector.

(a) What is the probability that each detector is in use?

(b) What is the probability that both detectors are simultaneously in use?

[The numeric stationary probabilities did not survive this transcription; the fragments read ".4 + .985 = .48" for the metal detector in use, "67 + .985 = .8657" for the explosive detector in use, and ".985" for both in use.]

5-9. Repeat Exercise 5-8 under the assumption that each detector is equally likely to finish in exactly [missing] seconds or exactly [missing] seconds. What is the probability that both detectors are busy?

5/7

5-10. Consider an irreducible Markov chain. Prove that if the chain is periodic, then P_ii = 0 for all states i. Is the converse also true? If P_ii = 0 for all i, is the irreducible chain periodic?

The converse is not true.

5-11. Consider a discrete random walk with state space {0, 1, 2, ...}. In state 0 the system can remain in state 0 with probability 1-p or go to state 1 with probability p. In states i > 0, the system can go to state i-1 with probability 1-p or to state i+1 with probability p. Sketch the Markov chain and find the stationary probabilities.

[Markov chain sketch: states 0, 1, 2, ... in a line, with a self-loop of 1-p at state 0, probability p on each rightward edge and 1-p on each leftward edge.]

pi_i = ((1 - 2p)/(1 - p)) (p/(1 - p))^i,   i = 0, 1, 2, ...,   for p < 1/2.

If p >= 1/2, the limiting state probabilities do not exist.

5-12. A hotel reservation service has a toll-free number people can call to make reservations. All their reservation clerks are sick except one, who is taking all calls. At the end of each one-second interval the customer currently receiving service completes his/her reservation process with probability q, independent of the amount of time he/she has been on the line; otherwise he/she stays on the line. Also, at the end of each one-second interval a new customer calls the reservation number with probability p, independent of the number of people already waiting for service and the amount of time the current customer has been on the line, and, if the reservation clerk is busy, is put on hold in a first-in-first-out queue until the reservation clerk is available. Let the number of people waiting to be served be the state of the system and graph the Markov chain. What are the stationary probabilities, if they exist, and what are the conditions under which they exist?

pi_0 = (q - p)/q   and   pi_i = ((q - p)/q) (p/(q(1 - p))) (p(1 - q)/(q(1 - p)))^(i-1),   i > 0,   provided p < q.

If p >= q, the series does not converge and the stationary probabilities do not exist.
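A numerical check of the 5-12 result: build a truncated version of the transition matrix the problem describes, under one natural reading of the arrival/completion timing, and compare its stationary vector with the closed-form expression above. The values p = 0.2 and q = 0.5 are arbitrary illustrative choices, and the truncation level N is a convenience, not part of the original problem:

```python
import numpy as np

p, q = 0.2, 0.5   # assumed arrival / service-completion probabilities, p < q
N = 200           # truncate the infinite chain at N states (truncation error is negligible)

# One-step transition probabilities for the single-clerk system:
# from state 0 an arrival (prob p) moves the chain up; from state i >= 1 the chain
# moves up on "arrival and no completion", down on "completion and no arrival".
P = np.zeros((N, N))
P[0, 1] = p
P[0, 0] = 1 - p
up, down = p * (1 - q), q * (1 - p)
for i in range(1, N - 1):
    P[i, i + 1] = up
    P[i, i - 1] = down
    P[i, i] = 1 - up - down
P[N - 1, N - 2] = down
P[N - 1, N - 1] = 1 - down

# Stationary vector of the truncated chain (left eigenvector for eigenvalue 1).
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()

# Closed-form expression reconstructed in the solution to 5-12.
r = p * (1 - q) / (q * (1 - p))
pi0 = (q - p) / q
closed = [pi0] + [pi0 * (p / (q * (1 - p))) * r ** (i - 1) for i in range(1, N)]

print(np.max(np.abs(pi - np.array(closed))))   # prints a value near machine precision
```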

5-13. A graduate student spends his time doing four things in the following sequence.

1. Eating (breakfast) - expected time is 20 minutes
2. Doing experimental research - expected time is 5 hours
3. Eating (lunch) - expected time is 20 minutes
4. Doing experimental research - expected time is 5 hours
5. Eating (dinner) - expected time is 20 minutes
6. Analyzing research results - expected time is 5 hours
7. Sleeping - expected time is 8 hours

The time spent in each activity is exponentially distributed. Graph a Markov chain for this student's activity and find the stationary probabilities.

[Markov chain sketch: the seven activities in a cycle, Eating -> Research -> Eating -> Research -> Eating -> Analysis -> Sleeping -> back to Eating, each state exited at a rate equal to the reciprocal of its expected duration: 1/5 per hour for research and analysis, 1/8 per hour for sleeping, and 3 per hour for eating.]

p^T = [ 1/72  5/24  1/72  5/24  1/72  5/24  1/3 ]

5-14. Let an M/M/c/c queue have c = 5 servers. How large is the load rho = lambda/mu which makes the probability of rejecting an arrival be .5?

[Both the target probability above and the numeric answer lost digits in this transcription; the answer fragment reads ".85".]

[Birth-death diagram of the M/M/c/c queue: states 0 through c, arrival rate lambda between successive states, and service rates mu, 2mu, ..., (c-1)mu, cmu on the return transitions.]

5-15. Find the limiting state distribution of the M/M/1/c queue that has one server and capacity c.

pi_i = (1 - rho) rho^i / (1 - rho^(c+1)),   i = 0, 1, ..., c

5-16. Everyone has had the experience of waiting in one of several lines for service and watching the other lines move faster. Sometimes the lines are combined into one line, and as each server completes a transaction the next person in line goes to that server. To keep this exercise as simple as possible, suppose there are two people selling basketball game tickets and that the time to sell a ticket is exponentially distributed with a service rate of mu people per minute and an arrival rate of lambda people per minute, lambda < cmu.

(a) Let there be two lines. Each person chooses a line at random and, once a person gets into a line, he must stay in that line. What is the expected number of people waiting in line, in terms of the load?

[Answer not preserved in this transcription.]

(b) Let there be one line, and as a ticket seller becomes available the person at the head of the line goes to that seller. What is the expected number of people waiting in line, in terms of the load?

[Answer not preserved in this transcription.]
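Returning to Exercise 5-14: the rejection probability of an M/M/c/c queue is given by the Erlang B formula, B(c, a) = (a^c/c!) / sum_{k=0..c} a^k/k!, and the load that achieves a given rejection probability can be found by a simple search. A Python sketch; since the target probability in the original did not survive, the 0.05 used here is only an assumed stand-in:

```python
from math import factorial

def erlang_b(c, a):
    """Blocking probability of an M/M/c/c queue with offered load a = lambda/mu."""
    terms = [a ** k / factorial(k) for k in range(c + 1)]
    return terms[-1] / sum(terms)

def load_for_blocking(c, target, lo=0.0, hi=100.0, iters=60):
    """Bisection for the load a at which erlang_b(c, a) equals the target."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if erlang_b(c, mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Example: c = 5 servers, assumed target rejection probability of 0.05.
print(load_for_blocking(5, 0.05))   # about 2.2
```

Bisection works here because the Erlang B blocking probability is strictly increasing in the offered load.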

5-17. A door-to-door saleslady is selling cosmetics. Her self-confidence depends on the recent history of her sales, being higher after making a sale and lower after losing a sale. When her self-confidence is higher, her probability of making a sale at the next house goes up, and when it is lower, her probability of making a sale at the next house goes down. Suppose she starts the day fresh with a probability of [missing] of making a sale. Every time she makes a sale, her probability of making a sale at the next house increases by .5, except that it never exceeds [missing], and every time she loses a sale, her probability of making a sale at the next house decreases by .5, except that it never falls below [missing]. Graph the Markov chain for this saleslady, with state 0 representing the initial state when she starts in the morning, negative state numbers representing lower selling probability, and positive state numbers representing higher selling probability. Over a long period of time (many days), what is the probability of making a sale at the seventh house of the day?

.9
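The 5-17 chain is tedious to solve by hand but easy to estimate by simulation. In the Python sketch below, the starting probability 0.4, the step 0.05, and the bounds 0.1 and 0.9 are assumed placeholder values, because the actual constants in the problem statement did not survive the transcription; the structure (reset each morning, move up after a sale, down after a lost sale, clipped at the bounds) follows the problem description:

```python
import random

# Assumed placeholder parameters (the originals were lost in transcription):
P_START, STEP, P_MIN, P_MAX = 0.4, 0.05, 0.1, 0.9

def sale_prob_at_house(house, days=200_000, seed=1):
    """Monte Carlo estimate of the probability of a sale at a given house of the day."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(days):
        p = P_START                      # she starts each day fresh
        for h in range(1, house + 1):
            sale = rng.random() < p
            if h == house and sale:
                hits += 1
            # confidence rises after a sale and falls after a lost sale, within bounds
            p = min(P_MAX, p + STEP) if sale else max(P_MIN, p - STEP)
    return hits / days

print(sale_prob_at_house(7))
```

Because each day starts fresh, the seven houses of one day are independent of every other day's, so averaging over many simulated days is a legitimate estimate of the long-run probability the exercise asks for.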