Fourth Problem Assignment




EECS 401 Due on Feb 2, 2007

PROBLEM 1 (25 points)

Joe and Helen each know that the a priori probability that Helen's mother will be home on any given night is 0.6. However, Helen can determine her mother's plan for the night at 6 P.M., and then, at 6:15 P.M., she has only one chance each evening to shout one of two code words across the river to Joe. He will visit her with probability 1.0 if he thinks Helen's message means "Ma will be away," and he will stay home with probability 1.0 if he thinks the message means "Ma will be home." But Helen has a meek voice, and the river is channeled for heavy barge traffic. Thus she is faced with the problem of coding for a noisy channel. She has decided to use a code containing only the code words A and B. The channel is described by

P(a|A) = 2/3,  P(a|B) = 1/4,  P(b|A) = 1/3,  P(b|B) = 3/4,

where a is the event that Joe thinks the message is A and b is the event that Joe thinks the message is B.

(a) In order to minimize the probability of error between transmitted and received messages, should Helen and Joe agree to use Code I or Code II?

Code I: A = Ma away, B = Ma home
Code II: A = Ma home, B = Ma away

Using Code I,

Pr(error) = Pr(A, b) + Pr(B, a) = Pr(A) P(b|A) + Pr(B) P(a|B) = 0.4 · (1/3) + 0.6 · (1/4) ≈ 0.2833.

Using Code II,

Pr(error) = Pr(A, b) + Pr(B, a) = Pr(A) P(b|A) + Pr(B) P(a|B) = 0.6 · (1/3) + 0.4 · (1/4) = 0.3.

Thus Joe and Helen should use Code I.

(b) Helen and Joe put the following cash values (in dollars) on all possible outcomes of a day:

Ma home and Joe comes: -30
Ma home and Joe doesn't come: 0
Ma away and Joe comes: +30
Ma away and Joe doesn't come: -5

Joe and Helen make their plans with the objective of maximizing the expected value of each day of their continuing romance. Which of the above codes will maximize the expected cash value per day of this romance?

Using Code I (A = Ma away, B = Ma home),

E[value] = Pr(A, a)(30) + Pr(A, b)(-5) + Pr(B, a)(-30) + Pr(B, b)(0)
         = 0.4 · (2/3) · 30 - 0.4 · (1/3) · 5 - 0.6 · (1/4) · 30 ≈ 2.833.

Using Code II (A = Ma home, B = Ma away),

E[value] = Pr(A, a)(0) + Pr(A, b)(-30) + Pr(B, a)(-5) + Pr(B, b)(30)
         = -0.6 · (1/3) · 30 - 0.4 · (1/4) · 5 + 0.4 · (3/4) · 30 = 2.5.

Thus, to maximize the expected cash value of their romance, Joe and Helen should use Code I.

(c) Clara isn't quite so attractive as Helen, but at least she lives on the same side of the river. What would be the lower limit of Clara's expected value per day which would make Joe decide to give up Helen?

Clara's expected value per day must be at least $2.83 for Joe to decide to give up Helen.

(d) What would be the maximum rate which Joe would pay the phone company for a noiseless wire to Helen's house which he could use once per day at 6:15 P.M.?

Suppose Helen uses the telephone line, and let M be the event that Ma is home. Then

E[value] = 0 · Pr(M) + 30 · Pr(M^c) = 0 · 0.6 + 30 · 0.4 = 12.

Thus Joe would be willing to pay up to $(12 - 2.83) = $9.17 per day.

(e) How much is it worth to Joe and Helen to double her mother's probability of being away from home? Would this be a better or worse investment than spending the same amount of money for a telephone line (to be used once a day at 6:15 P.M.) with the following probabilities?

P(a|A) = P(b|B) = 0.9,  P(b|A) = P(a|B) = 0.1

Suppose the probability of Ma being away from home is doubled to 0.8. Then, using Code I,

E[value] = Pr(A, a)(30) + Pr(A, b)(-5) + Pr(B, a)(-30) + Pr(B, b)(0)
         = 0.8 · (2/3) · 30 - 0.8 · (1/3) · 5 - 0.2 · (1/4) · 30 ≈ 13.17.

Using Code II,

E[value] = Pr(A, a)(0) + Pr(A, b)(-30) + Pr(B, a)(-5) + Pr(B, b)(30)
         = -0.2 · (1/3) · 30 - 0.8 · (1/4) · 5 + 0.8 · (3/4) · 30 = 15.

Thus, if Ma's probability of being away is doubled, Joe and Helen will use Code II, with an expected value of $15; they would therefore be willing to pay up to $(15 - 2.83) = $12.17 for the doubling. The proposed telephone line is symmetric, so both codes give the same expected value. Assuming A is the event that Ma is home,

E[value] = Pr(A, a)(0) + Pr(A, b)(-30) + Pr(B, a)(-5) + Pr(B, b)(30)
         = -0.6 · 0.1 · 30 - 0.4 · 0.1 · 5 + 0.4 · 0.9 · 30 = 8.8.

This is less than the $15 expected value obtained when Ma is away with probability 0.8, so spending the money on doubling Ma's probability of being away is the better investment.
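The arithmetic in parts (a), (b), and (e) can be verified with a short script. All names below (pr_error, expected_value, code_I, ...) are illustrative, not part of the original solution; exact rational arithmetic sidesteps rounding questions. Joe visits exactly when he decodes the word assigned to "Ma away."

```python
from fractions import Fraction as F

# Channel: keys are (transmitted word, word Joe hears).
CH = {('A', 'a'): F(2, 3), ('A', 'b'): F(1, 3),
      ('B', 'a'): F(1, 4), ('B', 'b'): F(3, 4)}
# Cash value of a day, keyed by (Ma's state, whether Joe visits).
PAYOFF = {('home', True): -30, ('home', False): 0,
          ('away', True): 30, ('away', False): -5}

def pr_error(p_away, code):
    """Probability that Joe decodes the wrong state of the world."""
    err = F(0)
    for state, p_s in (('away', p_away), ('home', 1 - p_away)):
        for heard in ('a', 'b'):
            decoded_away = (code['away'] == heard.upper())
            if decoded_away != (state == 'away'):
                err += p_s * CH[(code[state], heard)]
    return err

def expected_value(p_away, code):
    """Expected cash value of one day under a given code."""
    ev = F(0)
    for state, p_s in (('away', p_away), ('home', 1 - p_away)):
        for heard in ('a', 'b'):
            visits = (code['away'] == heard.upper())
            ev += p_s * CH[(code[state], heard)] * PAYOFF[(state, visits)]
    return ev

code_I = {'away': 'A', 'home': 'B'}
code_II = {'away': 'B', 'home': 'A'}
print(float(pr_error(F(2, 5), code_I)))         # part (a): 17/60 ≈ 0.2833
print(float(expected_value(F(2, 5), code_I)))   # part (b): 17/6 ≈ 2.833
print(float(expected_value(F(4, 5), code_II)))  # part (e): 15
```

Running it reproduces 17/60 and 3/10 for part (a), 17/6 and 5/2 for part (b), and 79/6 ≈ 13.17 and 15 for part (e).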

PROBLEM 2 (18 points) Hypothesis testing

May B. Lucky is a compulsive gambler who is convinced that on any given day she is either lucky, in which case she wins each red/black bet she makes in roulette with probability p_L > 0.5, or unlucky, in which case she wins each such bet with probability p_U < 0.5. May visits the casino every day, and believes that she knows the a priori probability that any one given visit is a lucky one (i.e., corresponds to p_L rather than p_U). To improve her chances, May adopts a system whereby she estimates on-line whether she is lucky or unlucky on a given day, by keeping a running count of the number of bets that she wins and loses. In particular, she continues to play until the conditional odds in favor of the event {lucky on the current day}, given the number of wins and losses so far, fall below a certain threshold. As soon as this happens, she stops playing. Provide a simple algorithm for updating May's conditional odds with each play. Note that if A and B are events with P(A) > 0 and P(B) > 0, the odds in favor of A given B are defined as

O(A|B) = P(A|B) / P(A^c|B).

Let A be the event that May is lucky on the current day and let B_{n,m} be the event that n wins and m losses have occurred so far. Assume independence of the results of different spins/plays, and let p be the a priori probability that she is lucky on the current day. Then we have

O(A|B_{n,m}) = Pr(A|B_{n,m}) / Pr(A^c|B_{n,m})
             = [Pr(A) Pr(B_{n,m}|A)] / [Pr(A^c) Pr(B_{n,m}|A^c)]
             = [p · C(n+m, n) p_L^n (1 - p_L)^m] / [(1 - p) · C(n+m, n) p_U^n (1 - p_U)^m]
             = [p / (1 - p)] · (p_L / p_U)^n · [(1 - p_L) / (1 - p_U)]^m.

From this formula a recursive algorithm can be obtained. Let q(n + m) be the odds after n + m games. Then

q(n + m + 1) = q(n + m) · (p_L / p_U),                 if she wins the next play,
q(n + m + 1) = q(n + m) · [(1 - p_L) / (1 - p_U)],     if she loses the next play.

The initial condition is O(A|B_{0,0}), which equals the initial (unconditional) odds O(A) = p / (1 - p), which May knows by assumption.
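The recursion is short enough to sketch directly; the names below (update_odds, odds, p_L, p_U) and the example values are assumed for illustration, not part of the original solution.

```python
def update_odds(odds, won, p_L, p_U):
    """Multiply the current odds in favor of 'lucky' by one play's likelihood ratio."""
    return odds * (p_L / p_U) if won else odds * ((1 - p_L) / (1 - p_U))

# Example: a priori p = 0.5, p_L = 0.6, p_U = 0.4.
p, p_L, p_U = 0.5, 0.6, 0.4
odds = p / (1 - p)  # initial condition O(A | B_{0,0}) = O(A)
for won in [True, False, False, False]:  # one win, then three losses
    odds = update_odds(odds, won, p_L, p_U)
print(odds)  # (3/2) * (2/3)^3 = 4/9; May stops once this drops below her threshold
```

Since each play multiplies the odds by a fixed factor, May needs only one multiplication per spin; no history beyond the current odds has to be stored.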
PROBLEM 3 (12 points)

Fischer and Spassky play a sudden-death chess match whereby the first player to win a game wins the match. Each game is won by Fischer with probability p, won by Spassky with probability q, and drawn with probability 1 - p - q.

(a) What is the probability that Fischer wins the match?

Let E be the event that Fischer wins the match. We can express E as E = ∪_{n≥0} E_n, where E_n is the event that each of the first n games is a draw and the (n + 1)st game is won by Fischer. Since the E_n are disjoint, we have

Pr(E) = Σ_{n≥0} Pr(E_n) = Σ_{n≥0} (1 - p - q)^n p = p / (p + q).

(b) What is the PMF, the mean, and the variance of the duration of the match?

The duration D of the match is a geometric random variable with parameter p + q, so its PMF is

p_D(d) = (1 - p - q)^{d-1} (p + q),  d = 1, 2, 3, ...,

and we obtain

E[D] = 1 / (p + q),  var(D) = (1 - p - q) / (p + q)^2.
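As a sanity check (the values p = 0.3, q = 0.2 are assumed example values), the truncated series can be compared against the closed forms:

```python
# Truncate the series sum_{n>=0} (1-p-q)^n * p and compare with p/(p+q);
# likewise check E[D] for the geometric duration with parameter p + q.
p, q = 0.3, 0.2
series = sum((1 - p - q) ** n * p for n in range(200))
closed = p / (p + q)
mean_D = sum(d * (1 - p - q) ** (d - 1) * (p + q) for d in range(1, 200))
print(series, closed, mean_D)  # series ≈ closed = 0.6, mean_D ≈ 1/(p+q) = 2
```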

PROBLEM 4 (24 points)

When you push the SEND button on your cell phone, the phone attempts to set up a call by transmitting a SETUP message to a nearby base station. The phone waits for a response, and if none arrives within 0.5 seconds it tries again. If it doesn't get a response after N tries, the phone stops transmitting and generates a busy signal. Assume that all transmissions are independent and that with probability p the SETUP message will get through. Also assume that if the SETUP message gets through, the response from the base station is always correctly received by the cell phone within 0.5 seconds.

(a) What is the PMF of X, the number of times the SETUP message is transmitted in a call attempt?

X can take the values 1, 2, ..., N. The PMF is given by

p_X(x) = p(1 - p)^{x-1},   x = 1, 2, ..., N - 1,
p_X(N) = (1 - p)^{N-1},
p_X(x) = 0                 otherwise.

(b) What is the probability that the call will generate a busy signal?

We get a busy signal if all N attempts are unsuccessful. Thus Pr(BUSY) = (1 - p)^N.

(c) Assuming that there is no limit on the number of tries, i.e., your phone will keep transmitting the SETUP message indefinitely until it gets through, what is the PMF of X, the number of transmissions in a call attempt?

If the number of tries is unlimited, the PMF is given by

p_X(x) = p(1 - p)^{x-1},   x = 1, 2, 3, ...,

and p_X(x) = 0 otherwise.

(d) Following the previous part, what is the expected number of transmissions of the SETUP message in a call attempt?

This is the geometric distribution with parameter p. Thus E[X] = 1/p.
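The part (a) PMF and part (b) busy probability can be checked for normalization; the values p = 0.25 and N = 5, like the function name, are assumed for illustration.

```python
def pmf_setup(x, p, N):
    """P(X = x): the x-th transmission is the last one in the call attempt."""
    if 1 <= x < N:
        return p * (1 - p) ** (x - 1)  # success on try x < N
    if x == N:
        return (1 - p) ** (N - 1)      # reached the final allowed try
    return 0.0

p, N = 0.25, 5
total = sum(pmf_setup(x, p, N) for x in range(1, N + 1))
busy = (1 - p) ** N
print(total, busy)  # total = 1.0; busy = 0.75**5 ≈ 0.2373
```

Note that the mass at x = N is (1 - p)^{N-1}, not p(1 - p)^{N-1}: the N-th try is the last transmission whether or not it succeeds, which is exactly what makes the PMF sum to 1.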

PROBLEM 5 (12 points)

Let X be a random variable with PMF p_X.

(a) Suppose g is a one-to-one function, i.e., g(x) ≠ g(y) if x ≠ y. Show that

E[g(X)] = Σ_x g(x) p_X(x).   (1)

Assume X takes n values x_1, x_2, ..., x_n. Define the random variable Y = g(X), which also takes n distinct values y_1 = g(x_1), y_2 = g(x_2), ..., y_n = g(x_n). By definition,

E[Y] = Σ_{i=1}^n y_i p_Y(y_i).   (2)

But {Y = y_i} = {X = x_i}, since g is a one-to-one mapping; thus

p_Y(y_i) = Pr({Y = y_i}) = Pr({X = x_i}) = p_X(x_i).

Plugging this result into (2), we have

E[Y] = Σ_{i=1}^n y_i p_Y(y_i) = Σ_{i=1}^n g(x_i) p_X(x_i).

(b) Suppose now that g is general and can be many-to-one. Show that (1) still holds.

In this case, {Y = y_l} = ∪_{k=1}^{m_l} {X = x_{kl}}, where the events {X = x_{kl}}, k = 1, 2, ..., m_l, are disjoint and g(x_{kl}) = y_l for k = 1, 2, ..., m_l. Thus

p_Y(y_l) = Pr({Y = y_l}) = Σ_{k=1}^{m_l} Pr({X = x_{kl}}) = Σ_{k=1}^{m_l} p_X(x_{kl}).

Substituting this in (2) above, we have

E[Y] = Σ_l y_l p_Y(y_l) = Σ_l Σ_{k=1}^{m_l} g(x_{kl}) p_X(x_{kl}).

Since the x_{kl} are distinct over all pairs (l, k), the double summation just enumerates all the x_i, i = 1, 2, ..., n, and therefore we have

E[Y] = Σ_{i=1}^n g(x_i) p_X(x_i).
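The two ways of computing the expectation can be compared numerically for a many-to-one g; the PMF below is an assumed example (g(x) = x², with a symmetric X so that g really is many-to-one).

```python
p_X = {-3: 0.125, -1: 0.375, 1: 0.375, 3: 0.125}  # example symmetric PMF
g = lambda x: x * x                               # g(-x) = g(x): many-to-one

# PMF of Y = g(X): lump together the masses of all x mapping to the same y
p_Y = {}
for x, px in p_X.items():
    p_Y[g(x)] = p_Y.get(g(x), 0.0) + px

E_via_Y = sum(y * py for y, py in p_Y.items())     # definition of E[Y]
E_via_X = sum(g(x) * px for x, px in p_X.items())  # formula (1)
print(E_via_Y, E_via_X)  # both equal 3.0
```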

PROBLEM 6 (9 points)

The annual premium of a special kind of insurance starts at $1000 and is reduced by 10% after each year in which no claim has been filed. The probability that a claim is filed in a given year is 0.05, independently of preceding years. What is the PMF of the total premium paid up to and including the year when the first claim is filed?

A claim is filed for the first time in year n with probability 0.05 · (0.95)^{n-1}, and the corresponding total premium is

1000 · (1 + 0.9 + ... + (0.9)^{n-1}) = 1000 · [1 - (0.9)^n] / (1 - 0.9) = 10000 · (1 - (0.9)^n).

Thus the PMF of X, the total premium paid up to and including the year when the first claim is filed, is

p_X(x) = 0.05 · (0.95)^{n-1},   if x = 10000 · (1 - (0.9)^n) for some n = 1, 2, ...,
p_X(x) = 0,                     otherwise.
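Both the geometric-series identity for the premium total and the normalization of the PMF can be checked numerically; the helper names are assumed, and the infinite sum is truncated for the check.

```python
def total_premium(n):
    """Total premium if the first claim is filed in year n (10% reduction per claim-free year)."""
    return 1000 * sum(0.9 ** k for k in range(n))

def p_first_claim(n):
    """First claim in year n: n - 1 claim-free years, then a claim."""
    return 0.05 * 0.95 ** (n - 1)

closed = 10000 * (1 - 0.9 ** 7)                    # closed form, here for n = 7
mass = sum(p_first_claim(n) for n in range(1, 500))
print(total_premium(7), closed, mass)              # identity holds; total mass ≈ 1
```

Note the two different bases: the premium schedule shrinks by the factor 0.9, while the waiting time for the first claim is geometric with parameter 0.05 (base 0.95), which is why both appear in the PMF.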