Gambling with Information Theory
Govert Verkes, University of Amsterdam
January 27, 2016
How do you bet?
A private noisy channel transmits race results while you can still bet. Each result is transmitted correctly with probability p or in error with probability q, where p > q.
[Diagram: binary channel with crossover; correct transmission with probability p, error with probability q]
Overview
- Kelly gambling
- Horse races
- Value of side information
- Entropy rate of stochastic processes
- Dependent horse races
John L. Kelly
- John Larry Kelly, Jr. (1923-1965)
- PhD in Physics, Bell Labs
- Connections: Shannon (Las Vegas), Warren Buffett (investor)
Gambler with private wire
A communication channel transmits the results. Over a noiseless channel the gambler can double his capital on every bet:
    V_N = 2^N V_0
where V_N is the capital after N bets and V_0 the starting capital.
Exponential rate of growth:
    G = lim_{N→∞} (1/N) log(V_N / V_0)
Gambler with noisy private wire
Exponential rate of growth:
    G = lim_{N→∞} (1/N) log(V_N / V_0)
How would you bet on the received result?
- p: probability of correct transmission
- q: probability of error in transmission
- l: fraction of the gambler's capital that he bets
After W wins and L losses out of N bets:
    V_N = (1 + l)^W (1 - l)^L V_0
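The optimal fraction l can be found numerically. A minimal sketch, with illustrative values p = 0.6 and q = 0.4 (assumptions, not from the slides): the expected growth rate G(l) = p log2(1+l) + q log2(1-l) is maximized at Kelly's fraction l* = p - q, where it equals 1 - H(p).

```python
import numpy as np

p, q = 0.6, 0.4  # assumed example probabilities, with p > q

def growth_rate(l):
    """Expected growth rate, in bits per bet, when betting a fraction l."""
    return p * np.log2(1 + l) + q * np.log2(1 - l)

# Grid search over betting fractions in [0, 1)
fractions = np.linspace(0.0, 0.99, 1000)
rates = [growth_rate(l) for l in fractions]
l_star = fractions[int(np.argmax(rates))]

print(l_star)                 # close to p - q = 0.2
print(growth_rate(p - q))     # equals 1 - H(p) ≈ 0.029 bits per bet
```

Betting more than l* = p - q buys extra variance at the cost of long-run growth; betting l = 1 bankrupts the gambler at the first transmission error.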
Horse races
Wealth relative:
    S(X) = b(X) o(X)
- b(i): fraction of the gambler's wealth bet on horse i
- o(i): o(i)-for-1 odds on horse i
- m: number of horses
Wealth after n races (as a fraction of starting wealth):
    S_n = ∏_{i=1}^n S(X_i)
Horse races: doubling rate
Doubling rate:
    W(b, p) = E[log S(X)] = Σ_{i=1}^m p_i log(b_i o_i)
where p_i is the probability that horse i wins.
Justification:
    (1/n) log S_n = (1/n) Σ_{i=1}^n log S(X_i) → E[log S(X)]   (LLN)
so that
    S_n ≐ 2^{n W(b, p)}
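The LLN argument can be checked by simulation. A sketch, assuming an illustrative race with win probabilities p = (0.5, 0.3, 0.2) and odds (2, 4, 8) (both assumptions for illustration): the empirical per-race log growth converges to W(b, p).

```python
import numpy as np

rng = np.random.default_rng(0)

p = np.array([0.5, 0.3, 0.2])   # assumed win probabilities
o = np.array([2.0, 4.0, 8.0])   # assumed odds, o(i)-for-1
b = p                           # proportional betting

# Theoretical doubling rate W(b, p) = sum_i p_i log2(b_i * o_i)
W = float(np.sum(p * np.log2(b * o)))

# Simulate n races and compute (1/n) log2 S_n
n = 200_000
winners = rng.choice(3, size=n, p=p)
log_growth = float(np.log2(b[winners] * o[winners]).mean())

print(W, log_growth)   # the two values agree to a few decimal places
```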
Horse races: doubling rate
Maximize the doubling rate over portfolios:
    W*(p) = max_{b: Σ b_i = 1} W(b, p) = max_{b: Σ b_i = 1} Σ_{i=1}^m p_i log(b_i o_i)
The maximum is attained by proportional betting, b* = p, giving
    W*(p) = Σ_i p_i log o_i - H(p)
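A minimal numerical check that proportional betting is optimal, assuming an illustrative race (p and the uniform 3-for-1 odds below are assumptions): no randomly drawn portfolio on the simplex should beat b* = p, and W*(p) should match Σ p_i log o_i - H(p).

```python
import numpy as np

rng = np.random.default_rng(1)

p = np.array([0.5, 0.3, 0.2])  # assumed win probabilities
o = np.array([3.0, 3.0, 3.0])  # uniform 3-for-1 odds

def W(b):
    """Doubling rate W(b, p) in bits per race."""
    return float(np.sum(p * np.log2(b * o)))

W_star = W(p)  # proportional betting b* = p

# Closed form: W*(p) = sum_i p_i log2 o_i - H(p)
H = -float(np.sum(p * np.log2(p)))
assert abs(W_star - (float(np.sum(p * np.log2(o))) - H)) < 1e-12

# No randomly drawn portfolio beats b* = p
for _ in range(10_000):
    b = rng.dirichlet(np.ones(3))
    assert W(b) <= W_star + 1e-12

print(W_star)
```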
Example (CT 6.1.1)
Three horses with 3-for-1 odds; how would you bet?
    p_1 = 1/2, p_2 = p_3 = 1/4
    o_1 = o_2 = o_3 = 3
Optimal doubling rate:
    W*(p) = Σ_i p_i log o_i - H(p) = log 3 - H(1/2, 1/4, 1/4) ≈ 0.085
so
    S_n ≐ 2^{0.085 n} ≈ (1.06)^n
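The example's numbers can be reproduced in a few lines:

```python
import math

p = [0.5, 0.25, 0.25]   # win probabilities from the example
o = 3.0                 # 3-for-1 odds on every horse

H = -sum(pi * math.log2(pi) for pi in p)   # H(1/2, 1/4, 1/4) = 1.5 bits
W_star = math.log2(o) - H                  # ≈ 0.085 bits per race
growth_factor = 2 ** W_star                # ≈ 1.06, wealth growth per race

print(H, W_star, growth_factor)
```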
Example (CT 6.1.1)
Odds are fair with respect to some distribution when
    Σ_i 1/o_i = 1,   r_i = 1/o_i
Then the doubling rate can be written as
    W(b, p) = Σ_i p_i log( (b_i/p_i) (p_i/r_i) ) = D(p || r) - D(p || b)
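The divergence identity is easy to verify numerically. A sketch, assuming illustrative fair odds (2, 4, 4) (so Σ 1/o_i = 1) and an arbitrary portfolio b; both are assumptions for illustration:

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])   # assumed win probabilities
o = np.array([2.0, 4.0, 4.0])   # fair odds: sum(1/o) = 1
r = 1.0 / o                     # distribution the odds are fair against

def D(a, c):
    """Relative entropy D(a || c) in bits."""
    return float(np.sum(a * np.log2(a / c)))

b = np.array([0.4, 0.4, 0.2])   # some portfolio
W = float(np.sum(p * np.log2(b * o)))

# W(b, p) = D(p || r) - D(p || b)
assert abs(W - (D(p, r) - D(p, b))) < 1e-12
print(W)
```

The identity makes the optimum transparent: D(p || b) ≥ 0 with equality iff b = p, so proportional betting maximizes W, and the gambler profits exactly when the bookmaker's estimate r is further from the truth than his own.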
Gambling with side information
We have side information Y. Conditional doubling rate:
    W*(X) = Σ_i p_i log o_i - H(X)
    W*(X|Y) = max_{b(x|y)} Σ_{x,y} p(x, y) log( b(x|y) o(x) ) = Σ_i p_i log o_i - H(X|Y)
Increase in doubling rate:
    ΔW = W*(X|Y) - W*(X) = H(X) - H(X|Y) = I(X; Y)
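A small numerical check that the gain equals the mutual information, assuming an illustrative joint distribution of outcome X and tip Y (the matrix below is an assumption) and uniform fair odds:

```python
import numpy as np

# Assumed joint distribution: rows are outcomes x, columns are tips y
P = np.array([[0.30, 0.10],
              [0.05, 0.25],
              [0.15, 0.15]])
o = np.array([3.0, 3.0, 3.0])   # uniform 3-for-1 odds

px = P.sum(axis=1)
py = P.sum(axis=0)

def H(dist):
    dist = dist[dist > 0]
    return -float(np.sum(dist * np.log2(dist)))

H_X = H(px)
# H(X|Y) = sum_y p(y) H(X | Y = y)
H_XgY = sum(py[j] * H(P[:, j] / py[j]) for j in range(P.shape[1]))

odds_term = float(np.sum(px * np.log2(o)))
W_X = odds_term - H_X           # doubling rate without the tip
W_XgY = odds_term - H_XgY       # doubling rate with the tip

I_XY = H_X - H_XgY
assert abs((W_XgY - W_X) - I_XY) < 1e-12
print(I_XY)   # increase in doubling rate, in bits per race
```

This is the punchline of Kelly's paper: side information is worth exactly its mutual information with the outcome, in bits of doubling rate.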
Stochastic processes
A sequence of random variables {X_t}, t ∈ T; for a discrete process, T = N:
    Pr(X_1, X_2, ..., X_n)
The index t is usually interpreted as time. The variables may have arbitrary dependence:
    Pr(X_{n+1} | X_1, X_2, ..., X_n)
Stochastic processes: properties
Markov:
    Pr(X_{n+1} | X_1, X_2, ..., X_n) = Pr(X_{n+1} | X_n)
Stationary:
    Pr(X_1 = x_1, ..., X_n = x_n) = Pr(X_{1+t} = x_1, ..., X_{n+t} = x_n)
Example: simple random walk
    Y_i = +1 with probability 1/2, -1 with probability 1/2
    X_n = Σ_{i=1}^n Y_i
Stationary? Markov?
Entropy rate
Definition:
    H(X) = lim_{n→∞} (1/n) H(X_1, X_2, ..., X_n)
Examples
X_i i.i.d.:
    H(X_1, X_2, ..., X_n) = n H(X_1), so H(X) = H(X_1)
X_i independent but not identically distributed:
    H(X_1, X_2, ..., X_n) = Σ_{i=1}^n H(X_i)
Entropy rate: a related quantity
Definitions:
    H(X) = lim_{n→∞} (1/n) H(X_1, X_2, ..., X_n)
    H'(X) = lim_{n→∞} H(X_n | X_1, X_2, ..., X_{n-1})
For stationary processes:
    H(X) = H'(X)
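For a stationary Markov chain both limits collapse to H(X_2 | X_1). A sketch computing this for an assumed 2-state transition matrix (the matrix is illustrative):

```python
import numpy as np

# Assumed transition matrix: P[i, j] = Pr(X_{n+1} = j | X_n = i)
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Stationary distribution mu solves mu P = mu (left eigenvector for eigenvalue 1)
evals, evecs = np.linalg.eig(P.T)
mu = np.real(evecs[:, np.argmax(np.real(evals))])
mu = mu / mu.sum()

# Entropy rate H(X) = H'(X) = H(X_2 | X_1) = -sum_i mu_i sum_j P_ij log2 P_ij
H_rate = -float(np.sum(mu[:, None] * P * np.log2(P)))
print(mu, H_rate)
```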
Dependent horse races
The horse race depends on the past performance of the horses. Let {X_n} be the sequence of race outcomes. With uniform m-for-1 odds, betting proportionally on the conditional distribution gives
    W*(X_n | X_{n-1}, X_{n-2}, ..., X_1) = log m - H(X_n | X_{n-1}, X_{n-2}, ..., X_1)
and in the limit
    W* = log m - H(X),   S_n ≐ 2^{n(log m - H(X))}
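A simulation sketch, assuming a two-horse race whose winner follows the 2-state Markov chain below (an illustrative assumption) with uniform 2-for-1 odds: betting b(x_n | x_{n-1}) = Pr(X_n = x_n | X_{n-1} = x_{n-1}) should grow wealth at roughly log m - H(X) bits per race.

```python
import numpy as np

rng = np.random.default_rng(2)

m = 2                              # two horses, uniform m-for-1 odds
P = np.array([[0.9, 0.1],          # assumed Pr(X_n = j | X_{n-1} = i)
              [0.4, 0.6]])
mu = np.array([0.8, 0.2])          # stationary distribution of P

# Entropy rate H(X) = H(X_n | X_{n-1}) for the stationary chain
H_rate = -float(np.sum(mu[:, None] * P * np.log2(P)))

# Bet proportionally on the conditional distribution; wealth relative is b * m
n = 200_000
x = int(rng.choice(m, p=mu))
log_growth = 0.0
for _ in range(n):
    x_next = int(rng.choice(m, p=P[x]))
    log_growth += float(np.log2(P[x, x_next] * m))
    x = x_next
log_growth /= n

print(np.log2(m) - H_rate, log_growth)   # both near 0.43 bits per race
```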
Stock market
- X = (X_1, X_2, ..., X_n): stock vector (price relatives)
- b = (b_1, b_2, ..., b_n): investment vector (portfolio)
- S = b^T X: wealth factor after one day
- X ~ F(x): joint distribution of the price vector
Doubling rate:
    W(b, F) = ∫ log(b^T x) dF(x)
    W*(F) = max_b W(b, F)
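For a discrete distribution the log-optimal portfolio can be found by direct search. A sketch with an assumed two-asset market (one risky stock that doubles with probability 0.6 or halves with probability 0.4, and cash; all numbers illustrative):

```python
import numpy as np

# Assumed price-relative outcomes (rows) for (stock, cash), with probabilities
X = np.array([[2.0, 1.0],
              [0.5, 1.0]])
probs = np.array([0.6, 0.4])

def W(b1):
    """Doubling rate W(b, F) = E[log2 b.T X] for portfolio b = (b1, 1 - b1)."""
    port = np.array([b1, 1.0 - b1])
    return float(np.sum(probs * np.log2(X @ port)))

# Grid search for the log-optimal portfolio
grid = np.linspace(0.0, 1.0, 10_001)
vals = np.array([W(b1) for b1 in grid])
b_star = grid[int(np.argmax(vals))]

print(b_star, vals.max())   # optimum puts 0.8 of wealth in the stock
```

Note that going all-in on the stock (b1 = 1) gives a lower doubling rate than the mixed portfolio, even though the stock has the higher expected return; this is the horse-race lesson carried over to investing.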
Conclusion
- The optimal betting strategy is not always the one with the highest expected value
- Proportional betting is the way to go for fair odds
- The stock market is interesting too ([CT] chapter 15)
References
- Thomas M. Cover, Joy A. Thomas, "Elements of Information Theory"
- J. L. Kelly, Jr., "A New Interpretation of Information Rate"
Questions?