Economics 1011a: Intermediate Microeconomics
Lecture 12: More on Uncertainty
Thursday, October 23, 2008

Last class we introduced choice under uncertainty. Today we will explore this topic a bit more formally. We will introduce a crucial concept: risk aversion.

Some More Formal Notation
Assume there are s states of the world. The probability of state 1 is p_1, the probability of state 2 is p_2, etc. We sometimes represent these probabilities as a vector: $p = (p_1, p_2, \dots, p_s)$.

Lotteries
Any risk that you take in this world can be represented as a lottery. A lottery is defined by its payoff in each state. For example, it might have payoff x_1 in state 1, x_2 in state 2, etc. Hence we can also represent this lottery as a vector: $x = (x_1, x_2, \dots, x_s)$.
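To make the notation concrete, here is a simple illustrative example (my own, not from the original slides): a fair coin flip in which you give up $10 on heads and receive $10 on tails has two states, so
\[
p = \left(\tfrac{1}{2}, \tfrac{1}{2}\right), \qquad x = (-10, +10).
\]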
Expected Value
The expected value of this lottery is its mean: the amount that it pays you on average. This is just its payoff in each state times the probability of that state occurring:
$p_1 x_1 + p_2 x_2 + \cdots + p_s x_s$.
We use the expectations operator E to denote this sum:
$E[x] = \sum_i p_i x_i$.

Expected Utility (I)
The expected utility of this lottery is the amount of utility that it gives you on average:
$p_1 u(y + x_1) + p_2 u(y + x_2) + \cdots + p_s u(y + x_s)$.
Again we use the expectations operator:
$E[u(y + x)]$.

Expected Utility (II)
Note that the expected utility of a lottery depends on your initial wealth y. This means that rich and poor people may judge the same risk differently.

Risk Aversion (I)
Suppose that someone offers you a lottery with an expected value of zero. This is called a fair bet. For example, I'll flip a coin: heads you give me $10, tails I give you $10. Would you take this bet?
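A minimal numerical sketch of these definitions (not part of the original lecture), assuming a square-root utility function and an initial wealth of $100; the lottery is the $10 coin flip just described:

```python
import math

p = [0.5, 0.5]          # probabilities of the two states (heads, tails)
x = [-10.0, 10.0]       # payoffs: heads you pay $10, tails you receive $10
y = 100.0               # initial wealth (an assumption for illustration)

def u(w):
    """Assumed utility function: u(w) = sqrt(w), which is concave."""
    return math.sqrt(w)

expected_value = sum(pi * xi for pi, xi in zip(p, x))          # E[x]
expected_utility = sum(pi * u(y + xi) for pi, xi in zip(p, x)) # E[u(y + x)]

print(expected_value)    # 0.0  -> a fair bet
print(expected_utility)  # about 9.9875
print(u(y))              # 10.0 -> refusing the bet gives higher utility
```

Even though the bet is fair in dollars, expected utility falls. The next slides ask exactly this comparison in general terms.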
Risk Aversion (II)
What is the expected utility of this bet?
$E[u] = \tfrac{1}{2} u(y + 10) + \tfrac{1}{2} u(y - 10)$.
How does this compare to not taking the bet, which gives $u(y)$?

Risk Aversion and Concavity (I)
Because of concavity, the utility gain when you win is smaller than the utility loss when you lose.
[Figure: a concave utility function u(y), with wealth levels y − 10, y, and y + 10 marked on the horizontal axis.]

Risk Aversion and Concavity (II)
Hence by taking the bet you lose utility in expectation, even though in dollars you don't.
[Figure: the same concave utility function u(y), with y − 10, y, and y + 10 marked.]

Jensen's Inequality (I)
There is nothing special about this bet. In fact, Jensen's Inequality says that u(y) is strictly concave if and only if you would turn down every fair bet:
$E[u(y + x)] < u(y)$ for every lottery x with $E[x] = 0$ (and x not identically zero).
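One way to see why concavity makes fair bets unattractive (a sketch, not from the original slides): take a second-order Taylor approximation of u around y for a small fair bet ε with $E[\varepsilon] = 0$ and variance $\sigma^2$:
\[
E[u(y + \varepsilon)] \approx E\!\left[u(y) + \varepsilon u'(y) + \tfrac{1}{2}\varepsilon^2 u''(y)\right]
= u(y) + \tfrac{1}{2}\sigma^2 u''(y) < u(y),
\]
since $u''(y) < 0$ for a strictly concave u. This also previews why the curvature of u, introduced later, is the natural measure of risk aversion.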
Jensen's Inequality (II)
We can also write Jensen's Inequality as follows:
$E[u(y + x)] < u(y + E[x])$ for any non-degenerate lottery x.
This means you would prefer that someone give you the expected value of any bet up front, rather than actually take the bet. Note that it does not matter whether the bet has positive or negative expected value.

Certainty Equivalence (I)
What if someone offers you a choice:
1) You can play a lottery x, or
2) You can accept (or pay) a certain fixed amount of money c.
Which do you take?

Certainty Equivalence (II)
Just compare the expected utilities of the two offers. Remember that with the second there is no risk, so its expected utility is simply $u(y + c)$. For some value of c, these two are equal.

Certainty Equivalence (III)
This value c is called the certainty equivalent of the lottery. Note that c depends on the lottery x and on y. Formally, the certainty equivalent $c(x, y)$ is defined by:
$u(y + c(x, y)) = E[u(y + x)]$.
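Continuing the earlier numerical sketch (same assumed square-root utility and $100 wealth, my own illustration), the certainty equivalent of the $10 coin flip can be computed by inverting u:

```python
import math

p = [0.5, 0.5]
x = [-10.0, 10.0]
y = 100.0

def u(w):
    # assumed concave utility for illustration
    return math.sqrt(w)

def u_inverse(v):
    # inverse of the assumed utility: u(w) = sqrt(w)  =>  w = v**2
    return v ** 2

expected_utility = sum(pi * u(y + xi) for pi, xi in zip(p, x))

# the certainty equivalent c solves u(y + c) = E[u(y + x)]
c = u_inverse(expected_utility) - y
print(c)   # about -0.25: you would pay roughly 25 cents to avoid this fair bet
```

The negative value matches the claim on the next slide: for a risk-averse person, the certainty equivalent of a fair bet is negative.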
Certainty Equivalence & Fair Bets
The certainty equivalent of a fair bet is always negative: you will pay to avoid it.
[Figure: a concave utility function u(y), with y − 10, y, and y + 10 marked on the horizontal axis.]

Wealth and Risk (I)
We keep mentioning that your attitude towards risk depends on how wealthy you are. But how does it depend on this? What might some reasonable assumptions be?

A Bet to Consider
Suppose I offer you the following bet. We flip a fair coin:
Heads, I give you $1,000,000.
Tails, you give me $500,000.
Would you take this bet? Would Bill Gates take this bet?

Risk Aversion
It seems sensible that the rich will take bigger bets than the poor. This means that their risk aversion is decreasing in wealth. Formally, you can define risk aversion as the amount you would pay to avoid small risks. It also turns out to be the curvature of the utility function.
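As a rough illustration of the wealth effect (my own sketch, which assumes logarithmic utility and two particular wealth levels), the $1,000,000 / $500,000 coin flip is rejected at a moderate wealth level but accepted at a very high one:

```python
import math

def expected_log_utility(wealth, win=1_000_000, lose=500_000):
    """Expected log utility of the 50/50 bet: win $1,000,000 or lose $500,000."""
    return 0.5 * math.log(wealth + win) + 0.5 * math.log(wealth - lose)

for wealth in (600_000, 50_000_000_000):   # a modest saver vs. a Bill Gates level of wealth
    take = expected_log_utility(wealth) > math.log(wealth)
    print(wealth, "take the bet" if take else "refuse the bet")
# prints: 600000 refuse the bet / 50000000000 take the bet
```

Log utility has decreasing absolute risk aversion, which is why the decision flips as wealth grows; other utility functions need not behave this way.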
Measuring Risk Aversion (I)
The coefficient of absolute risk aversion (named by John Pratt) is defined as
$A(y) = -\dfrac{u''(y)}{u'(y)}$.
This measures how much you would pay to avoid bets that are small in dollars.

Measuring Risk Aversion (II)
The coefficient of relative risk aversion is defined as
$R(y) = -\dfrac{y\,u''(y)}{u'(y)}$.
This measures how much you would pay to avoid bets that are small as a percentage of your wealth. Both of these ideas are used extensively in finance.

Constant Absolute Risk Aversion
What if you have constant absolute risk aversion (CARA), so that $A(y) = \alpha$ for all y? Solving this differential equation, we have:
$u(y) = -e^{-\alpha y}$.
Note that u is bounded from above by 0. Does CARA seem reasonable?

Constant Relative Risk Aversion
What if you have constant relative risk aversion (CRRA), so that $R(y) = \rho$ for all y? Solving this differential equation, we have:
$u(y) = \dfrac{y^{1-\rho}}{1-\rho}$,
or, for $\rho = 1$,
$u(y) = \ln(y)$.
Does CRRA seem more reasonable?
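The slides state these solutions without the intermediate steps; here is a brief sketch of the CARA case (the CRRA case is analogous). Treat the defining condition as a differential equation in $u'(y)$:
\[
-\frac{u''(y)}{u'(y)} = \alpha
\;\Longrightarrow\;
\frac{d}{dy}\ln u'(y) = -\alpha
\;\Longrightarrow\;
u'(y) = C e^{-\alpha y}
\;\Longrightarrow\;
u(y) = -\frac{C}{\alpha} e^{-\alpha y} + D,
\]
for constants $C > 0$ and D. Since expected-utility rankings are unchanged by positive affine transformations, we can take $C = \alpha$ and $D = 0$, giving the $u(y) = -e^{-\alpha y}$ form above. (For CRRA, the same steps applied to $-y\,u''/u' = \rho$ give $u'(y) = C y^{-\rho}$, hence $u(y) = y^{1-\rho}/(1-\rho)$, or $\ln y$ when $\rho = 1$.)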
Log Utility
A special case of CRRA (the $\rho = 1$ case above).

Quadratic Utility
Very easy to work with. What is strange about quadratic utility?

Linear Utility
With linear utility you are risk neutral. You will take any bet with positive expected value.

Firms
As an aside, remember that we generally assume firms only care about profits. This means firms' utility is linear in income. So when a firm faces a risk, it only cares about the expected value of the risk. This means that a firm will take on any risk as long as it has positive expected value. Is this reasonable?
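A one-line check of the risk-neutrality claim (a sketch, taking the linear form $u(y) = a + by$ with $b > 0$ as an assumed parameterization):
\[
E[u(y + x)] = E[a + b(y + x)] = a + b\,(y + E[x]),
\]
which depends on the lottery only through $E[x]$. So a decision maker with linear utility, such as a profit-maximizing firm, ranks risks purely by expected value and accepts any bet with $E[x] > 0$.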
Should You Take Good Bets?
Suppose someone offers you a bet with positive expected value. Will you take it? This depends on the size of the bet. Even if $E[x] > 0$, it could easily be that $E[u(y + x)] < u(y)$.

What About Small Good Bets?
However, what if you could take as small or as large a piece of the bet as you wanted? In other words, you could take the bet $kx$ for any scalar k. Now will you bet (i.e. choose k > 0)?

How Big A Bet To Take? (I)
Now you are not choosing between the two binary options $u(y)$ and $E[u(y + x)]$. You are maximizing, for $k \geq 0$:
$v(k) = E[u(y + kx)]$.

How Big A Bet To Take? (II)
Differentiate with respect to k:
$v'(k) = E[x\,u'(y + kx)]$.
I claim k = 0 is not a maximum.
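Before finishing the argument on the next slide, here is a small numerical sketch (my own illustration, assuming log utility, wealth $100, and a 50/50 bet paying +$11 or −$10) showing that the optimal k is strictly positive but finite:

```python
import math

y = 100.0
p = [0.5, 0.5]
x = [11.0, -10.0]            # a bet with positive expected value (+$0.50)

def v(k):
    """Expected log utility when taking a k-sized piece of the bet."""
    return sum(pi * math.log(y + k * xi) for pi, xi in zip(p, x))

# crude grid search over k >= 0
best_k = max((k / 100 for k in range(0, 500)), key=v)
print(best_k)                # about 0.45: take a piece of the bet, but less than half of it
print(v(best_k) > v(0.0))    # True: k = 0 is not a maximum
```

The grid search is only a sanity check; the next slide shows analytically why k = 0 cannot be optimal whenever $E[x] > 0$.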
How Big A Bet To Take? (III)
Now plug in k = 0:
$v'(0) = E[x\,u'(y)] = u'(y)\,E[x] > 0$.
Since $v'(0) > 0$, this is not a maximum. You can improve with a positive k.

What if Income Changed?
Note that here we were assuming that income y was the same in all states. If this were not true, then clearly this result would not necessarily hold. For example, the bet might only pay off in a state where you already had lots of money.

Critiques of Expected Utility
There is a lot of experimental evidence showing that people do not maximize expected utility. In particular, people treat p = 0 very differently from p = ε (a small number).

The Allais Paradox (I)
You can choose between 2 lotteries:
Lottery A: $1,000,000 (p = 1)
Lottery B: $5,000,000 (p = 0.10), $1,000,000 (p = 0.89), $0 (p = 0.01)
Which do you pick?
The Allais Paradox (II)
What about these two:
Lottery C: $1,000,000 (p = 0.11), $0 (p = 0.89)
Lottery D: $5,000,000 (p = 0.10), $0 (p = 0.90)
Now which do you pick?

Allais and Expected Utility Theory (I)
If you picked A over B, this means (writing payoffs in millions of dollars):
$u(1) > 0.10\,u(5) + 0.89\,u(1) + 0.01\,u(0)$.
But if you picked D over C, this means:
$0.10\,u(5) + 0.90\,u(0) > 0.11\,u(1) + 0.89\,u(0)$.
What's wrong with this?

Allais and Expected Utility Theory (II)
Why do you think people prefer A to B, but D to C?

The Ellsberg Paradox (I)
I have two jars, each filled with 100 balls.
Jar A: 50 Red Balls, 50 Black Balls
Jar B: x Red Balls, 100 − x Black Balls
I will now pick a ball at random from 1 of the jars.
The Ellsberg Paradox (II)
If I pick a red ball, you win $20. If I pick a black ball, you get nothing. You get to choose which jar I draw from. Who wants me to draw from Jar A?

The Ellsberg Paradox (III)
OK, let's play again (same jars). Now if I pick a black ball, you win $20. If I pick a red ball, you get nothing. Again you get to choose which jar I draw from. Now who wants me to draw from Jar A? What's going on here?