Lecture 11: Choice Under Uncertainty
Economics 1011a: Intermediate Microeconomics
Tuesday, October 21, 2008

Last class we wrapped up consumption over time. Today we will begin studying consumption (and investment) with risk. This is like consumption over time, only now we are worried about future uncertainty.

What Is Uncertainty?
One could imagine approaching this question in many ways. In economics we have a very general formulation: uncertainty means any situation in which more than one event could occur, each with some probability.

States of the World
More formally, we think of each of these events as a different state of the world. Before the uncertainty is resolved, we do not know which state we are going to end up in. The states of the world are mutually exclusive: after the uncertainty is resolved, we are in one and only one of them.
How Many States?
In the real world there are practically infinitely many potential states of the world. Every event with an uncertain resolution (e.g. the flip of a coin) splits the world into more distinct states. Luckily, we can ignore events that do not affect us. If I only care about that flip of the coin, I can just consider two states: heads and tails.

States of the World: Examples
Usually we will be dealing with very simple sets of states:
- I roll a die. There are 6 states of the world, one for each face of the die.
- 2 states of the world: your house burns down or it doesn't.
You need to make sure that your set of states exhausts all possible events:
- Not acceptable: the Yankees win the World Series or the Cardinals win the World Series.
- Acceptable: the Yankees win the World Series or they do not.

States and Probability
Now we will think of there being n different states of the world, s_1, s_2, ..., s_n. Each state has a probability of occurring: p_1, p_2, ..., p_n. We must have
  p_1 + p_2 + ... + p_n = 1.

Choice Under Uncertainty
Suppose you have a choice to make that has different payoffs in different states of the world:
- How much fire insurance to buy.
- Whether or not to play the lottery.
- Whether or not to bring an umbrella to class.
The theory we have developed so far does not cover this. We need a way of modeling utility maximization in the context of uncertain future events.
The Basic Problem
Suppose there are two states of the world, a and b. State a has probability p of occurring, and state b has probability (1-p). Your wealth will be y_a in state a and y_b in state b, yielding utility u(y_a) and u(y_b). How do you judge the utility of this situation before the resolution of the uncertainty?

Expected Utility (I)
There are many possible ways you could judge this situation in the present moment. The most general formulation would be some function
  U = V(y_a, y_b; p).
However, the standard assumption is that people maximize expected utility:
  EU = p u(y_a) + (1-p) u(y_b).

Expected Utility (II)
With expected utility (EU), the weight you place on each state is exactly its probability of occurring: EU is the utility you get on average. An advanced result called the expected utility theorem shows that, given certain reasonable axioms, this must be what people do.

Expected Utility vs. Expected Value (I)
Expected utility is not the same thing as expected value (from statistics). Expected value is the expected monetary payoff before the event happens. In this example, it would be
  EV = p y_a + (1-p) y_b.
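To see the distinction concretely, here is a minimal numerical sketch; the square-root utility function and all the numbers are illustrative assumptions, not from the lecture:

```python
import math

# Two-state gamble: wealth y_a with probability p, y_b with probability 1 - p.
# All numbers here are illustrative assumptions.
p, y_a, y_b = 0.5, 100.0, 0.0

def u(y):
    return math.sqrt(y)   # an assumed strictly concave utility

EV = p * y_a + (1 - p) * y_b          # expected value: the average monetary payoff
EU = p * u(y_a) + (1 - p) * u(y_b)    # expected utility: the average utility

print(EV)      # 50.0 (dollars)
print(EU)      # 5.0 (utils)
print(u(EV))   # ~7.07: the utility of getting EV for sure exceeds EU
```

With u(y) = y the two notions would coincide; with a strictly concave u, the expected utility of the gamble falls short of the utility of receiving its expected value for certain.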
Expected Utility vs. Expected Value (II)
What if expected utility were always the same as expected value? Then we would always have
  p u(y_a) + (1-p) u(y_b) = p y_a + (1-p) y_b.
But this can hold for every p, y_a, and y_b only if u(y) = y (up to an affine transformation).

Expected Utility vs. Expected Value (III)
In this case, utility is just linear in income. But we always assume utility is concave, so we never have EU = EV. More on this next lecture.

An Example: Buying Insurance (I)
Suppose there is a bad event (e.g. a fire) with probability p of occurring. If this bad event occurs, you will lose d dollars. If your current wealth is w, your expected utility is
  EU = p u(w - d) + (1-p) u(w),
where the first term is the fire state (wealth net of the loss) and the second is the no-fire state.

An Example: Buying Insurance (II)
Now suppose someone is selling insurance against this event happening. How much insurance should you buy? This depends on the price of the insurance!
How Insurance Works
$1 of insurance costs q (with q < 1). Assume you buy x dollars of insurance. The cost qx is your insurance premium; you pay this no matter what. If the fire occurs, the insurer gives you x dollars. If it doesn't, you get nothing.

Expected Utility with Insurance
So this is your expected utility if you buy x dollars of insurance:
  EU(x) = p u(w - d + x - qx) + (1-p) u(w - qx).
In the fire state you suffer the loss d, pay the premium qx, and receive the insurance payoff x; in the no-fire state you only pay the premium. Now you just maximize this with respect to x.

Solving the Insurance Problem (I)
We can use the first order condition:
  dEU/dx = p (1-q) u'(w - d + (1-q) x) - (1-p) q u'(w - qx) = 0.

Solving the Insurance Problem (II)
Let us rearrange this:
  u'(w - d + (1-q) x) / u'(w - qx) = [(1-p) q] / [p (1-q)].
Solving the Insurance Problem (III)
On the left-hand side, the numerator is the marginal utility of consumption in the bad state, u'(y_b), and the denominator is that in the good state, u'(y_g):
  u'(y_b) / u'(y_g) = [(1-p) q] / [p (1-q)].

What Price Insurance?
So the answer depends on q, the price of insurance. How much do you think insurance should cost? Let us assume that insurance companies make zero profits on average. Why might this be reasonable?

Actuarially Fair Insurance
How much profit does the insurance company expect to make on $1 of insurance? It collects the premium q, pays out $1 with probability p, and pays nothing otherwise:
  expected profit = q - p · 1 - (1-p) · 0 = q - p.
So at zero profits, q = p. This is called actuarially fair insurance. What does this mean?

Solving With Fair Insurance
Now let us plug q = p into our FOC:
  u'(y_b) / u'(y_g) = [(1-p) p] / [p (1-p)] = 1.
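The fair-insurance case can be checked numerically with a grid search. A minimal sketch, assuming square-root utility and illustrative values w = 100, d = 50, p = 0.1 (none of which come from the lecture):

```python
import math

# Illustrative numbers (assumptions, not from the lecture).
w, d, p = 100.0, 50.0, 0.1   # wealth, fire loss, fire probability
q = p                         # actuarially fair price

def u(y):
    return math.sqrt(y)       # an assumed strictly concave utility

def expected_utility(x):
    # Pay the premium qx in both states; collect the payoff x only if the fire occurs.
    return p * u(w - d + (1 - q) * x) + (1 - p) * u(w - q * x)

best_x = max(range(101), key=expected_utility)   # grid search over x = 0, 1, ..., 100

wealth_fire    = w - d + (1 - q) * best_x
wealth_no_fire = w - q * best_x
print(best_x)                         # 50: coverage equals the loss d
print(wealth_fire, wealth_no_fire)    # 95.0 95.0 — income equalized across states
```

At the fair price, the EU-maximizing coverage leaves wealth identical in both states, exactly as the ratio condition u'(y_b)/u'(y_g) = 1 requires.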
Full Insurance
So u'(y_b) = u'(y_g), and income is exactly the same in both states. This means the consumer has fully insured against risk. Plugging in for income:
  w - d + (1-q) x = w - qx  ⟹  x = d.

Expensive Insurance (I)
What if q > p? Then insurers make a profit. Now the right-hand side of the FOC is greater than 1:
  u'(y_b) / u'(y_g) = [(1-p) q] / [p (1-q)] > 1,
so u'(y_b) > u'(y_g), and therefore y_b < y_g. How did we get the last step? (Because u is concave, u' is decreasing.)

Expensive Insurance (II)
Since y_b < y_g, i.e. w - d + (1-q) x < w - qx, you are not fully insuring against risk: x < d. But even though insurance is expensive, it is probably worth it to buy some; the more expensive it is, the less you buy. We will explore this more next time.

Cheap Insurance (I)
What if q < p? This is not very realistic, as insurers would then be losing money. But running the same argument, the right-hand side of the FOC is now less than 1, so u'(y_b) < u'(y_g), y_b > y_g, and hence x > d. Strange...
Cheap Insurance (II)
Here you are actually over-insuring: you want your house to burn down! Since q < p, buying insurance is like taking a bet with a positive expected payoff; the insurance company's loss is your gain. So after you fully insure, you happily take (some of) this bet on top of the insurance.
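All three cases can be illustrated with the same numerical setup, again under assumed values (w = 100, d = 50, p = 0.1) and an illustrative square-root utility:

```python
import math

w, d, p = 100.0, 50.0, 0.1   # assumed wealth, loss, and fire probability

def optimal_coverage(q):
    """Grid-search the EU-maximizing coverage x at insurance price q."""
    def eu(x):
        # Fire state: loss d, premium qx, payoff x; no-fire state: premium only.
        return p * math.sqrt(w - d + (1 - q) * x) + (1 - p) * math.sqrt(w - q * x)
    return max(range(201), key=eu)   # allow over-insurance: x can exceed d

print(optimal_coverage(0.10))  # 50: fair price (q = p), full insurance x = d
print(optimal_coverage(0.12))  # expensive (q > p): under-insure, 0 < x < d
print(optimal_coverage(0.08))  # cheap (q < p): over-insure, x > d
```

The grid search reproduces the comparative statics derived above: coverage falls below d when q > p and rises above d when q < p.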