# Uncertainty and Risk

## Certainty Equivalence and Risk Premia

### Deal or No Deal?

The game show “Deal or No Deal” had a simple premise: there are 26 briefcases, each with a different amount of money ranging from a penny to a million dollars. The contestant chooses one briefcase at random. As the game goes on, the contents of the other briefcases are revealed, providing updated information as to the likely contents of the case held by the contestant. Each round, with new information revealed, a mysterious “banker” calls and offers an amount of money, which the contestant can accept (“deal”) or not (“no deal”). If they accept the banker’s offer, they get that amount of money for sure. If they don’t, they continue to play the game.

On September 1, 2008, Jessica Robinson was the contestant. As cases were revealed, the million-dollar value was still on the board – until the very end, when only two cases remained. One had $\$200$,000, and the other had $\$1$,000,000. The banker offered her $\$561$,000. She refused.

Fundamentally, her choice was to consume a lottery of (200K with probability $\frac{1}{2}$, 1000K with probability $\frac{1}{2}$) or to consume 561K with certainty. The expected value of the lottery was 600K (halfway between 200K and 1000K) – so why did the banker offer her 561K?
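To make the numbers concrete, here is a quick sketch in Python. The expected-value calculation matches the text; the square-root utility function is purely a hypothetical assumption (we don't know the contestant's actual preferences), but it illustrates how a risk-averse contestant might value the gamble at less than the banker's offer.

```python
# Final lottery: 200K or 1,000K with equal probability
p = 0.5
low, high = 200_000, 1_000_000

ev = p * low + p * high
print(ev)  # 600000.0 -- halfway between the two cases

# Hypothetical utility: u(c) = sqrt(c) (an assumption, for illustration only)
u = lambda c: c ** 0.5

# Certainty equivalent: the sure amount giving the same expected utility
expected_utility = p * u(low) + p * u(high)
ce = expected_utility ** 2  # invert u by squaring
print(round(ce))  # 523607
```

Under this assumed utility function, the certainty equivalent is roughly $\$523$,607 – less than the banker's $\$561$,000 offer – so a contestant with these particular preferences should have taken the deal.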

It makes sense that a risk-averse individual would accept *some* value less than the expected value of the lottery. Because they’re risk averse, we know that they would strictly prefer the expected value of the lottery for sure to facing the risk of the lottery; so it follows that there is some value less than the expected value which they would accept. We call this the *certainty equivalent*.

### Certainty Equivalence

Up to now we’ve been thinking of the expected utility from a lottery in which consumption is different in different states of the world: this is an *uncertain* outcome. However, we can also consider *certain* outcomes: bundles in which consumption is the same in all states of the world, so that $c_1 = c_2$. Visually, this is a 45-degree line in $c_1-c_2$ space, which we might call the “Line of Certainty.”

The point where the indifference curve passing through a lottery intersects the line of certainty is interesting. This occurs at the (common) value of consumption known as the **certainty equivalent**: the certain consumption that yields the same utility as an uncertain lottery – that is, the amount of money which, if you had it for sure, would give you the same amount of utility as the lottery.

For example, consider a lottery which pays $c_1 = 16$ and $c_2 = 64$ with equal probability $(\pi = \frac{1}{2})$, and suppose the utility function is $u(c) = \sqrt{c}$. Then the expected utility of that lottery is

$$\textcolor{#e6550d}{\mathbb{E}[u(c)] = \frac{1}{2}\sqrt{16} + \frac{1}{2}\sqrt{64} = 6}$$

The certainty equivalent of that lottery would be the amount $CE$ that would give the same utility: $\textcolor{#e6550d}{\sqrt{CE} = 6}$, or $\textcolor{#e6550d}{CE = 36}$. In other words: having $CE = 36$ for sure would yield the same utility as having an equal chance of $c_1 = 16$ and $c_2 = 64$.
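This calculation is easy to check directly; the short sketch below just mirrors the worked example above.

```python
# Lottery: c1 = 16, c2 = 64, each with probability 1/2; utility u(c) = sqrt(c)
pi = 0.5
c1, c2 = 16, 64

u = lambda c: c ** 0.5

expected_utility = pi * u(c1) + (1 - pi) * u(c2)
print(expected_utility)  # 6.0

# Certainty equivalent: solve sqrt(CE) = 6 by inverting u (squaring)
ce = expected_utility ** 2
print(ce)  # 36.0
```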

Visually, we can see this in our two graphs. In the left graph, we can see that the brown dot is at the coordinates $(\mathbb E[c], \mathbb E[u(c)]) = (40,6)$, as before. The orange dot directly to the left of the brown dot represents the certainty equivalent: that is, its coordinates are $(CE, u(CE)) = (36,6)$. In the right graph, we can see that the point $(36,36)$ lies where the indifference curve passing through $(16,64)$ intersects the 45-degree line $c_1 = c_2$:

Note that if you move the $r$ slider to $r = 1$, so the agent is risk neutral, then $CE = \mathbb E[c] = 40$: in other words, the agent is indifferent between the lottery (16 with probability $\frac{1}{2}$, 64 with probability $\frac{1}{2}$) and having 40 with certainty.
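The text doesn’t specify the utility function behind the $r$ slider; one family consistent with “risk neutral at $r = 1$” is $u(c) = c^r$ with $0 < r \le 1$ (an assumption here, not necessarily the interactive’s exact form). Under that assumption, the certainty equivalent can be computed for any $r$:

```python
def certainty_equivalent(r, c1=16, c2=64, pi=0.5):
    """CE of a two-state lottery under the assumed utility u(c) = c**r."""
    expected_utility = pi * c1 ** r + (1 - pi) * c2 ** r
    return expected_utility ** (1 / r)  # invert u(c) = c**r

print(certainty_equivalent(0.5))  # 36.0 -- risk averse: CE below E[c]
print(certainty_equivalent(1.0))  # 40.0 -- risk neutral: CE equals E[c]
```

As $r$ falls below 1, the utility function becomes more concave and the certainty equivalent falls further below the expected value of 40.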

### Risk Premium

The fact that for a risk-averse agent, the certainty equivalent is less than the expected value of the lottery (i.e., $CE < \mathbb E[c]$), has an economic implication. The difference between the expected value of the lottery and the CE is called the **risk premium**: the difference (in dollars) between the expected value of the lottery and an agent’s certainty equivalent.

You can think about the risk premium as the amount the agent would be willing to pay to avoid risk – that is, to buy the expected value of the lottery. In the example above, the CE is 36 and the expected value of the lottery is 40. So you can think of the agent as being willing to pay up to 4 to have 40 for sure. (In other words, if they paid 4 to have 40 with no risk, they would end up with 36 for sure, which would give them utility of 6.)
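A short sketch makes the arithmetic explicit, using the same lottery and square-root utility as in the worked example:

```python
# Lottery: 16 or 64 with equal probability; utility u(c) = sqrt(c)
pi = 0.5
c1, c2 = 16, 64
u = lambda c: c ** 0.5

expected_value = pi * c1 + (1 - pi) * c2     # 40.0
ce = (pi * u(c1) + (1 - pi) * u(c2)) ** 2    # 36.0

risk_premium = expected_value - ce
print(risk_premium)  # 4.0 -- what the agent would pay to avoid the risk
```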