In an economic context, a decision involves risk when it can lead to more than one outcome, when those outcomes have different values, and when the probabilities of those outcomes are known or can be estimated.

The economic definition of risk

So the example of the coin-flip gamble involves risk: there are two outcomes, one outcome leads to more money than the other, and the probabilities of those outcomes are known. This economic definition doesn’t exactly match all the ways in which we use “risk” in daily conversation.

For example, we describe real-world behaviors as risky if we might get hurt doing them, things like skydiving, and we describe our investments as risky if we might lose money, like purchasing stocks. But a decision can be risky even if nothing really bad can happen.

Suppose you’re about to purchase a home and are deciding whether to lock in a particular interest rate for your mortgage. You can lock in a low interest rate now, or you can wait to see if a better rate appears in the coming weeks. Your rate might be pretty good regardless, but you still face a choice between a safer option and a riskier one.

Risk and ambiguity

Risk is also different from ambiguity. I said that a decision involves risk when the probabilities of its outcomes are known or can be estimated. We know, for example, the probabilities of flipping heads or tails. We can estimate the probabilities of getting lower or higher interest rates based on recent fluctuations in mortgage rates.

When we can’t know or can’t estimate the probability of an outcome, the decision is said to involve not risk but ambiguity. Many interesting real-world decisions are thought to involve ambiguity, but that’s a topic for a different article.

For many sorts of economic decisions, we can describe people’s choices as risk-averse, risk-neutral, or risk-seeking.

Risk-averse choices

Most common are risk-averse choices: that is, when people prefer a safe option over a risky option with equal expected value. To understand why, let’s return to the idea of a utility function. A utility function determines how much subjective value we get from different amounts of an objective quantity like money.

It was recognized long before the advent of behavioral economics that people’s subjective value for money, and for most other things, shows diminishing marginal utility. In traditional models this was described in terms of wealth states: a dollar is worth more to someone who has nothing than to someone who already has $1 million.

Prospect theory moved away from wealth states but kept the idea of diminishing marginal utility. The subjective difference between gaining one dollar and gaining two dollars is greater than the subjective difference between gaining $1000 and gaining $1001. This sort of utility function leads to risk aversion, at least as a broad principle.
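The curvature behind diminishing marginal utility can be sketched with a hypothetical concave utility function. Square root is just one illustrative choice, not the function prospect theory specifies:

```python
import math

# A hypothetical concave utility function (square root), one simple way
# to model diminishing marginal utility for money.
def utility(dollars):
    return math.sqrt(dollars)

# The gain from an extra dollar shrinks as the amounts grow.
gain_low = utility(2) - utility(1)         # gaining $1 on top of $1
gain_high = utility(1001) - utility(1000)  # gaining $1 on top of $1000

print(gain_low > gain_high)  # True: the early dollar is worth more subjectively
```

Any strictly concave function would show the same pattern; square root simply keeps the arithmetic easy to check by hand.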

Diminishing marginal utility

To understand why, consider a schematic example: which would you rather have, a sure $1000 or a coin-flip gamble between $0 and $2001? Intuitively this seems like an easy choice to most people: take the sure thousand dollars. But why is that choice so easy?

It’s not because the thousand dollars is the better choice by expectation; if anything, you’d expect to earn very slightly more on average by choosing the risky option. The choice is easy because of diminishing marginal utility: the subjective difference between $0 and $1000 seems much larger to us than the subjective difference between $1000 and $2001.
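The arithmetic in this example can be checked directly. Assuming a hypothetical concave utility function (square root, for illustration), the gamble has the higher expected value but the lower expected utility:

```python
import math

def utility(dollars):
    # Hypothetical concave utility (square root), chosen only for illustration.
    return math.sqrt(dollars)

# Sure thing: $1000. Gamble: a coin flip between $0 and $2001.
sure_amount = 1000.0
expected_value = 0.5 * 0 + 0.5 * 2001  # $1000.50, slightly above the sure $1000

utility_sure = utility(sure_amount)
expected_utility_gamble = 0.5 * utility(0) + 0.5 * utility(2001)

# Despite the gamble's higher expected value, its expected utility is lower,
# so a decision maker with this utility function takes the sure $1000.
print(expected_value > sure_amount)            # True
print(utility_sure > expected_utility_gamble)  # True
```

This is the sense in which concave utility produces risk aversion: averaging over outcomes in dollars and averaging over outcomes in utility give different answers, and the safe option wins on the utility scale.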

So we choose the safer option and avoid the chance of getting zero dollars. Provided that someone shows diminishing marginal utility for money, then, they will be naturally risk-averse.