- “This paper presents a critique of expected utility theory as a descriptive model of decision making under risk, and develops an alternative model, called prospect theory.”
- A problem with utility theory is that people generally underweight probable outcomes relative to certain ones
- This is called the *certainty effect*
- Makes people *risk averse* in terms of gains and *risk seeking* in terms of losses
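The certainty effect can be sketched numerically. The prizes and probabilities below are illustrative Allais-type numbers (my own, not from these notes); the point is that the second choice is just the first with every probability scaled by 0.25, so any fixed utility function predicts the same preference in both — yet modal preferences flip:

```python
# Illustrative sketch of the certainty effect. The specific prizes and
# probabilities are hypothetical, chosen to mirror the pattern in the notes.

def expected_value(prospect):
    """Expected value of a prospect given as [(outcome, probability), ...]."""
    return sum(x * p for x, p in prospect)

# Choice 1: a certain 3000 vs. (4000 with probability 0.80)
certain = [(3000, 1.0)]
risky = [(4000, 0.80)]

# Choice 2: the same prospects with all probabilities scaled by 0.25
certain_scaled = [(3000, 0.25)]
risky_scaled = [(4000, 0.20)]

# Expected value ranks the risky option higher in BOTH choices...
assert expected_value(risky) > expected_value(certain)                # ~3200 > 3000
assert expected_value(risky_scaled) > expected_value(certain_scaled)  # ~800 > 750

# ...yet people typically pick the certain 3000 in Choice 1 and the
# (4000, 0.20) gamble in Choice 2. Scaling every probability by the same
# factor flips the modal preference, which no fixed utility function
# over final outcomes can explain.
```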
- People also “…discard components that are shared by all prospects under consideration.” <the decision DAG is left as a tree and branches aren’t merged>
- Called the *isolation effect*
- “… leads to inconsistent preferences when the same choice is presented in different forms.”
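The isolation effect can be illustrated with a two-stage game of the kind the paper uses (numbers here are my own illustrative choices, not from these notes): collapsing the stages yields the same final probabilities as a one-stage gamble, yet the two framings get evaluated differently because the shared first stage is pruned away.

```python
# Sketch of the isolation effect: a two-stage game collapses to the same
# final probabilities as a one-stage gamble, yet is evaluated differently
# because people discard the first stage shared by both options.

def collapse(stage1_continue_p, stage2_prospect):
    """Multiply second-stage probabilities through the first stage."""
    return [(x, stage1_continue_p * p) for x, p in stage2_prospect]

# Stage 1: 25% chance to continue (75% chance of ending with nothing).
# Stage 2 choice: a certain 3000 vs. (4000 with probability 0.80).
option_a = collapse(0.25, [(3000, 1.0)])   # final form: (3000, 0.25)
option_b = collapse(0.25, [(4000, 0.80)])  # final form: (4000, 0.20)

assert option_a == [(3000, 0.25)]
assert abs(option_b[0][1] - 0.20) < 1e-9

# As a single-stage choice this is exactly (3000, 0.25) vs (4000, 0.20),
# but framed as a two-stage game people ignore the shared first stage and
# answer as if facing "certain 3000 vs (4000, 0.80)" -- different choices
# for the same final probability tree.
```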
- “An alternative theory of choice is developed, in which value is assigned to gains and losses rather than to final assets and in which probabilities are replaced by decision weights. The value function is normally concave for gains, commonly convex for losses, and is generally steeper for losses than for gains. Decision weights are generally lower than the corresponding probabilities except in the range of low probabilities.”
- Overweighting of low probabilities is given as a potential reason why people both gamble and buy insurance.
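A minimal sketch of a decision-weight function that overweights low probabilities. The inverse-S functional form and the gamma value below are assumptions borrowed from later formalizations of the theory, not from these notes; the point is that the same overweighting makes both a lottery ticket and an insurance premium look attractive:

```python
# Illustrative probability-weighting function. The functional form and
# gamma value are assumptions, not taken from these notes.

def weight(p, gamma=0.61):
    """Inverse-S-shaped weighting: overweights small p, underweights large p."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# A low probability is overweighted...
assert weight(0.01) > 0.01
# ...while a high probability is underweighted.
assert weight(0.95) < 0.95

# Lottery: a 0.001 chance of winning 5000, ticket price 5 (the expected
# value). The weighted value of the ticket exceeds the sure 5, so the
# gamble looks attractive despite fair pricing.
assert weight(0.001) * 5000 > 5

# Insurance is the mirror image: a 0.001 chance of LOSING 5000 against a
# sure premium of 5. The same overweighting makes the weighted loss loom
# larger than the premium, so insuring also looks attractive.
```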
- Utility theory (rational decision making) is often right, but not always. This paper highlights some of the special instances where actual behavior is inconsistent with utility theory
- In this paper they restrict consideration to cases where the payoff is 0 or *x*, based on a fixed probability (potentially probability 1)
- In most of classical economics, a stipulation is added to pure utility theory that makes people risk averse: something with a deterministic payoff *x* is considered better than something with an expected payoff of *x*
- Depending on the problem, individuals violate utility theory (for the same issue – stochasticity) in two opposite ways, depending on whether the probabilities are large or small
- The standard fix to utility theory was to penalize any stochasticity (because people sometimes dislike it), but sometimes stochasticity actually makes an option seem better than its raw utility, not worse

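The classical stipulation can be sketched with any concave utility function (square root here, purely an illustrative choice): by Jensen's inequality the sure payoff beats any gamble with the same mean, which captures risk aversion — but then wrongly predicts that low-probability gambles like lotteries are never attractive:

```python
import math

# Sketch of the classical risk-aversion stipulation: with a concave
# utility function (sqrt, an illustrative choice), a deterministic payoff
# x beats any gamble whose expected payoff is x (Jensen's inequality).

def expected_utility(prospect, u=math.sqrt):
    return sum(p * u(x) for x, p in prospect)

sure_thing = [(100, 1.0)]
gamble = [(200, 0.5), (0, 0.5)]  # same expected payoff: 100

assert expected_utility(sure_thing) > expected_utility(gamble)  # 10 > ~7.07

# But the same concave u predicts nobody ever gambles at fair-or-worse
# odds, which lottery play contradicts -- the "opposite direction"
# violation the notes mention for small probabilities.
lottery = [(10_000, 0.001), (0, 0.999)]  # expected payoff 10, ticket price 5
assert expected_utility(lottery) < math.sqrt(5)  # concave u says: never buy
```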
- This happens even if parts of the problem (the rewards) are non-numeric, with one prize being strictly better than another (containing a superset of the other prize’s goods)
- One rule proposed is that (*y*, *pq*) = (*x*, *p*) ⇒ (*y*, *pqr*) preferred to (*x*, *pr*)
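The proposed rule — if (*y*, *pq*) is judged equivalent to (*x*, *p*), then (*y*, *pqr*) is preferred to (*x*, *pr*) — contradicts expected utility, which predicts indifference. A quick arithmetic check with illustrative utility values:

```python
# Sketch of why the rule contradicts expected utility. If
# u(y) * p*q == u(x) * p, then multiplying both probabilities by any
# common factor r preserves the equality, so EU predicts indifference,
# not a strict preference for (y, p*q*r). Utilities are illustrative.

def eu(utility, prob):
    return utility * prob

u_x, u_y = 1.0, 2.0   # assume u(y) = 2 * u(x)
p, q = 0.9, 0.5       # then (y, p*q) has the same EU as (x, p)
assert abs(eu(u_y, p * q) - eu(u_x, p)) < 1e-12

for r in (0.5, 0.1, 0.01):  # scaling by any r keeps the EUs equal
    assert abs(eu(u_y, p * q * r) - eu(u_x, p * r)) < 1e-12

# Empirically, though, people strictly prefer (y, p*q*r) once r is small:
# shrinking both probabilities by the same factor tilts choice toward the
# larger prize, which no assignment of utilities can reproduce.
```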
- Being risk averse when potential rewards are positive is accompanied by being risk seeking when the potential outcomes are penalties
- This is called the *reflection effect*
- For example, people are indifferent between (100, 0.65; -100, 0.35) and (0, 0.0)
- “… it appears that certainty increases the aversiveness of losses as well as the desirability of gains.”
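The reflection effect can be sketched with a value function that is concave over gains and convex (and steeper) over losses. The exponent and loss-aversion factor below are illustrative assumptions — a strongly concave exponent is chosen so the pattern shows up without adding probability weighting:

```python
# Illustrative reflected value function: concave for gains, convex and
# steeper for losses. The exponent and loss multiplier are assumptions,
# not values from these notes.

def value(x, alpha=0.5, lam=2.25):
    if x >= 0:
        return x ** alpha           # concave over gains
    return -lam * (-x) ** alpha     # convex and steeper over losses

def prospect_value(prospect):
    return sum(p * value(x) for x, p in prospect)

# Gains: the sure 3000 beats the (4000, 0.80) gamble -> risk averse.
assert value(3000) > prospect_value([(4000, 0.80)])

# Losses (same numbers reflected): the gamble beats the sure loss
# -> risk seeking, the mirror image of the choice over gains.
assert prospect_value([(-4000, 0.80)]) > value(-3000)

# Steeper for losses: losing 100 hurts more than gaining 100 pleases.
assert -value(-100) > value(100)
```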
- <Ending here – no time!>