The purpose of this exercise is to practice determining indifference values, an important step in calculating utility. Work with a partner if possible.

What's worth more to you, 50 cents or a chance to win a dollar? What if you have to pay $5 for a chance to win $6? Opportunities for large payoffs often include chances of large losses -- for example, see the Pittsburgh Development Corporation's condominium development project of section 4.1.

In Chapter 4 we learned to use expected values to determine the best decision for a company. However, if the decision with the highest expected value comes with a risk of bankruptcy, it may be better to compute an **expected utility**, which takes into account the fact that some losses are unacceptable.

When calculating the expected utility of a decision, an analyst starts with information on the utility of the possible payoffs, then applies that information to the specific outcomes possible for that decision. In this worksheet you will learn where that utility information comes from by creating a utility table that describes how you personally value payoffs between $0.00 and $1.00. You will then use that information to predict your preferences and "check your work".

**Please answer the questions below as honestly as you are able.**

Professor Burgiel offered to play a game with a student in the class. She gave the student $0.50; the student was then asked if he or she would like to spend that $0.50 on a 50/50 chance to win $1.00.

- If you were offered this decision, would you keep the $0.50 or
gamble on winning $1.00?
- If Professor Burgiel gave you $0.50, would you trade it back to
her for a 75% chance of winning $1.00?
- If Professor Burgiel gave you $0.50, would you trade it back to
her for a 25% chance of winning $1.00?
- Fill in the blank: I would keep the $0.50 unless the chance of
winning $1.00 was greater than or equal to:
_______%.

The gamble described above (win $1 or nothing) is called a **lottery** (see 3 (a), p. 159).
The probability you selected in the last question is the probability at which you are
**indifferent** about whether you gamble or keep the $0.50 (see 3 (b), p. 159); this is
your **indifference value of p** for this lottery and the value $0.50.
You will calculate your utility for $0.50 from this indifference value using the formula:

U($0.50) = *p* U($1.00) + (1 - *p*) U($0.00) = 10 *p*.

The table we will use to describe your utility values for money is shown below. The values 10 and 0 were chosen to match the example in the book, and to make our calculations simpler.

Enter the percentage you found on the last page in the "Indifference Value" column to the right of $0.50. The remainder of this worksheet is dedicated to filling in the rest of the table, then checking it against an example.
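The conversion from an indifference probability to a utility value can be sketched in a few lines of Python. This is an illustration only, not part of the worksheet's required work; the function name is my own, and the 10/0 endpoints match the table below.

```python
def utility_from_indifference(p, u_best=10.0, u_worst=0.0):
    """Utility of a payoff, given the indifference probability p for the
    lottery between the best payoff (utility u_best) and the worst
    payoff (utility u_worst)."""
    return p * u_best + (1 - p) * u_worst

# Example: indifference at p = 0.5 gives U($0.50) = 0.5 * 10 = 5.0.
print(utility_from_indifference(0.5))  # 5.0
```

The utility column of the table is filled in by applying this formula to each indifference value you record.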

| Payoff | Indifference value of *p* | Utility value |
|--------|---------------------------|---------------|
| $1.00  | n/a                       | 10            |
| $0.75  |                           |               |
| $0.50  |                           |               |
| $0.25  |                           |               |
| $0.00  | n/a                       | 0             |

- Ask your partner if he or she would pay $0.75 to have a 50% chance
of winning $1.
- If your partner replies "yes", decrease the chance of winning -- would he or she pay $0.75 for a 25% chance (1 in 4) of winning $1.00?
- If your partner replies "no", increase the chance of winning -- would he or she pay $0.75 for a 75% chance (3 in 4) of winning $1.00?

- Continue to adjust the probability of winning until you have identified the percentage at which your partner's answer changes from "yes" to "no". (For example, he or she might pay $0.75 for an 85% chance at $1.00 but not for an 80% chance, giving an indifference value of roughly 82.5% = .825.)
- Enter this indifference value in the table above, to the right of the value $0.75.
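The question-and-adjust procedure above is essentially a binary search on the probability. A minimal sketch, assuming a hypothetical `would_gamble(p)` function standing in for your partner's answers:

```python
def find_indifference_value(would_gamble, lo=0.0, hi=1.0, tol=0.025):
    """Narrow the interval containing the indifference probability by
    repeatedly halving it.  would_gamble(p) returns True if the partner
    would take the gamble at win-probability p (a stand-in for asking
    a real person)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if would_gamble(mid):
            hi = mid   # would gamble: indifference point is at or below mid
        else:
            lo = mid   # would not gamble: indifference point is above mid
    return (lo + hi) / 2

# Example: a partner who gambles whenever the win chance exceeds 82%.
print(round(find_indifference_value(lambda p: p > 0.82), 3))
```

In practice you would ask a question at each step instead of calling a function, and stop once the interval is narrow enough to fill in the table.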

In theory, you now have a table describing how much money is worth to you when compared to a chance of winning $1.00. Is this description accurate? We'll find out by comparing the expected utility of two sample games.

Suppose a casino game gives you a 25% chance to win $0.75. Losing the casino game has a value of $0.00; winning has a value of $0.75. The expected value of the casino game is $0.1875, so if you used only the expected value to make your decision, you would pay $0.18 to play the game but not $0.19.

EV = P(win) * (payoff of win) + P(loss) * (payoff of loss) = .25 * $0.75 + .75 * $0.00 = $0.1875
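The arithmetic above can be checked with a one-line Python function (an illustrative sketch, not required for the worksheet):

```python
def expected_value(p_win, payoff_win, payoff_loss=0.0):
    """Expected value of a two-outcome game."""
    return p_win * payoff_win + (1 - p_win) * payoff_loss

# The casino game: 25% chance of $0.75, otherwise nothing.
print(expected_value(0.25, 0.75))  # 0.1875
```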

The **expected utility** of the casino game may provide a better estimate of
how much you would be willing to pay to play it:

EU = P(win) * (utility of win) + P(loss) * (utility of loss) = .25 * (utility of $0.75) + .75 * 0

- Use the utility value listed to the right of $0.75 in the table to find the expected utility (to you) of this game.
EU = .25 * (utility of $0.75) + .75 * 0 =

- Calculate the expected utility of a game in which you have
a 75% chance to win $0.25 (and receive nothing if you lose). Please
show your work carefully.
- The two games described above (25% chance of $.75 and 75% chance of $.25) have the same expected value.
The game with the higher expected utility should be the one you'd prefer to play.
Would you actually prefer to play the game with the higher expected utility? If not, what might have gone wrong
in your calculations?
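The comparison of the two games can be sketched as follows. The utility values here are hypothetical entries for one possible risk avoider; your own table values will differ, and the conclusion may come out the other way.

```python
def expected_utility(p_win, u_win, u_loss=0.0):
    """Expected utility of a two-outcome game."""
    return p_win * u_win + (1 - p_win) * u_loss

# Hypothetical utility-table entries for a risk avoider (yours will differ):
u_075 = 8.5   # assumed utility of $0.75
u_025 = 4.0   # assumed utility of $0.25

eu_game1 = expected_utility(0.25, u_075)  # 25% chance of $0.75
eu_game2 = expected_utility(0.75, u_025)  # 75% chance of $0.25
print(eu_game1, eu_game2)  # 2.125 3.0 -> this risk avoider prefers game 2
```

Note that both games have the same expected value ($0.1875), so any preference between them comes entirely from the shape of the utility table.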

- Draw a graph below which has payoffs ($0 to $1) on the x-axis and utility (0 to 10) on the
y-axis. (If you wish, you may instead make a note here and attach a piece of graph paper or a
printout to the end of this worksheet.)
- Compare your graph to the one on page 164. Are you a risk taker, a risk avoider, risk neutral, or none of the above?