Making a case for Dynamic Programming: The Change Problem
Let’s look at the greedy approach to solving the Change Problem.
Changing money greedily
Imagine that you bought an item in a bookstore for $69.24, which you paid for with $70 in cash. You’re due 76 cents in change, and the cashier must now decide whether to give you a fistful of 76 1-cent coins or just four coins (25 + 25 + 25 + 1 = 76). Making change in this example is easy, but it casts light on a more general problem: how can a cashier make change using the fewest coins?
Different currencies have different possible coin values, or denominations. In the United States, the coin denominations are (100, 50, 25, 10, 5, 1); in the Roman Republic, they were (120, 40, 30, 24, 20, 10, 5, 4, 1). The heuristic used by cashiers all over the world to make change, which we call GreedyChange, iteratively selects the largest coin denomination possible.
GreedyChange(money)
    change ← empty collection of coins
    while money > 0
        coin ← largest denomination that is less than or equal to money
        add a coin with denomination coin to the collection of coins change
        money ← money − coin
    return change
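The pseudocode translates directly into Python. The sketch below uses my own function and variable names, and it assumes (as the pseudocode implicitly does) that the denomination list contains 1, so a largest usable coin always exists:

```python
def greedy_change(money, denominations):
    """GreedyChange: repeatedly take the largest denomination
    that does not exceed the remaining amount."""
    change = []
    while money > 0:
        # Largest denomination that is less than or equal to money.
        # Assumes 1 is among the denominations, so this never fails.
        coin = max(d for d in denominations if d <= money)
        change.append(coin)
        money -= coin
    return change

# The coins a US cashier typically hands out for small change
# (quarters, dimes, nickels, pennies):
print(greedy_change(76, [25, 10, 5, 1]))
```

Running this on 76 cents produces the four coins from the bookstore example: three quarters and a penny.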
STOP and Think: Does GreedyChange always return the minimum possible number of coins?
Say we want to change 48 units of currency (denarii) in ancient Rome. GreedyChange returns five coins (48 = 40 + 5 + 1 + 1 + 1), and yet we can make change using only two coins (48 = 24 + 24). Thus, GreedyChange is suboptimal for some denominations!
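We can verify that two coins really is the minimum for 48 denarii with a small exhaustive search (memoized so it finishes quickly; this brute-force checker is my own sketch, not the efficient algorithm developed later):

```python
from functools import lru_cache

def min_coins(money, denominations):
    """Brute-force check: fewest coins from `denominations`
    that sum exactly to `money` (infinity if impossible)."""
    denominations = tuple(denominations)

    @lru_cache(maxsize=None)
    def best(m):
        if m == 0:
            return 0
        # Try every coin that fits and keep the cheapest continuation.
        return min((best(m - d) + 1 for d in denominations if d <= m),
                   default=float("inf"))

    return best(money)

roman = [120, 40, 30, 24, 20, 10, 5, 4, 1]
print(min_coins(48, roman))  # 2, since 48 = 24 + 24
```

So the optimum is indeed 2, while GreedyChange returns 5 coins for the same input.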
STOP and Think: During the reign of Augustus, Roman coin denominations were changed to (1600, 800, 400, 200, 100, 50, 25, 2, 1). Why did these denominations make Roman cashiers’ lives easier? More generally, find a condition on coin denominations that dictates when GreedyChange will make change with the minimum number of coins.
A different approach
Since GreedyChange is incorrect, we need to devise a different approach. We can represent coins from d arbitrary denominations by an array of integers:
...