HiveBrain v1.2.0

Why does the order of the nested loops matter when solving the Coin Change problem?

Submitted by: @import:stackexchange-cs

Problem

The Coin Change problem is stated as:

Given an integer array coins[ ] of size N representing different
denominations of currency and an integer sum, find the number of ways
you can make sum by using different combinations from coins[ ].

Note: Assume that you have an infinite supply of each type of coin.

So for example, using coins {1, 2, 3} and a desired sum of 3 yields the combinations (1,1,1), (1,2), and (3).

This problem can be solved using dynamic programming, where the idea is to iteratively count the
number of combinations that can amount to each partial sum:

std::vector<int> combinations_to_consume_sum(sum + 1, 0);
    combinations_to_consume_sum[0] = 1; // one way to make a sum of 0: use no coins

    for (int i = 0; i < N; ++i)
    {
        for (int partial_sum = 1; partial_sum <= sum; ++partial_sum)
        {
            const int leftover_sum = partial_sum - coins[i];
            if (leftover_sum >= 0)
            {
                combinations_to_consume_sum[partial_sum] += combinations_to_consume_sum[leftover_sum];
            }
        }
    }
    return combinations_to_consume_sum[sum];


This is a working solution. However, if I swap the order of the two nested loops, the algorithm is no longer correct:

for (int partial_sum = 1; partial_sum <= sum; ++partial_sum)
    {
        for (int i = 0; i < N; ++i)
        {
            const int leftover_sum = partial_sum - coins[i];
            if (leftover_sum >= 0)
            {
                combinations_to_consume_sum[partial_sum] += combinations_to_consume_sum[leftover_sum];
            }
        }
    }
    return combinations_to_consume_sum[sum];


Can someone explain why this only works with the 'coins first' loop order?

Solution

In the second solution you are overcounting the number of combinations.
In fact, you are counting the number of ways to choose an ordered list of coins that sums to sum.

For example suppose that you have coins of values $1$ and $2$.

  • There is only one way to obtain a sum of $0$, namely using no coins, and the algorithm computes this correctly.

  • There is only one way to obtain a sum of $1$, and the algorithm computes this correctly (by "using" coin $1$ and looking at the number of ways to obtain a sum of $0$).

  • There are two ways to obtain a sum of $2$, namely using two $1$ coins or a single $2$ coin. The algorithm also happens to compute this correctly by summing the $1$ way to obtain a sum of $1$ (when the $1$ coin is used) with the $1$ way to obtain a sum of $0$ (when the $2$ coin is used).

  • There are two ways to obtain a sum of $3$, namely using three $1$ coins or a $1$ coin and a $2$ coin. The algorithm fails here, as it considers:

      • The two ways to obtain a sum of $2$ (after the $1$ coin is used). These are $(1,1)$ and $(2)$. This case accounts for the following ways to obtain a sum of $3$: $(1,1,1)$ and $(1,2)$.

      • The one way to obtain a sum of $1$ (after the $2$ coin is used). This is $(1)$. This case accounts for the following way to obtain a sum of $3$: $(2,1)$.

As you can see, the option of using a $2$ coin and a $1$ coin is counted twice (once in each possible order).

Context

StackExchange Computer Science Q#154455, answer score: 11
