Ever since standing at the checkout in the supermarket yesterday, once more trying to heuristically find an optimal partition of my coins while ignoring the impatient, fidgeting queue behind me, I've been pondering the underlying algorithmic problem:
We are given a coin system with values `v1, ..., vn`, a limited stock of coins `a1, ..., an`, and a sum `s` that we need to pay. We are looking for an algorithm that computes a partition `x1, ..., xn` (with `0 <= xi <= ai`) satisfying `x1*v1 + x2*v2 + ... + xn*vn >= s`, such that `x1 + ... + xn - R(r)` is maximized, where `r` is the change, i.e. `r = x1*v1 + x2*v2 + ... + xn*vn - s`, and `R(r)` is the number of coins the cashier returns. We assume the cashier has an unlimited supply of every coin and always gives back the minimal number of coins (for example via the greedy algorithm described in SCHOENING et al.). We also need to rule out pure money exchanging, so that the best solution is NOT simply to hand over all of our money (otherwise that solution would always be optimal).
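For concreteness, `R(r)` under the greedy rule could be sketched like this (a minimal sketch in Python; the euro-cent denominations are just an illustrative assumption):

```python
def coins_returned(r, values=(200, 100, 50, 20, 10, 5, 2, 1)):
    """Number of coins the cashier gives back for change r (in cents),
    using the greedy rule on an unlimited, canonical coin system."""
    count = 0
    for v in values:  # values must be sorted in descending order
        count += r // v
        r %= v
    return count

# e.g. change of 99 cents -> 50 + 20 + 20 + 5 + 2 + 2 = 6 coins
```

Note that greedy is only guaranteed to be optimal for canonical coin systems (the euro system is one).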
Thanks for your creative input!
If I understand correctly, this is basically a variant of subset sum. If we assume you have 1 of each coin (`a[i] = 1` for each `i`), then you can solve it with the classic subset-sum DP: compute a boolean array `sum`, where `sum[j]` is true iff some subset of your coins adds up to exactly `j`. Then find the first `k >= s` for which `sum[k]` is true. You can get the actual coins used by keeping track of which coin contributed to each `sum[j]`. The closer you can get your sum to `s` using your coins, the less the change will be, which is what you're after.

Now, you don't have 1 of each coin `i`; you have `a[i]` of each coin `i`. I suggest extending the DP to a bounded subset sum, allowing up to `a[i]` uses of coin `i`. It should be fairly easy to get your `x` vector from this. Let me know if you need any more details.
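A minimal sketch of the subset-sum DP for the 1-of-each-coin case (Python; function and variable names are my own, not from the original answer):

```python
def reachable_sums(v, s):
    """sums[j] is True iff some subset of the coin values v adds up to
    exactly j.  The smallest reachable sum >= s is at most s + max(v)
    (removing any coin from a larger total would still leave >= s),
    so we can cap the table there."""
    limit = s + max(v)
    sums = [False] * (limit + 1)
    sums[0] = True
    for value in v:
        # iterate downward so each coin is used at most once
        for j in range(limit, value - 1, -1):
            if sums[j - value]:
                sums[j] = True
    return sums

# smallest payable amount >= s, e.g. with coins 1, 2, 5 and s = 4:
sums = reachable_sums([1, 2, 5], 4)
k = next(j for j in range(4, len(sums)) if sums[j])  # -> 5 (pay 5, change 1)
```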
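And a sketch of the bounded variant with stocks `a[i]`, including reconstruction of the `x` vector (again my own illustrative code, assuming the stock suffices to pay `s`):

```python
def pay(v, a, s):
    """Pick x[i] <= a[i] coins of value v[i] so the total is >= s and the
    overpayment (the change r) is minimal.  Assumes sum(a[i]*v[i]) >= s.
    dp[i][k] is True iff sum k is makable using coin types 0..i-1 only,
    within their stocks."""
    n = len(v)
    limit = s + max(v)  # the smallest reachable total >= s is at most this
    dp = [[False] * (limit + 1) for _ in range(n + 1)]
    dp[0][0] = True
    for i in range(n):
        for k in range(limit + 1):
            for c in range(min(a[i], k // v[i]) + 1):
                if dp[i][k - c * v[i]]:
                    dp[i + 1][k] = True
                    break
    total = next(k for k in range(s, limit + 1) if dp[n][k])
    # retrace the table to recover how many coins of each type were used
    x = [0] * n
    k = total
    for i in range(n - 1, -1, -1):
        for c in range(min(a[i], k // v[i]) + 1):
            if dp[i][k - c * v[i]]:
                x[i] = c
                k -= c * v[i]
                break
    return x, total

# e.g. one 1-coin, two 2-coins, one 5-coin, paying s = 8 exactly:
# pay([1, 2, 5], [1, 2, 1], 8) -> ([1, 1, 1], 8)
```

This only minimizes the change `r`, as in the sketch above; to optimize the full objective `x1 + ... + xn - R(r)`, you would extend the DP state to also track the number of coins handed over for each reachable sum.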