I have two BigDecimals and I'd like to determine whether they are close based on their significant digits.
For example, consider the following:
BigDecimal million = new BigDecimal(1_000_000);
BigDecimal tenmillion = new BigDecimal(10_000_000);
BigDecimal a = new BigDecimal("55.89").multiply(million);
BigDecimal b = new BigDecimal("55.88").multiply(million);
BigDecimal c = new BigDecimal("55.88").multiply(tenmillion);
I would like to be able to determine that a and b are close (and that c is not) because they are only one significant digit off, even though in raw numbers they are separated by thousands. What's the best way to determine this?
For example, one could replace the least significant digit with 0, but that would make 55.81 == 55.89 while leaving 55.89 != 55.90, so it's obviously not a correct solution.
Take the maximum difference you will allow and multiply it by the same multiplier before using it as the range.
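A minimal sketch of that idea, assuming the multiplier applied to the original figures (million in the question) is known at comparison time; the helper name and the 0.05 tolerance are made up for illustration:

import java.math.BigDecimal;

// Close if the absolute difference is within maxDiff scaled by the same multiplier.
static boolean closeEnough(BigDecimal x, BigDecimal y, BigDecimal maxDiff, BigDecimal multiplier) {
    BigDecimal range = maxDiff.multiply(multiplier);
    return x.subtract(y).abs().compareTo(range) <= 0;
}

BigDecimal maxDiff = new BigDecimal("0.05"); // allow up to 0.05 before scaling
closeEnough(a, b, maxDiff, million);         // true:  |a - b| = 10,000 <= 50,000
closeEnough(a, c, maxDiff, million);         // false: |a - c| is hundreds of millions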
You need to know the precision to do anything useful here. After all, what happens if you are comparing 13450000 to 13450? They are very different, but if you just strip the zeroes they look the same. And if you compare the significant digits of both numbers, then 1234000 compared to 1235000 would say true, while 1234500 compared to 1235000 would say false, even though the difference is actually smaller in the second case.
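One way to make the precision explicit is to round both values to the same number of significant digits before comparing, e.g. with java.math.MathContext. A sketch, with the choice of 3 significant digits being arbitrary, and note it still misbehaves for values that straddle a rounding boundary:

import java.math.BigDecimal;
import java.math.MathContext;
import java.math.RoundingMode;

// Equal after rounding both values to the same number of significant digits.
// Rounding keeps the magnitude, so 13450000 and 13450 still compare as different.
static boolean sameToPrecision(BigDecimal x, BigDecimal y, int significantDigits) {
    MathContext mc = new MathContext(significantDigits, RoundingMode.HALF_UP);
    return x.round(mc).compareTo(y.round(mc)) == 0;
}

sameToPrecision(a, b, 3);                                                 // true:  both round to 55,900,000
sameToPrecision(new BigDecimal("13450000"), new BigDecimal("13450"), 3); // false: magnitudes differ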
Maybe what you should be doing here is looking at a percentage difference? In other words, divide one number by the other and look at how close the result is to 1.
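A sketch of the percentage approach, assuming a relative tolerance chosen for the data (1% here) and a non-zero divisor; the division needs a MathContext because the quotient may not terminate:

import java.math.BigDecimal;
import java.math.MathContext;

// Close if |x / y - 1| <= tolerance (a relative, scale-free comparison).
static boolean withinRelativeTolerance(BigDecimal x, BigDecimal y, BigDecimal tolerance) {
    BigDecimal ratio = x.divide(y, MathContext.DECIMAL64);
    return ratio.subtract(BigDecimal.ONE).abs().compareTo(tolerance) <= 0;
}

BigDecimal onePercent = new BigDecimal("0.01");
withinRelativeTolerance(a, b, onePercent); // true:  a / b is about 1.00018
withinRelativeTolerance(a, c, onePercent); // false: a / c is about 0.1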