Conceptually, I need to multiply the probabilities of each event in a coincidence. Since there may be very many events involved, I have the computer add the logarithms to avoid underflow.
But suddenly I can't convince myself that I should initialize the return value to zero before I start adding. I know zero is the identity element for addition, and I remember that this is the standard approach, but looking at a graph of the logarithm I can clearly see that the antilog of zero is negative infinity.
So initializing the return value to zero should be equivalent to multiplying all my probabilities by negative infinity, which is definitely not correct. What am I doing wrong?
The antilog of zero is one, not negative infinity; you're thinking of the logarithm of zero, which is what tends to negative infinity. Reading the graph the other way around: the curve passes through the point (1, 0), so exp(0) = 1. That means starting the sum at zero in log space is exactly the same as starting the product at one in probability space, so your initialization is correct.
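A minimal sketch of the pattern being discussed, using Python's standard library (the function name and example probabilities are mine, not from the original): summing log-probabilities starting from 0.0 and exponentiating the result recovers the same value as multiplying the probabilities starting from 1.0.

```python
import math

def log_prob_of_coincidence(probs):
    """Return the log of the product of the given probabilities."""
    total = 0.0  # log(1.0) == 0.0: the additive identity in log space
    for p in probs:
        total += math.log(p)  # multiplying probabilities = adding logs
    return total

# Hypothetical example probabilities for three independent events.
probs = [0.1, 0.25, 0.5]

log_total = log_prob_of_coincidence(probs)

# Multiplying directly, with the product initialized to 1.0.
direct = 1.0
for p in probs:
    direct *= p

# exp(log_total) matches the direct product (up to float rounding),
# confirming that initializing the sum to zero was correct.
print(math.isclose(math.exp(log_total), direct))
```

For long lists of small probabilities, the direct product underflows to 0.0 while the log-sum stays a perfectly representable (large negative) float, which is the whole point of working in log space.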