Jensen-Shannon divergence is always inf


I have used spaCy for sentence embeddings, where token vectors are combined by mean pooling and the embedding dimension is 300. Now I want to compute the Jensen-Shannon divergence between two sentences.

import numpy as np
from scipy.spatial import distance

a = np.array([[-3.28254097, -1.88733395,  0.45432703, ..., -3.11750075, -0.67481474,  0.16454482]])
b = np.array([[-3.14694506, -4.70102323,  1.50877541, ..., -1.04084573,  0.60134094,  1.34008962]])
distance.jensenshannon(a, b, axis=1)

The shape of both a and b is (1, 300).

I always get the divergence as inf. I have tested different pairs of sentences and the result is always the same. I have also tested with just the first N elements, and the divergence is still inf. It looks like this happens because the elements are negative numbers, so what workaround can I use to compute a valid JS divergence in this case?
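A note on why this happens: `scipy.spatial.distance.jensenshannon` treats its inputs as probability distributions (it normalizes each vector by its sum), so vectors with negative entries are not valid inputs and the result degenerates. One possible workaround, sketched below under the assumption that any monotone mapping to a probability simplex is acceptable for your use case, is to pass each embedding through a softmax so it becomes non-negative and sums to 1 before calling `jensenshannon`. The `softmax` helper and the random stand-in vectors here are my own illustration, not part of the original question:

```python
import numpy as np
from scipy.spatial import distance

def softmax(x):
    # Shift by the max for numerical stability, then exponentiate
    # and normalize so the result is a valid probability distribution.
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Hypothetical 300-d embedding vectors standing in for the real ones.
rng = np.random.default_rng(0)
a = rng.normal(size=300)
b = rng.normal(size=300)

# With softmax-normalized inputs the JS distance is finite,
# bounded by sqrt(ln 2) for the default natural-log base.
jsd = distance.jensenshannon(softmax(a), softmax(b))
print(jsd)
```

Note that `jensenshannon` returns the Jensen-Shannon *distance* (the square root of the divergence); square it if you need the divergence itself. Whether softmax is the right mapping depends on your application; other options include shifting/rescaling the vectors to be non-negative before normalizing.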

Thanks in advance


There are 0 answers