Principal Component Analysis - why aren't the eigenvectors' dot products zero?


I am trying to do Principal Component Analysis on the breast_cancer dataset using Python's sklearn, and I can't understand why the dot products between the eigenvectors (3 components) aren't zero:

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA

# Fit PCA with 3 components and take the component (eigenvector) rows
pca = PCA(n_components=3).fit(load_breast_cancer().data)
frst = pca.components_[0, :]
scnd = pca.components_[1, :]
thrd = pca.components_[2, :]
orth1 = np.dot(frst, scnd)
orth2 = np.dot(scnd, thrd)
print(orth1.real)
print(orth2.real)

out:

0.0

1.52655665886e-16


There is 1 answer

Skam

Floating-point arithmetic isn't always 100% accurate, since computers use a finite number of digits to represent numbers that may have infinitely many digits. 1.52655665886e-16 is on the order of machine epsilon (the upper bound on the relative error of a single floating-point operation, about 2.22e-16 for double precision), so I'd count it as 0.
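In practice this means you should compare such dot products against a small tolerance rather than against exact zero. A minimal sketch using the value from the question:

```python
import numpy as np

eps = np.finfo(float).eps   # machine epsilon for float64, ~2.22e-16
dot = 1.52655665886e-16     # the "non-zero" dot product from the question

# Compare against a small tolerance instead of testing for exact equality with 0:
print(dot < 10 * eps)        # True: within a few ulps of zero
print(np.isclose(dot, 0.0))  # True: np.isclose uses a default atol of 1e-8
```

`np.isclose` (or `np.allclose` for whole arrays) is the usual idiom for this kind of check.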

EDIT: You could also run into this issue if your matrix doesn't have distinct eigenvalues: for a repeated eigenvalue, any vector in the corresponding eigenspace is an eigenvector, so an eigensolver is free to return vectors from that subspace that are not orthogonal to each other.
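To illustrate the repeated-eigenvalue point, here is a small constructed example (the matrix is my own, not from the question): a symmetric matrix where the eigenvalue 2 is repeated, so two perfectly valid eigenvectors for it need not be orthogonal.

```python
import numpy as np

# Symmetric matrix whose eigenvalue 2 has multiplicity two (eigenvalues: 2, 2, 1)
A = np.diag([2.0, 2.0, 1.0])

# Two valid unit eigenvectors for eigenvalue 2 that are NOT orthogonal:
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)

print(np.allclose(A @ v1, 2 * v1))  # True: v1 is an eigenvector of A
print(np.allclose(A @ v2, 2 * v2))  # True: v2 is an eigenvector of A
print(np.dot(v1, v2))               # ~0.707, clearly not zero
```

sklearn's PCA computes components via the SVD, which always returns an orthonormal basis, so this doesn't affect `pca.components_` — but it can bite you with a general eigensolver on a matrix with degenerate eigenvalues.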