Latent Dirichlet Allocation on a Sparse Matrix


I'm trying to run topic modeling with the lda 1.0.2 package in Python.

My input is a sparse matrix of class 'scipy.sparse.csr.csr_matrix'.
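For context, this is roughly the shape of what I'm doing (the matrix below is a tiny illustrative stand-in, not my real data):

    import numpy as np
    from scipy.sparse import csr_matrix
    import lda

    # Tiny illustrative document-term count matrix
    # (rows = documents, columns = vocabulary terms).
    counts = csr_matrix(np.array([[1, 0, 2],
                                  [0, 3, 1]], dtype=np.int64))

    lda_class = lda.LDA(n_topics=2, n_iter=100, random_state=1)
    lda_class.fit(counts)  # on my real matrix, this call raises the IndexError below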

This doesn't appear to work: calling fit raises the following error:

>>> lda_class.fit(data)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\gw\AppData\Local\Continuum\Anaconda3\lib\site-packages\lda\lda.py", line 120, in fit
    self._fit(X)
  File "C:\Users\gw\AppData\Local\Continuum\Anaconda3\lib\site-packages\lda\lda.py", line 214, in _fit
    self._initialize(X)
  File "C:\Users\gw\AppData\Local\Continuum\Anaconda3\lib\site-packages\lda\lda.py", line 257, in _initialize
    w, d = WS[i], DS[i]
IndexError: index 0 is out of bounds for axis 0 with size 0
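Reading the traceback, the failure happens during initialization, when lda indexes the word/document arrays it builds from the non-zero entries of the matrix, so my guess is that the matrix is effectively empty, or has an unexpected dtype, by the time fit sees it. A quick sanity check I can run (where data stands for my actual matrix):

    print(type(data))   # should be scipy.sparse.csr_matrix
    print(data.shape)   # (number of documents, vocabulary size)
    print(data.dtype)   # lda expects integer counts
    print(data.nnz)     # stored non-zero entries; 0 here would be consistent
                        # with "axis 0 with size 0" in the IndexError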

Does anyone have any idea what the problem is and how to approach it?
