How to fix: "Error: cannot allocate vector of size 5.1 Gb" when running a spatial error model?


I'm running a spatial error model on a large dataset (n = 26,000) for a hedonic price analysis. I have built a nearest-neighbour (k = 10) spatial weights file and listw object. However, when I run the actual errorsarlm call, I get the following error: "Error: cannot allocate vector of size 5.1 Gb". I suspect this has to do with the large spatial weights matrix being created internally, but I haven't found a way around it.
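For what it's worth, a dense 26,000 x 26,000 double-precision matrix is almost exactly that size, which makes me suspect the dense matrix built internally by the default fitting method rather than the weights file itself:

n <- 26000
n^2 * 8 / 2^30   # bytes for a dense double matrix, in GiB: about 5.0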

I have already tried:

1. Clearing out my global environment
2. Reducing the number of columns in my original data frame to the bare minimum
3. Reducing the number of nearest neighbors to 5
4. Increasing my memory limit with memory.limit(size = 56000)
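(On that last point: memory.limit() is Windows-only, and on R 4.2+ it is defunct and has no effect.) As a quick sanity check that the weights object itself is not the culprit — a sketch, assuming the objects built in the code below:

format(object.size(step3_listw_CONDO20), units = "MB")
# a k = 10 listw for n = 26,000 should only be a few MB,
# so the allocation failure presumably happens inside the fitting routine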

library(spdep)        # knearneigh(), knn2nb(), nb2listw()
library(spatialreg)   # errorsarlm()
step1_knn_CONDO20 <- knearneigh(cbind(CONDO20$POINT_X, CONDO20$POINT_Y), k = 10)
step2_nb_CONDO20 <- knn2nb(step1_knn_CONDO20)       # knn object -> neighbours list
step3_listw_CONDO20 <- nb2listw(step2_nb_CONDO20)   # row-standardised weights (style "W")
CONDO_SEM_17_TEST <- errorsarlm(saleamount_num18LOG ~ var1 + var2 + var3,
                                data = CONDO20, listw = step3_listw_CONDO20,
                                tol.solve = 1e-20)
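If the failure comes from the default method = "eigen" (which works with a dense n x n matrix to get the eigenvalues), would a sparse Jacobian method sidestep the allocation entirely? A minimal sketch of what I understand the sparse variant to look like; note that method = "Matrix" wants symmetric (or similar-to-symmetric) weights, which k-nearest-neighbour sets usually are not, hence make.sym.nb():

nb_sym <- make.sym.nb(knn2nb(step1_knn_CONDO20))   # symmetrise the knn neighbours
listw_sym <- nb2listw(nb_sym, style = "W")         # row-standardised weights
CONDO_SEM_sparse <- errorsarlm(saleamount_num18LOG ~ var1 + var2 + var3,
                               data = CONDO20, listw = listw_sym,
                               method = "Matrix",  # sparse Cholesky; "LU" also avoids the dense matrix
                               tol.solve = 1e-20)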
