Until recently I was using rpy2 to load an R package into a Jupyter notebook running a Python kernel, and using it to read some data from a SQL Server database. A few days ago, I started seeing this error:
import rpy2.robjects as ro
R[write to console]: Error: cons memory exhausted (limit reached?)
Following this error, the Python kernel crashes and is restarted by Jupyter. I would like to understand this error: what is causing it, and why it has appeared when the same code previously ran without errors.
This answer, on r-devel, suggests that it is an R error produced by exceeding a maximum number of objects:
[Rd] How to debug: Cons memory exhausted
Other than this, there seems to be very little information available about this error.
The error occurs on the first line of my code. This, and the fact that the code ran previously, makes me think something is being cached from a previous session. I can import the data in R (using RStudio), so I think this is specific to Jupyter/rpy2 rather than a general R issue. It is not specific to this particular notebook, though: if I run the above import in another notebook, on a different kernel, I get the same error. So my questions are:
- Is an excess of R objects the only thing that causes this error, or might it be something else?
- Is this likely a caching problem or am I on the wrong track here?
- Where might these objects be cached? (and how can I safely clear them?)
I am using JupyterLab in a conda virtual environment on Windows 10 in a remote Amazon WorkSpace. I have no admin privileges. Here are some versions:
import session_info
session_info.show()
-----
rpy2 3.4.4
session_info 1.0.0
-----
Click to view modules imported as dependencies
-----
IPython 7.16.1
jupyter_client 6.1.6
jupyter_core 4.6.3
jupyterlab 2.1.5
notebook 6.0.3
-----
Python 3.7.7 (default, May 6 2020, 11:45:54) [MSC v.1916 64 bit (AMD64)]
Windows-10-10.0.14393-SP0
-----
Searching SO reveals many ways in which importing robjects can fail, but I can only find one question about this specific error (and it has no answers):
Tangentially, it seems as though this question might be germane:
Is it safe to manually delete all files in pkgs folder in anaconda python?
Is it recommended to clear out $HOME\AppData\Local\conda\conda\pkgs\cache regularly? (UPDATE: there were some quite large files in this cache folder, but clearing them did not solve the issue.)
UPDATE: I see the same error in Spyder, running outside the venv, so this isn't specific to Jupyter.
UPDATE: I can import rpy2 itself; a simple import rpy2 produces no error.
UPDATE:
import rpy2.situation as rps
for row in rps.iter_info():
    print(row)
gives:
rpy2 version:
3.4.4
Python version:
3.7.7 (default, May 6 2020, 11:45:54) [MSC v.1916 64 bit (AMD64)]
Looking for R's HOME:
Environment variable R_HOME: None
InstallPath in the registry: C:\Program Files\R\R-3.6.1
Environment variable R_USER: None
Environment variable R_LIBS_USER: None
R version:
In the PATH:
Loading R library from rpy2: OK
Additional directories to load R packages from:
None
C extension compilation:
Warning: Unable to get R compilation flags.
Environment variables are not set for rpy2. Could this mean that R_NSIZE is unset, and could that cause the error?
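As a possible experiment along these lines: R reads the R_NSIZE environment variable at startup to size its initial pool of cons cells (the "Ncells" reported by gc()), and "cons memory exhausted" is the message R emits when that pool cannot grow. A minimal sketch of testing this hypothesis, assuming the variable must be set before rpy2 initializes the embedded R (the value 2000000 is an arbitrary generous choice, not a recommended setting):

```python
import os

# Set R_NSIZE *before* importing rpy2.robjects, because rpy2 starts
# the embedded R on that import and R only reads R_NSIZE at startup.
# R's documented default initial allocation is 350000 cons cells;
# raising it should rule out a too-low limit as the cause.
os.environ["R_NSIZE"] = "2000000"

# import rpy2.robjects as ro  # must come after the environment is set
```

If the error persists with a much larger R_NSIZE, the cons-cell limit is probably not the culprit.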
I have fixed this by adding the system environment variables R_HOME and R_USER, and adding R to Path. [Windows 10, R 4.1.0, rpy2 3.4.5, Python 3.8]
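For users without admin privileges, the same fix can be applied per-process from Python before importing rpy2, since rpy2 consults these variables when it initializes the embedded R. A sketch, with hypothetical paths that must be adjusted to match your actual R installation:

```python
import os

# Hypothetical install location; adjust to your R version and path.
r_home = r"C:\Program Files\R\R-4.1.0"

os.environ["R_HOME"] = r_home
os.environ["R_USER"] = os.path.expanduser("~")
# Prepend R's binary directory so Windows can locate R.dll.
os.environ["PATH"] = (
    os.path.join(r_home, "bin", "x64") + os.pathsep + os.environ.get("PATH", "")
)

# import rpy2.robjects as ro  # now initializes against the configured R
```

Setting these in Python only affects the current process; setting them as user-level environment variables (as in the fix above) makes them apply to every session.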