R Object Occupies 4.5 Times as Much Memory to Load as Its Size on Disk

I'm trying to load a binary decision tree into R, but I get the error: Error: cannot allocate vector of size 2.0 Mb. The file itself is only 900 MB.

My PC's specs are:

Processor Intel(R) Core(TM) i5-8265U CPU @ 1.60GHz 1.80 GHz

Installed RAM 8.00 GB (7.85 GB usable)

System type 64-bit operating system, x64-based processor

With my working environment fully loaded, object_size() tells me I'm using only 600 B of memory. Yet when I clear the environment with the same list = ls() syntax, mem_change() finds 1.87 GB worth of objects to release:

> object_size(list = ls())
600 B
> mem_change(rm(list = ls()))
-1.87 GB
> object_size(list = ls())
48 B
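
As a sanity check, here's a sketch of how the per-object sizes could be summed instead (assuming the pryr package, which provides object_size() and mem_used()). I suspect object_size(list = ls()) measures only the character vector of object names, not the objects it names, which would explain the tiny 600 B figure:

library(pryr)

# Sum the size of each object in the global environment individually;
# get() fetches the object named by each string that ls() returns.
sizes <- sapply(ls(), function(nm) {
  as.numeric(object_size(get(nm, envir = globalenv())))
})
sum(sizes)   # total bytes held by the objects themselves

mem_used()   # pryr's estimate of the whole session's memory use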

With at least 1.87 GB of memory freed up in my working environment, I attempt to load the 900 MB file:

>   load("testtree.RData")
Error: cannot allocate vector of size 2.0 Mb
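
The obvious coding approach I can think of (a minimal sketch, assuming testtree.RData sits in the working directory) is to clear everything, force garbage collection, and load into a fresh environment:

rm(list = ls())        # drop every object from the global environment
gc()                   # force garbage collection so freed memory is returned
tree_env <- new.env()  # a fresh environment to receive the loaded objects
load("testtree.RData", envir = tree_env)
ls(tree_env)           # see what the file actually contained

Even then, load() has to materialise the full uncompressed objects in RAM; since save() compresses by default, a 900 MB .RData file can legitimately represent several GB of live objects.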

During the failed attempt, memory usage ballooned to over 4 GB. Here's the before:

[Screenshot: system memory usage before the load]

And here's what it looks like while trying to load:

[Screenshot: system memory usage during the load attempt]

My background processes are taking up 30-40% of memory. I've read that I can buy more RAM, but I'd rather optimize the use of what I already have.

Why does it take R 4.6 GB to open a 900 MB file? What is the actual memory limit on my working environment? Is there a coding solution for loading this file?
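
Regarding the "actual limit" part: I've read that Windows builds of R before 4.2 expose memory.limit(), which I believe reports and raises the per-session cap (a sketch; I haven't confirmed it helps here):

memory.limit()             # current cap for this R session, in MB
memory.limit(size = 7000)  # ask for a ~7 GB cap (the limit can only be raised)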
