My ultimate goal is to convert landcover raster (.tif) objects to sf objects that represent the raster's grid, keeping the original value of each cell in the corresponding geometry. I have been able to do this for smaller rasters as follows:
library(sf)
library(stars)
# import raster using stars
landcover_stars <- read_stars("my_raster.tif")
# convert to sf object using st_as_sf
landcover_grid_sf <- st_as_sf(landcover_stars)
For larger rasters, however (my largest is currently 11482 x 12607 cells), read_stars() imports the data as a "stars proxy" object, a mechanism the package uses to handle large raster datasets. Stars proxy objects are not accepted by st_as_sf(), but it is possible to force a full in-memory import by setting proxy = FALSE in read_stars(). If I do that with my largest dataset, however, running st_as_sf(landcover_stars) on the resulting object crashes my laptop (16 GB RAM, i7 2.70 GHz processor).
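To make the failing step concrete, this is roughly what I run on the large raster (the file name is a placeholder; proxy = FALSE is an argument of read_stars()):

library(sf)
library(stars)
# force a full in-memory read instead of a stars proxy
landcover_stars <- read_stars("my_raster.tif", proxy = FALSE)
# this is the step that exhausts memory on an 11482 x 12607 cell raster
landcover_grid_sf <- st_as_sf(landcover_stars)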
Is there a way to ease the load on my machine when converting very large stars objects to sf?
In addition, could it be that it is actually the newly generated sf object that is exhausting my machine's memory?
Here is a dummy raster in case you'd like to test it, with integer values randomly generated between 1 and 10 (built with the raster package):
library(raster)
dummy_raster <- raster(nrows = 12000, ncols = 12000, xmn = 0, xmx = 10, vals = floor(runif(12000 * 12000, min = 1, max = 11)))
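In case it is useful, here is a sketch of pushing that dummy through the same read_stars()/st_as_sf() steps described above (the GeoTIFF file name is arbitrary, and the last line will need a lot of memory):

library(raster)
library(stars)
library(sf)
# dummy_raster is the object created just above
writeRaster(dummy_raster, "dummy_landcover.tif", overwrite = TRUE)
# same workflow as before: full in-memory read, then one polygon per cell
dummy_stars <- read_stars("dummy_landcover.tif", proxy = FALSE)
dummy_grid_sf <- st_as_sf(dummy_stars)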