Read in large CSV file in R and export as multiple RData files using number of rows and skip

I'm attempting to import and export, in pieces, a single 10GB CSV file with roughly 10 million observations. I want about 10 manageable RData files in the end (data_1.RData, data_2.RData, etc.), but I'm having trouble making skip and nrows dynamic. My nrows will never change, as I need almost 1 million rows per dataset, but I think I'll need some equation for skip= so that it increases on each iteration to catch the next 1 million rows. Also, having header=T might mess up anything beyond ii = 1, since only the first row of the file contains the variable names. The following is the bulk of the code I'm working with:

for (ii in 1:10){
  data <- read.csv("myfolder/file.csv", 
                   row.names=NULL, header=T, sep=",", stringsAsFactors=F,
                   skip=0, nrows=1000000)
  outName <- paste("data",ii,sep="_")
  save(data,file=file.path(outPath,paste(outName,".RData",sep="")))
}
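For reference, the dynamic skip being asked about comes down to a simple arithmetic rule: chunk ii must skip the header line plus all previously read rows. A minimal sketch (skip_for_chunk is a hypothetical helper, assuming exactly one header line):

nrows <- 1000000
# Hypothetical helper: number of lines to skip before reading chunk ii.
# Chunk 1 skips only the header; chunk 2 skips the header plus 1,000,000 rows; etc.
skip_for_chunk <- function(ii) (ii - 1) * nrows + 1
skip_for_chunk(1)  # 1
skip_for_chunk(2)  # 1000001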

There is 1 answer

Answered by A5C1D2H2I1M1N2O1R2T1

(Untested but...) You can try something like this:

nrows <- 1000000
# Skip values: chunk i starts right after the header plus (i - 1) * nrows data rows,
# so the skips are 1, 1000001, 2000001, ..., 9000001
ind <- seq(from = 1, by = nrows, length.out = 10)
# Grab the column names once from the first line of the file
header <- names(read.csv("myfolder/file.csv", header = TRUE, nrows = 1))

for (i in seq_along(ind)) {
  data <- read.csv("myfolder/file.csv", 
                   row.names = NULL, header = FALSE, 
                   sep = ",", stringsAsFactors = FALSE,
                   skip = ind[i], nrows = nrows)
  names(data) <- header  # reapply the column names to each chunk
  outName <- paste("data", i, sep = "_")
  save(data, file = file.path(outPath, paste(outName, ".RData", sep = "")))
}
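The key design choice above is reading the variable names once (header = TRUE, nrows = 1) and then using header = FALSE for every chunk, so skip counts raw file lines and no chunk accidentally consumes the header or treats it as data. As a quick sanity check, assuming the loop has run and outPath is set, a chunk can be loaded back and inspected:

# Hypothetical check: load the first chunk and confirm its shape
load(file.path(outPath, "data_1.RData"))  # restores the object `data`
nrow(data)   # expect 1000000
names(data)  # expect the original CSV's column names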