I'm requesting a big file with 150K records, but the call throws a 'toString failed' error. nodejs/node#3175 says it's because of the maximum buffer size. The request works fine for 200 records, but it's an external API and the requirement is to get all records at once (there is no pagination :( ). Is there any way to set the buffer size for this request?
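One workaround that is often suggested for this particular error (not what I ended up doing, see the edit below) is to ask request for a raw Buffer by passing encoding: null, so the library never tries to turn the huge response into one string. It still keeps the whole response in memory, though. A minimal sketch, assuming the same URL as above:

const request = require('request');

request({ url: 'http://www.site-containing-big-data/api', encoding: null },
    function (error, response, body) {
        if (error) {
            return console.error(error);
        }
        // body is a Buffer here, not a string, so no toString() is attempted
        console.log('received', body.length, 'bytes');
    });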
I already asked this question here
EDIT:
request("http://www.site-containing-big-data/api",
function (error, response, body) {
console.log('got something to show');
if(!error && response.statusCode == 200) {
resolve(body);
}else if(error){
reject(error);
}
});
but nothing shows up in the console other than a 'toString failed' message.
Solved it. It was an XML file and I was trying to process the whole thing directly. Now I'm first saving it to a file and then using xml-stream to process each record one by one, roughly as in the sketch below.
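For reference, here is a minimal sketch of that approach. The temporary file name and the <record> element name are assumptions; request exposes the response as a readable stream, so it can be piped straight to disk, and xml-stream then parses the saved file one element at a time:

const fs = require('fs');
const request = require('request');
const XmlStream = require('xml-stream');

const FILE = 'big-data.xml'; // hypothetical temporary file

// 1. Stream the response straight to disk so the whole body is never
//    held in memory as one string.
request('http://www.site-containing-big-data/api')
    .on('error', function (err) { console.error('download failed:', err); })
    .pipe(fs.createWriteStream(FILE))
    .on('finish', function () {
        // 2. Parse the saved file element by element.
        const xml = new XmlStream(fs.createReadStream(FILE));

        // 'record' is a guess at the repeated element name in the API's XML
        xml.on('endElement: record', function (record) {
            // handle one record object at a time here
            console.log(record);
        });
    });

Depending on the XML layout it may even be possible to pipe the request stream into XmlStream directly and skip the intermediate file, but saving it first makes it easier to retry the parsing without re-downloading.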