I have a total of 25,000 image links, and I am trying to download the images to my local machine using the Node.js request package. It downloads about 14,000 to 15,000 of them, and after that I get the errors below.


{ Error: socket hang up
    at TLSSocket.onHangUp (_tls_wrap.js:1148:19)
    at Object.onceWrapper (events.js:313:30)
    at emitNone (events.js:111:20)
    at TLSSocket.emit (events.js:208:7)
    at endReadableNT (_stream_readable.js:1064:12)
    at _combinedTickCallback (internal/process/next_tick.js:139:11)
    at process._tickCallback (internal/process/next_tick.js:181:9)
  code: 'ECONNRESET',
  path: null,
  host: 'factory.jcrew.com',
  port: 443,
  localAddress: undefined }
      throw er; // Unhandled stream error in pipe.

Error: EMFILE: too many open files, open 'C:\sangram\fiverr\New folder\public\JCREWFCT\99105154564.png'

Code for download

var fs = require('fs');
var request = require('request');

var download = function (uri, filename, callback) {
  request.head(uri, function (err, res, body) {
    if (err) return callback(err);
    request(uri).pipe(fs.createWriteStream(filename)).on('close', callback);
  });
};

download(d.image_link_1, saveDir, function () {
  // ...
});

Can anyone guide me on what I need to do to download this many files at a time?

1 Answer

RKalra On

The main problem I see is the EMFILE error: too many files are open at once, so open and readdir calls should be queued rather than all issued immediately. The graceful-fs module does exactly that as a drop-in replacement for fs.
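Another way to get that queuing effect, without a dependency, is to cap how many downloads run at once. Below is a minimal sketch of such a limiter; `runLimited` and the task wrappers are illustrative names, not part of the request package:

```javascript
// A minimal concurrency limiter: runs at most `limit` tasks at a time,
// so file descriptors and sockets are opened in small batches instead
// of all 25,000 at once. Each task is a function returning a Promise.
function runLimited(tasks, limit) {
  return new Promise(function (resolve, reject) {
    var next = 0;    // index of the next task to start
    var active = 0;  // number of tasks currently running
    var done = false;

    function launch() {
      if (done) return;
      if (next >= tasks.length && active === 0) {
        done = true;
        return resolve();
      }
      // Fill the pool up to `limit` running tasks.
      while (active < limit && next < tasks.length) {
        active++;
        tasks[next++]().then(function () {
          active--;
          launch(); // a slot freed up; start the next task
        }, function (err) {
          done = true;
          reject(err);
        });
      }
    }
    launch();
  });
}
```

You would wrap each image download in a task, e.g. `tasks.push(function () { return downloadAsPromise(url, file); });`, and then call `runLimited(tasks, 50)` with whatever limit stays under your OS file-descriptor cap.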

The other problem you might run into with such a large number of async requests has to do with connection pooling and the allowed maxSockets. If that happens, set pooling to false or set the maxSockets parameter appropriately. Alternatively, instead of setting maxSockets on each request, you can set it on the global agent (https here, since these downloads go over port 443), like:

var https = require('https');
https.globalAgent.maxSockets = 30000;

Check out: https://github.com/request/request