Uploading to S3 with Node/Knox : socket hang up

I'm trying to get() files and stream them to my s3 bucket, could someone tell me what I'm doing wrong?

My code:

var knox = require('knox');
var request = require('request');

var client = knox.createClient({
    key: 'AAAAAAAAAAAA',
    secret: 'BBBBBBBBBBBB',
    bucket: 'my-imgs'
});

var elem = list.shift(); // {_id: 'filename.jpg', main_img: 'http://example.com/file.jpg'}

request.get(elem.main_img, function(err, res) {
    var headers = {
        'Content-Length': res.headers['content-length'],
        'Content-Type': res.headers['content-type'],
        'x-amz-acl': 'public-read'
    };
    console.log(headers); // headers are OK, the error comes after this
    var req = client.putStream(res, elem._id, headers, function(err, s3res) {
        if (err) console.log(err);
        console.log(s3res);
    });
}).on('error', function(err) {
    console.log(err);
});

The headers object is filled properly, the request doesn't emit any error, and after a few seconds I get:

{ [Error: socket hang up] code: 'ECONNRESET' }

Do I have to configure my bucket in a particular way for it to accept transfers? I just created it through the AWS console, in 'US Standard', and have done only one thing: added a policy that I thought would allow uploads. I have a feeling that this may be the problem, but I have no idea how to fix it, and all the tutorials I've found are very outdated. Please give me a clue!

The policy:

{
    "Statement": [
        {
            "Sid": "allow-public-read",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::my-imgs/*"
        },
        {
            "Sid": "allow-public-put",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::my-imgs/*"
        }
    ]
}
There are 2 answers

Hitu Bansal

You can try this in your app.js:

var http = require('http');

// increase the max socket limit
http.globalAgent.maxSockets = 1024;
ezabaw

Last week I was struggling with this same issue. Luckily, I found the solution: for it to work, you need to end your S3 request:

var req = client.putStream(res, elem._id, headers, function(err, s3res) {
    if (err) console.log(err);
    console.log(s3res);
});
req.end();

Or simply:

client.putStream(res, elem._id, headers, function(err, s3res) {
    if (err) console.log(err);
    console.log(s3res);
}).end();