There is a nice example available for uploading large files to S3 via the aws-sdk-js library, but unfortunately it uses the Node.js fs module.
Is there a way to achieve the same thing in plain JavaScript? Here is a nice Gist as well which breaks a large file down into smaller chunks, but it is still missing the .pipe functionality of Node.js fs, which is required to pass a stream to the aws-sdk-js upload function. Here is the relevant code snippet in Node:
var AWS = require('aws-sdk');
var fs = require('fs');
var zlib = require('zlib');

// Stream the file from disk and gzip it on the fly.
var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());

var s3obj = new AWS.S3({params: {Bucket: 'myBucket', Key: 'myKey'}});
s3obj.upload({Body: body})
  .on('httpUploadProgress', function(evt) {
    console.log('Progress:', evt.loaded, '/', evt.total);
  })
  .send(function(err, data) { console.log(err, data); });
Is there something similar available in plain JavaScript (not Node.js), usable with Rails?
Specifically, I am looking for an alternative to the following line in plain JS:
var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());
The same link you provided contains an implementation intended for the Browser, and it also uses the AWS client SDK.
** EDITS **
Note that the documentation for the Body field includes Blob, which means streaming will occur. You can also use the event emitter convention offered by the AWS SDK's ManagedUpload interface if you want to monitor progress. Here is an example:
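The snippet below is a minimal browser-side sketch, not a drop-in implementation: it assumes the browser build of the SDK is loaded, AWS.config already holds credentials and a region, and the file comes from an <input type="file"> element with the (hypothetical) id fileInput and a placeholder bucket name.

// s3.upload() returns an AWS.S3.ManagedUpload, which emits 'httpUploadProgress'.
var file = document.getElementById('fileInput').files[0]; // a File, which is a Blob

var s3 = new AWS.S3({params: {Bucket: 'myBucket', Key: file.name}});

s3.upload({Body: file})
  .on('httpUploadProgress', function(evt) {
    console.log('Progress:', evt.loaded, '/', evt.total);
  })
  .send(function(err, data) { console.log(err, data); });

Because Body is a Blob, ManagedUpload can take care of splitting a large file into parts for you, so manual chunking is only needed if you want finer control.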
If you want to read the file from the local system in chunks before sending them to s3.uploadPart, you will want to do something with Blob.slice, perhaps defining your own pipe chain.
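If you do take that manual route, a rough sketch (under the same assumptions as above: a fileInput element and a placeholder bucket name; error handling and retries omitted) could look like this:

// Slice the Blob into 5 MB parts and drive the multipart-upload calls ourselves.
var file = document.getElementById('fileInput').files[0];
var s3 = new AWS.S3();
var bucket = 'myBucket';
var key = file.name;
var partSize = 5 * 1024 * 1024; // S3's minimum part size (except for the last part)

s3.createMultipartUpload({Bucket: bucket, Key: key}, function(err, multipart) {
  if (err) { return console.log(err); }

  var parts = [];

  function uploadNextPart(partNumber, offset) {
    if (offset >= file.size) {
      // All parts sent; ask S3 to stitch them together.
      return s3.completeMultipartUpload({
        Bucket: bucket,
        Key: key,
        UploadId: multipart.UploadId,
        MultipartUpload: {Parts: parts}
      }, function(err, data) { console.log(err, data); });
    }

    var chunk = file.slice(offset, offset + partSize); // Blob.slice returns a smaller Blob

    s3.uploadPart({
      Bucket: bucket,
      Key: key,
      PartNumber: partNumber,
      UploadId: multipart.UploadId,
      Body: chunk
    }, function(err, data) {
      if (err) { return console.log(err); }
      parts.push({ETag: data.ETag, PartNumber: partNumber});
      uploadNextPart(partNumber + 1, offset + partSize);
    });
  }

  uploadNextPart(1, 0);
});

This uploads the parts sequentially for simplicity; you could also fire several uploadPart calls in parallel as long as you record each part's ETag and PartNumber for completeMultipartUpload.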