How to upload local files to S3 quickly?


I have an Angular app that I need to upload to S3. I've tried using the AWS Console, but I have a lot of directories. How can I upload all my files quickly? Thanks.

Is it possible to send all the files at once?


There are 4 answers

Asdfg (Best answer)
aws s3 cp --recursive <localfolderpath> s3://<bucketname>/<key>/
Chris

There are several options for uploading files to an S3 bucket:

  • Via the AWS Console: This option is quite straightforward. From the AWS Console, go to S3, locate the bucket you want to upload to, navigate to the folder where you want to store the files, click Add files / Add folder, drag and drop the files into the placeholder, and click Upload.

  • Using the AWS CLI (version 2 is recommended): This option requires CLI access to S3, which means you have to configure credentials on your local machine. Then run aws commands:

    aws s3 cp /path/to/local/file s3://bucket-name/path/to/s3/location/
    aws s3 cp /galireview/galireview-demobucket-file s3://galireview-demobucket

or, to upload folders:

aws s3 cp /path/to/local/folder s3://bucket-name/path/to/s3/location/ --recursive
aws s3 cp /galireview/galireview-demobucket s3://galireview-demobucket --recursive
  • Using an AWS SDK: This option is for writing scripts or applications. Here is an example using Python (boto3):
    # Import the boto3 library
    import boto3
    
    # Replace these values with your AWS credentials and S3 bucket name.
    # (Hard-coding credentials is for demonstration only; prefer environment
    # variables or IAM roles in real applications.)
    aws_access_key_id = 'YOUR_ACCESS_KEY_ID'
    aws_secret_access_key = 'YOUR_SECRET_ACCESS_KEY'
    bucket_name = 'YOUR_BUCKET_NAME'
    
    # The local file path that you want to upload
    local_file_path = '/path/to/local/file.txt'
    
    # The S3 key (file name) under which the file will be stored
    s3_key = 'folder/file.txt'
    
    # Create an S3 client
    s3 = boto3.client('s3', aws_access_key_id=aws_access_key_id, aws_secret_access_key=aws_secret_access_key)
    
    # Upload the file to S3
    try:
        s3.upload_file(local_file_path, bucket_name, s3_key)
        print(f"File {local_file_path} successfully uploaded to S3 bucket {bucket_name} as {s3_key}")
    except Exception as e:
        print(f"Error uploading file to S3: {e}")
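Since the question is about uploading many directories, note that `upload_file` handles only a single object. A minimal sketch (hypothetical helper names `s3_key_for` and `upload_directory`, assuming boto3 credentials are configured as above) can walk a local folder and upload every file, mirroring the directory structure under an S3 prefix:

```python
import os
import posixpath


def s3_key_for(local_path, root, prefix=""):
    """Map a local file path to an S3 key, preserving the folder structure."""
    rel = os.path.relpath(local_path, root)
    # S3 keys always use forward slashes, regardless of the local OS
    key = rel.replace(os.sep, "/")
    return posixpath.join(prefix, key) if prefix else key


def upload_directory(root, bucket_name, prefix=""):
    """Upload every file under `root` to `bucket_name` under `prefix`."""
    import boto3  # imported here so the key helper works without boto3 installed
    s3 = boto3.client("s3")
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            local_path = os.path.join(dirpath, name)
            s3.upload_file(local_path, bucket_name,
                           s3_key_for(local_path, root, prefix))
```

For example, `upload_directory("dist", "my-bucket", "app")` would put `dist/assets/logo.png` at `s3://my-bucket/app/assets/logo.png`. In practice, though, `aws s3 sync` (see the next answer) does the same job with far less code.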

For more information, refer to: How to upload data from local machine to Amazon S3

Alessandro Hoss

I guess the easiest way is to use the aws-cli, as described here:

The sync command has the following form. Possible source-target combinations are:

  • Local file system to Amazon S3
  • Amazon S3 to local file system
  • Amazon S3 to Amazon S3

$ aws s3 sync <source> <target> [--options]

The following example synchronizes the contents of an Amazon S3 folder named path in my-bucket with the current working directory. s3 sync updates any files that have a different size or modified time than files with the same name at the destination. The output displays specific operations performed during the sync. Notice that the operation recursively synchronizes the subdirectory MySubdirectory and its contents with s3://my-bucket/path/MySubdirectory.

$ aws s3 sync . s3://my-bucket/path

upload: MySubdirectory\MyFile3.txt to s3://my-bucket/path/MySubdirectory/MyFile3.txt

upload: MyFile2.txt to s3://my-bucket/path/MyFile2.txt

upload: MyFile1.txt to s3://my-bucket/path/MyFile1.txt

You can also check the AWS SDK in these docs:

There is a section listing all available languages and platforms for the SDK: SDK

And here are some examples of using the SDK for JavaScript with Amazon S3.

kichik

You can also upload directly from Webpack with webpack-s3-plugin or s3-website.

var S3Plugin = require('webpack-s3-plugin');

var config = {
  plugins: [
    new S3Plugin({
      // Exclude uploading of html
      exclude: /.*\.html$/,
      // s3Options are required
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
        region: 'us-west-1'
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      }
    })
  ]
}