Unable to access files from a public S3 bucket with boto


I recently created a new AWS account (let's call it "Account A") and created an S3 bucket in this account (let's call it "bucketa"), uploading a file foo.txt. Following advice from the internet, I set up what I believe to be the most permissive bucket policy possible (it should allow any type of access by any user):

{
  "Version": "2012-10-17",
  "Id": "PolicyABCDEF123456",
  "Statement": [
    {
      "Sid": "StmtABCDEF123456",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::bucketa/*",
        "arn:aws:s3:::bucketa"
      ]
    }
  ]
}
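For reference, the same policy can be assembled as a Python dict and serialized with the standard json module, which makes it easy to tweak programmatically before attaching it (boto buckets expose a set_policy method for this). A sketch:

```python
import json

# The permissive policy from above, built as a plain dict (illustration only).
policy = {
    "Version": "2012-10-17",
    "Id": "PolicyABCDEF123456",
    "Statement": [
        {
            "Sid": "StmtABCDEF123456",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::bucketa/*",
                "arn:aws:s3:::bucketa",
            ],
        }
    ],
}

# Serialize to the JSON form S3 expects; this string could then be attached
# with something like b.set_policy(policy_json) on a boto Bucket object.
policy_json = json.dumps(policy, indent=2)
print(policy_json)
```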

After creating an IAM user for Account A ("Identity & Access Management -> Users -> Create New Users", with "Generate an access key for each user" checked) and storing that user's credentials in ~/.boto, a simple script using the boto S3 interface can read the uploaded file foo.txt:
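For reference, a minimal ~/.boto credentials file looks like this (the key values below are AWS's documented placeholder examples, not real credentials):

```ini
[Credentials]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFXEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
```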

import boto

conn = boto.connect_s3()                        # picks up credentials from ~/.boto
b = conn.get_bucket("bucketa", validate=False)  # skip the initial HEAD on the bucket
k = boto.s3.key.Key(b)
k.key = "foo.txt"
print len(k.get_contents_as_string())
# 9

I then created a new AWS account (let's call it "Account B"), followed the same steps, stored the new IAM user's credentials in the .boto file, and ran the same Python script. This time, however, I get a 403 error when executing the line print len(k.get_contents_as_string()):

Traceback (most recent call last):
  File "access.py", line 7, in <module>
    print len(k.get_contents_as_string())
  File "/usr3/josilber/.local/lib/python2.7/site-packages/boto/s3/key.py", line 1775, in get_contents_as_string
    response_headers=response_headers)
  File "/usr3/josilber/.local/lib/python2.7/site-packages/boto/s3/key.py", line 1643, in get_contents_to_file
    response_headers=response_headers)
  File "/usr3/josilber/.local/lib/python2.7/site-packages/boto/s3/key.py", line 1475, in get_file
    query_args=None)
  File "/usr3/josilber/.local/lib/python2.7/site-packages/boto/s3/key.py", line 1507, in _get_file_internal
    override_num_retries=override_num_retries)
  File "/usr3/josilber/.local/lib/python2.7/site-packages/boto/s3/key.py", line 343, in open
    override_num_retries=override_num_retries)
  File "/usr3/josilber/.local/lib/python2.7/site-packages/boto/s3/key.py", line 291, in open_read
    self.resp.reason, body)
boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>7815726085F966F2</RequestId><HostId>EgFfldG4FoA5csuUVEKBq15gg3QQlQbPyqnyZjc2fp5DewlqDZ4F4HNjXYWQtBl5MUlSyLAOeKA=</HostId></Error>

Why is Account B not able to access bucketa despite its very permissive bucket policy? Are there additional permissions I need to set to enable public access by other AWS accounts?

Note: I have already ruled out invalid credentials in Account B's .boto file as the culprit by creating an S3 bucket bucketb from Account B with the same bucket policy ("bucketa" replaced with "bucketb" in the two Resource lines). In that case I can access bucketb with Account B's credentials but get the same 403 error when using Account A's credentials.

Accepted answer, by garnaat:

The policy you have allows anonymous users access to the bucket. In your case, however, Account B is not an anonymous user; it is an authenticated AWS user, and if you want that user to have access you would need to grant it explicitly in the policy. Alternatively, you can just access the bucket anonymously in boto:

conn = boto.connect_s3(anon=True)

That should do the trick. It probably goes without saying, but I wouldn't leave that policy as-is: it allows anyone to dump anything they want into the bucket.
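If you instead want Account B's users to access bucketa with their own credentials, the bucket policy can name that account explicitly rather than using Principal "*". A sketch, where 111122223333 stands in for Account B's (hypothetical) account ID, granting read-only access:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowAccountBRead",
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::111122223333:root"},
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::bucketa/*",
        "arn:aws:s3:::bucketa"
      ]
    }
  ]
}
```

Granting to the account's root ARN delegates access to the whole account; an administrator in Account B can then scope it further with that account's own IAM policies.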