S3 feature to publish to multiple buckets


We are currently publishing data to an S3 bucket. We now have multiple clients who consume the data we store in that bucket, and each client wants their own bucket. The requirement is to publish the data to each client's bucket.

Option 1: Have our publisher write to each client's S3 bucket directly. Cons: more logic in our publishing application, and we would have to handle failures/retries per client.

Option 2: Use S3 Cross-Region Replication. Reasons against it: although we can replicate objects to other accounts, only one destination bucket can be specified, and if the source bucket uses server-side encryption we cannot replicate.

Option 3: AWS Lambda. Have S3 invoke a Lambda function that publishes to the other buckets. Confusion: not sure how different this really is from option 1.

Option 4: Grant clients read-only access to our S3 bucket and have them read from it directly. But how would a client know whether it has already read an object? I would rather not use time-based folders: we have multiple publishers writing to this bucket, so clients can't know for sure whether a folder is complete.

Is there any good option to solve the above problem?


1 Answer

Mark B

I would go with option 3, Lambda. Your Lambda function could be triggered by S3 events so you wouldn't have to add any manual steps or change your current publishing process at all.
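
A minimal sketch of what that Lambda function could look like, assuming Python with boto3 and hypothetical destination bucket names (the function's execution role would also need read access to the source bucket and write access to each destination bucket):

```python
import urllib.parse

import boto3

s3 = boto3.client("s3")

# Hypothetical destination buckets -- replace with your clients' bucket names.
DESTINATION_BUCKETS = ["client-a-bucket", "client-b-bucket"]

def lambda_handler(event, context):
    """Triggered by S3 ObjectCreated events; copies each new object to every client bucket."""
    for record in event["Records"]:
        source_bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 event notifications.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        for destination in DESTINATION_BUCKETS:
            # copy_object performs a server-side copy, so the object data
            # never flows through the Lambda function itself.
            s3.copy_object(
                Bucket=destination,
                Key=key,
                CopySource={"Bucket": source_bucket, "Key": key},
            )
```

This way, failures and retries for a copy to one client's bucket are handled by the Lambda invocation rather than inside your publishing application, which addresses the main downside you listed for option 1.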