Is there a way to pass file size as an input parameter from an AWS S3 bucket to a Step Functions state machine?


I'm struggling to configure automated input to my AWS Step Functions state machine. I'm trying to set up a state machine that is triggered whenever an object-created event occurs in a certain S3 bucket. The input is passed to a Choice state that checks the file size: if the file is small enough, it invokes a Lambda function to process the file contents; if the file is too large, it invokes another Lambda to split the file into pieces of manageable size and then invokes the first Lambda on each piece. The problem is that I can't figure out how to pass the file size in as input to the state machine.
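For context, the Choice state only needs the file size to appear somewhere in the execution input. Here is a minimal sketch of what that state could look like in the Amazon States Language, expressed as a Python dict; the `fileSize` field and the `ProcessFile`/`SplitFile` state names are hypothetical placeholders, not anything from an actual definition:

```python
import json

SIZE_LIMIT_BYTES = 100 * 1024 * 1024  # example threshold: 100 MB

choice_state = {
    "CheckFileSize": {
        "Type": "Choice",
        "Choices": [
            {
                # Route large files to the splitting Lambda
                "Variable": "$.fileSize",
                "NumericGreaterThan": SIZE_LIMIT_BYTES,
                "Next": "SplitFile",
            }
        ],
        # Small files go straight to the processing Lambda
        "Default": "ProcessFile",
    }
}

print(json.dumps(choice_state, indent=2))
```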

I am generally aware of how input is passed to Step Functions, and I know that S3 Lambda triggers include the object size as part of the event, but I still haven't found a practical way to pass that file size as an input parameter to a Step Functions state machine.

I would greatly appreciate any help on this issue and am happy to clarify or answer any questions that you have to the best of my ability. Thank you!


1 Answer

Answered by Pooya Paridel

Currently, S3 events can't trigger Step Functions directly, so one option is to configure an S3 event notification that invokes a Lambda function. That Lambda acts as a proxy: it reads the file info from the S3 event (including the object size), picks out only the data you need, and starts the state machine execution with that data as input.
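A minimal sketch of such a proxy Lambda in Python with boto3; the state machine ARN and the `fileSize` input field name are placeholders you'd adapt to your setup:

```python
import json
import boto3

sfn = boto3.client("stepfunctions")

# Hypothetical ARN; in a real deployment you'd typically read this from an
# environment variable.
STATE_MACHINE_ARN = "arn:aws:states:us-east-1:123456789012:stateMachine:ProcessFile"


def lambda_handler(event, context):
    """Triggered by an S3 object-created notification: extracts bucket, key,
    and object size, then starts a state machine execution with that input."""
    for record in event["Records"]:
        s3_info = record["s3"]
        execution_input = {
            "bucket": s3_info["bucket"]["name"],
            "key": s3_info["object"]["key"],
            # Object size in bytes, taken straight from the S3 event
            "fileSize": s3_info["object"]["size"],
        }
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps(execution_input),
        )
```

The Choice state can then branch on `$.fileSize` exactly as described in the question.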

The other option is to configure the state machine as a target of a CloudWatch Events (EventBridge) rule, which starts an execution when objects are added to the Amazon S3 bucket.
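A sketch of that second option using boto3; every name and ARN below is a placeholder, and the pattern assumes CloudTrail data events are enabled for the bucket so that object-level S3 API calls are delivered to CloudWatch Events:

```python
import json
import boto3

events = boto3.client("events")

# Placeholder ARNs for the state machine and for an IAM role that allows
# CloudWatch Events to call states:StartExecution.
STATE_MACHINE_ARN = "arn:aws:states:us-east-1:123456789012:stateMachine:ProcessFile"
EVENTS_ROLE_ARN = "arn:aws:iam::123456789012:role/EventsStartExecutionRole"

# Match S3 object uploads to the bucket (delivered via CloudTrail)
event_pattern = {
    "source": ["aws.s3"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {
        "eventSource": ["s3.amazonaws.com"],
        "eventName": ["PutObject", "CompleteMultipartUpload"],
        "requestParameters": {"bucketName": ["my-input-bucket"]},
    },
}

events.put_rule(
    Name="start-state-machine-on-upload",
    EventPattern=json.dumps(event_pattern),
    State="ENABLED",
)

# Point the rule at the state machine
events.put_targets(
    Rule="start-state-machine-on-upload",
    Targets=[
        {"Id": "state-machine", "Arn": STATE_MACHINE_ARN, "RoleArn": EVENTS_ROLE_ARN}
    ],
)
```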

The first option is more flexible.