AWS Lambda task timed out with a large file when processing data from an S3 bucket


I have a 120 MB data file in my S3 bucket. I am loading it into Lambda with Python pandas and processing it, but after 15 minutes (the time set in the timeout option under basic settings) it gives me a "task timed out" error and stops. The same processing done locally in Sublime Text and the terminal takes only 2-3 minutes. What is the problem and how can I solve it? Thanks in advance.
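For context, the handler presumably looks something along these lines; this is only a minimal sketch, assuming a hypothetical bucket name and object key, a CSV file, and pandas being bundled with the function (it is not in the standard Lambda runtime):

    import boto3
    import pandas as pd

    s3 = boto3.client("s3")

    def lambda_handler(event, context):
        # Download the ~120 MB object into Lambda's ephemeral /tmp storage
        # (bucket and key are hypothetical placeholders).
        s3.download_file("my-bucket", "data.csv", "/tmp/data.csv")

        # Load and process with pandas; the processing step here is a placeholder.
        df = pd.read_csv("/tmp/data.csv")
        summary = df.describe()

        return {"rows": len(df)}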

1 Answer

Answered by Chris Williams:

If the same job takes significantly less time on your local machine, compare the resources available there with what the Lambda function is allocated. Increasing the amount of memory available to your Lambda can significantly improve performance when the function is memory-constrained, and it also increases the amount of CPU allocated to it.
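The memory setting can be changed in the console under Configuration, or programmatically. A hedged sketch using boto3, assuming a hypothetical function name:

    import boto3

    lambda_client = boto3.client("lambda")

    # Raise the memory allocation; CPU scales proportionally with memory.
    lambda_client.update_function_configuration(
        FunctionName="process-s3-file",  # hypothetical function name
        MemorySize=3008,                 # in MB
    )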

If there are large volumes of data, can they be moved into EFS? A Lambda function can have an EFS file system mounted and access it as if it were local storage. Doing this removes the download step from your Lambda script, which then only has to do the processing.
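With EFS attached, the handler reduces to reading a local path. A minimal sketch, assuming a hypothetical mount path of /mnt/data configured on the function and the file already copied there:

    import pandas as pd

    EFS_PATH = "/mnt/data/data.csv"  # hypothetical mount path and file name

    def lambda_handler(event, context):
        # The EFS mount behaves like local storage, so pandas can read it directly.
        df = pd.read_csv(EFS_PATH)
        return {"rows": len(df)}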

Finally, if neither of the above cuts down the execution time, look at whether you can break the Lambda up into smaller Lambda functions and orchestrate them via Step Functions. This lets you create a chained sequence of Lambda functions that together perform the original operation of the single Lambda function.
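As a rough sketch of that orchestration, the chain can be expressed as an Amazon States Language definition and created with boto3; the function ARNs, state machine name, and IAM role below are hypothetical placeholders:

    import json
    import boto3

    sfn = boto3.client("stepfunctions")

    # Two-step chain: a "load" function followed by a "process" function.
    definition = {
        "StartAt": "LoadData",
        "States": {
            "LoadData": {
                "Type": "Task",
                "Resource": "arn:aws:lambda:us-east-1:123456789012:function:load-data",
                "Next": "ProcessData",
            },
            "ProcessData": {
                "Type": "Task",
                "Resource": "arn:aws:lambda:us-east-1:123456789012:function:process-data",
                "End": True,
            },
        },
    }

    sfn.create_state_machine(
        name="process-s3-file-pipeline",
        definition=json.dumps(definition),
        roleArn="arn:aws:iam::123456789012:role/StepFunctionsRole",
    )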