I use containerized Lambda functions.
This is the Dockerfile:
FROM public.ecr.aws/lambda/python:3.10
COPY requirements.txt .
RUN pip install -r requirements.txt
RUN pip install --upgrade boto3 # Just to be safe
RUN pip install --upgrade botocore # Just to be safe
COPY ./main.py ${LAMBDA_TASK_ROOT}
CMD ["main.lambda_handler"]
This is main.py:
import asyncio
import aioboto3
def lambda_handler(event, context):
    key = "test"
    bucket = "test"
    loop = asyncio.get_event_loop()
    result = loop.run_until_complete(test_key(bucket, key))
    return result

async def test_key(bucket: str, key: str):
    try:
        session = aioboto3.Session()
        async with session.client("s3") as client:
            response = await client.head_object(Bucket=bucket, Key=key)
            return True
    except Exception as e:
        return False
This is requirements.txt:
aioboto3
asyncio
When invoking the function (either locally or in the deployed Lambda environment), I get the following error:
{"errorMessage": "Unable to import module 'main': No module named 'botocore.compress'", "errorType": "Runtime.ImportModuleError", "requestId": "87dcc2be-d84c-4725-8f2f-663219c6f43c", "stackTrace": []}
It appears that this has to do with how AWS runs the Python images. From my experiments outlined below, it does not appear that Lambda uses the botocore inside the image; the image is treated as just a "box" for the files. The execution environment, including the version of botocore, is provided by the Lambda service.
At the time of writing, the Python 3.9 and 3.10 images are using boto3+botocore 1.26.90, while Python 3.11 is using 1.34.34. As Bert has highlighted above, botocore>=1.31.14 is required for botocore.compress, so currently Python 3.9 and 3.10 would not be able to run aioboto3 at all.
The test was conducted with a simple lambda_handler that reports the boto3/botocore versions the runtime actually loads, packaged with a Dockerfile like the one above; a sketch of such a handler follows below. The results: public.ecr.aws/lambda/python:3.11 returns 1.34.34, while public.ecr.aws/lambda/python:3.10 and public.ecr.aws/lambda/python:3.9 both return 1.26.90.
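A minimal version-reporting handler along these lines could look like the following (this is a sketch, not the exact test code; the main.lambda_handler wiring from the Dockerfile above and the response shape are assumptions):
import boto3
import botocore

def lambda_handler(event, context):
    # Report which boto3/botocore the runtime actually imports, and from where;
    # the module path shows whether the copy baked into the image or the one
    # supplied by the execution environment is being used.
    return {
        "boto3": boto3.__version__,
        "botocore": botocore.__version__,
        "botocore_path": botocore.__file__,
    }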