How to upload to S3 using boto3 asynchronously


I am using aiobotocore to make uploads to S3 asynchronous, but I am getting the error "cannot reuse already awaited queue" when using the same session for a subsequent request.

The client is created at class level and reused for subsequent calls:

import os
import logging
from io import BytesIO

from aiobotocore.session import get_session

logger = logging.getLogger(__name__)


class s3Bucket:
    def __init__(self) -> None:
        # Configure S3 credentials and bucket information from the environment
        self.access_key = os.getenv('access_key')
        self.secret_key = os.getenv('aws_secret_key')
        self.bucket_name = os.getenv('bucket_name')
        self.bucket_region = os.getenv('bucket_region')
        self.aws_s3_bucket_http_path = os.getenv('aws_s3_bucket_http_path')
        self.quality = 60  # level of compression (1-95)
        self.logger = logger
        # Client is created once here so it can be reused for every upload
        self.client = get_session().create_client(
            's3',
            aws_access_key_id=self.access_key,
            aws_secret_access_key=self.secret_key,
            region_name=self.bucket_region
        )

The client created above is then used like this:

            async with self.client as s3_client:
                await s3_client.put_object(
                    Bucket=self.bucket_name,
                    Key=self.key_name,
                    Body=BytesIO(self.compressed_image)
                )

I don't want to create a new client for every request, as I have observed that client creation takes more time than the upload to S3 itself.
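
For context, here is a minimal sketch (not from the original post) of one way to keep a single aiobotocore client open across requests: enter the context manager returned by create_client() once, keep the live client, and close it at shutdown. The class name S3BucketSketch and the setup()/upload()/close() methods are illustrative assumptions, not part of the original code.

    import os
    from contextlib import AsyncExitStack
    from io import BytesIO

    from aiobotocore.session import get_session


    class S3BucketSketch:
        """Illustrative sketch: enter the client context once and reuse the live client."""

        def __init__(self) -> None:
            self.bucket_name = os.getenv('bucket_name')
            self._exit_stack = AsyncExitStack()
            self._client = None  # set in setup()

        async def setup(self) -> None:
            # create_client() returns an async context manager; enter it once
            # here and hold on to the resulting client object.
            self._client = await self._exit_stack.enter_async_context(
                get_session().create_client(
                    's3',
                    aws_access_key_id=os.getenv('access_key'),
                    aws_secret_access_key=os.getenv('aws_secret_key'),
                    region_name=os.getenv('bucket_region'),
                )
            )

        async def upload(self, key_name: str, data: bytes) -> None:
            # Reuse the already-entered client for every upload
            await self._client.put_object(
                Bucket=self.bucket_name,
                Key=key_name,
                Body=BytesIO(data),
            )

        async def close(self) -> None:
            # Exit the client context once, at application shutdown
            await self._exit_stack.aclose()

With this shape, setup() would be awaited once at application startup and close() once at shutdown, so every upload() call reuses the same already-entered client instead of re-entering the context manager, which appears to be what triggers the reported error.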

