Python zappa task for uploading files to S3 bucket

I've tried to move the photo upload functionality into a Zappa task. views.py

import base64

from .tasks import upload_photos  # assuming the task lives in tasks.py in the same app


def perform_create(self, serializer):
    photos = dict((self.request.data).lists())['photos']
    for photo in photos:
        byte = base64.b64encode(photo.read())
        upload_photos(byte.decode('utf-8'), photo.name)

tasks.py

import base64
import io
import os

from django.core.files import File
from PIL import Image
from zappa.asynchronous import task

from .models import Images  # assuming this is the image model


@task
def upload_photos(photo, name):
    byte_data = photo.encode(encoding='utf-8')
    b = base64.b64decode(byte_data)
    img = Image.open(io.BytesIO(b))
    width = img.width
    height = img.height
    img.save(name)  # this is where the read-only error is raised on Lambda

    with open(name, 'rb') as file:
        picture = File(file)
        Images.objects.create(photo=picture, width=width, height=height)
    os.remove(name)

Locally everything works fine and photos are uploaded to the S3 bucket. But when I deployed with Zappa to AWS, I got this error on img.save():

OSError: [Errno 30] Read-only file system: 'file-name.jpg'

As I understand it, the problem is with temporarily saving the file to the local file system. Maybe I have to specify a tmp path? Any ideas how I can fix it?
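For example, something along these lines, since /tmp is the only writable path on AWS Lambda (just a sketch of what I mean, reusing the names from the task above):

import os

# sketch: write the intermediate file under /tmp, the only writable location on Lambda
tmp_path = os.path.join('/tmp', name)
img.save(tmp_path)

with open(tmp_path, 'rb') as file:
    picture = File(file)
    Images.objects.create(photo=picture, width=width, height=height)
os.remove(tmp_path)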

Accepted answer, by Anna Berezko:

Here is my solution; it works perfectly with Zappa + S3 and locally. You can also use this logic in a serializer's create method. views.py

from django.core.files.base import ContentFile
from django.core.files.storage import default_storage

from .tasks import s3_upload_photo  # assuming the task lives in tasks.py in the same app


def perform_create(self, serializer):
    photos = dict((self.request.data).lists())['photos']
    for photo in photos:
        # save a temporary copy through the storage backend (S3 when configured)
        path = default_storage.save(f'tmp/{photo.name}', ContentFile(photo.read()))
        s3_upload_photo(path, photo.name)

tasks.py

from pathlib import Path
from django.core.files import File
from django.core.files.images import get_image_dimensions
from django.core.files.storage import default_storage
from zappa.asynchronous import task

from .models import ModelName  # assuming this is the image model


@task()
def s3_upload_photo(path, file_name):
    path_object = Path(path)
    # open via the path returned by default_storage.save()
    with default_storage.open(path, 'rb') as file:
        photo = File(file, name=path_object.name)
        width, height = get_image_dimensions(photo)
        instance = ModelName.objects.create(photo=photo, width=width, height=height)
    default_storage.delete(path)
    return instance
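For completeness: this works because default_storage.save() writes the tmp/ copy through Django's configured storage backend instead of the Lambda file system. That assumes the default backend points at S3, e.g. via django-storages; a minimal settings sketch (the bucket name is a placeholder):

# settings.py sketch, assuming django-storages and boto3 are installed
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
AWS_STORAGE_BUCKET_NAME = 'your-bucket-name'  # placeholder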