GET file from URL and store in S3


I am required to download a .gz file (using GET) from a URL, uncompress it, and then store it in S3.

I have written the following code to download the file to a directory, but I am struggling to uncompress it and store it in S3.

URL = https://api.botify.com/v1/jobs/

import os
import shutil
import requests

def download_file(url, folder_name):
    # Save the response body to <folder_name>/<last URL segment>
    local_filename = url.split('/')[-1]
    path = os.path.join(folder_name, local_filename)
    with requests.get(url, stream=True) as r:
        with open(path, 'wb') as f:
            shutil.copyfileobj(r.raw, f)
    return path

How can I download the file from the given URL, uncompress it, and store it in S3? Can someone please help?
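
A minimal sketch of the remaining two steps, assuming boto3 is installed and picks up AWS credentials from the usual environment/config; the bucket name and object key below are placeholders, not values from the question:

import gzip
import shutil
import boto3

def decompress_and_upload(gz_path, bucket_name, object_key):
    # Strip the .gz suffix to get a path for the decompressed copy
    out_path = gz_path[:-3] if gz_path.endswith('.gz') else gz_path + '.out'
    # Decompress the downloaded file to disk
    with gzip.open(gz_path, 'rb') as f_in, open(out_path, 'wb') as f_out:
        shutil.copyfileobj(f_in, f_out)
    # Upload the decompressed file to S3; credentials come from the
    # standard AWS config/environment, bucket and key are placeholders
    s3 = boto3.client('s3')
    s3.upload_file(out_path, bucket_name, object_key)

# Example usage (hypothetical folder, bucket, and key):
# path = download_file(URL, 'downloads')
# decompress_and_upload(path, 'my-bucket', 'jobs/export')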
