Download large file on Google App Engine Python


On my appspot website, I use a third-party API to query a large amount of data. The user then downloads the data as a CSV file. I know how to generate a CSV and download it. The problem is that because the file is huge, I get a DeadlineExceededError.

I have tried increasing the fetch deadline to 60 seconds (urlfetch.set_default_fetch_deadline(60)). It doesn't seem reasonable to increase it any further.

What is the appropriate way to tackle this problem on Google App Engine? Is this something where I have to use Task Queue?

Thanks.


There are 2 answers

Answer by koma (6 votes)

DeadlineExceededError means that your incoming request took longer than 60 seconds, not that your URLFetch call did.

Deploy the code that generates the CSV file into a different module that you set up with basic or manual scaling. The URL to download your CSV will become http://module.domain.com

Requests can run indefinitely on modules with basic or manual scaling.
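As a sketch, a module with basic scaling could be configured like this (the module name, instance class, and handler script are placeholders, not part of the original answer):

```yaml
# module.yaml -- hypothetical module dedicated to long-running CSV exports
module: csv-export
runtime: python27
api_version: 1
instance_class: B2          # basic/manual scaling requires a B-class instance
basic_scaling:
  max_instances: 2
  idle_timeout: 10m         # instance is shut down after 10 idle minutes

handlers:
- url: /.*
  script: export.app
```

Deploying this alongside your default module gives the export handler its own URL, and requests to it are not subject to the 60-second limit.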

Answer by Mike (0 votes)

Alternatively, consider creating a file dynamically in Google Cloud Storage (GCS) with your CSV content. At that point, the file resides in GCS and you can generate a URL from which the user can download the file directly. There are also other options for different auth methods.

You can see documentation on doing this at

https://cloud.google.com/appengine/docs/python/googlecloudstorageclient/

and

https://cloud.google.com/appengine/docs/python/googlecloudstorageclient/functions
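A minimal sketch of this approach: build the CSV content with the standard library, then write it to GCS via the client library. The bucket name, object path, and row data below are placeholders, and the GCS call is shown in a comment since it only runs inside App Engine:

```python
# Sketch: serialize rows to CSV, then (on App Engine) write the result to GCS.
import csv
import io

def rows_to_csv(rows):
    """Serialize an iterable of row tuples into a CSV string."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    for row in rows:
        writer.writerow(row)
    return buf.getvalue()

# On App Engine, you would then write the content with the GCS client library:
#
#   import cloudstorage as gcs
#   filename = '/my-bucket/exports/report.csv'   # hypothetical bucket/object path
#   with gcs.open(filename, 'w', content_type='text/csv') as f:
#       f.write(rows_to_csv(data))
#
# and hand the user a URL for the stored object.
```

Because the CSV is generated and stored server-side, the user's download request returns immediately with a link instead of streaming a huge response.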

Important note: do not use the Files API (which was a common way of dynamically creating files in the Blobstore/GCS), as it has been deprecated. Use the Google Cloud Storage Client API referenced above instead.

Of course, you can delete the generated files after they've been successfully downloaded and/or you could run a cron job to expire links/files after a certain time period.

Depending on your specific use case, this might be a more effective path.