How do I rate limit my Google Drive API calls?


I'm working on a node script to download all the images that are shared with my account on Google Drive.

I'm hitting a wall just as others have posted about on Stack Overflow: Google Drive API file watch rate limits.

The exponential backoff makes a lot of sense to me, I just don't know how to go about implementing that.

Any insight I could get into this would be incredibly helpful. Even just a "get started by..." would be great!

I've added the method I'm working on below. Auth and everything is working just fine; the problem is that I keep receiving the userRateLimitExceeded error.

Any and all help would be fantastic and greatly appreciated.

Thank you!

/**
 * Download all of the shared images.
 *
 * @param {google.auth.OAuth2} auth An authorized OAuth2 client.
 */
function downloadImages(auth) {
  const gDrive = google.drive({
    version: 'v3',
    auth: auth
  });

  gDrive.files.list({
    q: 'sharedWithMe = true and mimeType = "image/jpeg"'
  }, (err, resp) => {
    if(err) {
      console.log('The API returned an error: ' + err);
      return;
    }

    if(!resp.files.length) {
      console.error('No files found.');
    } else {
      // Remove existing images.
      // removeImages();

      _.each(resp.files, (file) => {
        if(fs.existsSync(IMAGE_DIR + file.name)) {
          return;
        }

        gDrive.files.get({
          fileId: file.id,
          alt: 'media' // download the file contents rather than the metadata
        })
        .on('end', () => {
          console.log(chalk.green(file.name + ' successfully downloaded.'));
        })
        .on('error', (err) => {
          console.log(err);
        })
        .pipe(fs.createWriteStream(IMAGE_DIR + file.name));
      });
    }
  });
}

EDIT: I looked into batching, but I guess google-api-nodejs-client doesn't support batches. I tried a third-party lib called "Batchelor". Still can't get it to work for the life of me. :(

There are 3 answers

pinoyyid

It's not simple, especially from an async language like JavaScript. Firstly, do NOT do exponential backoff, as I explained in my answer to the question you cited. You will end up with a massive delay between API calls.

You can have a poke around the code in https://github.com/pinoyyid/ngDrive/blob/master/src/http_s.ts This is an Angular 1 service that handles GDrive's idiosyncrasies, including 403s. It does it by putting requests into an internal queue, and I then have a process that takes items off the queue at a variable rate to maximise throughput but minimise the 403s and attendant retries.

Batching makes it even worse, so don't go down that road. It's not the rate of HTTP requests which is the limiting factor, it's the rate of requests to the GDrive internal systems. A batch of requests gets fired into GDrive in rapid succession, so it is more likely to trigger a 403.
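The queue approach described above can be sketched in a few lines. This is a minimal illustration, not the ngDrive code itself: `ThrottledQueue`, its tuning constants, and the assumption that rate-limit errors surface with `err.code === 403` are all hypothetical.

```javascript
// Minimal request queue: enqueue Promise-returning tasks, drain them one
// at a time, and adapt the delay between dequeues -- slow down on a 403,
// cautiously speed back up on success.
class ThrottledQueue {
  constructor(intervalMs = 150) {
    this.intervalMs = intervalMs; // current delay between requests
    this.queue = [];
    this.running = false;
  }

  // task: a function returning a Promise (e.g. one wrapped Drive call)
  add(task) {
    return new Promise((resolve, reject) => {
      this.queue.push({ task, resolve, reject });
      if (!this.running) this.drain();
    });
  }

  async drain() {
    this.running = true;
    while (this.queue.length) {
      const { task, resolve, reject } = this.queue.shift();
      try {
        resolve(await task());
        // success: speed up a little, down to a floor
        this.intervalMs = Math.max(100, this.intervalMs * 0.9);
      } catch (err) {
        if (err.code === 403) {
          // rate limited: back off and requeue the same task
          this.intervalMs = Math.min(5000, this.intervalMs * 2);
          this.queue.push({ task, resolve, reject });
        } else {
          reject(err);
        }
      }
      await new Promise((r) => setTimeout(r, this.intervalMs));
    }
    this.running = false;
  }
}
```

Each `gDrive.files.get` call would then be pushed through `queue.add(...)` instead of being fired directly from the `_.each` loop.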

KENdi

Based on this documentation, the error 403 or the userRateLimitExceeded error that you got means that the per-user limit from the Developer Console has been reached.

Just take note that the Drive API has:

  • Courtesy daily quota: 1,000,000,000 requests/day
  • Per-project limit: 100 requests/second
  • Per-user limit: 10 requests/second/user

[Screenshot of the default Drive API quota omitted]

So the suggested actions for your error are:

  • Raise the per-user quota in the Developer Console project. (Use this link to ask for more quota)

  • If one user is making a lot of requests on behalf of many users of a Google Apps domain, consider a Service Account with authority delegation (setting the quotaUser parameter).

  • Use exponential backoff.
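If you do go the exponential backoff route, the standard scheme is to wait 2^n seconds plus a random jitter between retries. Here is a minimal sketch, assuming a Promise-returning call and that rate-limit errors carry `err.code` of 403 or 429; `withBackoff` and its parameters are hypothetical names, not part of the client library.

```javascript
// Retry `call` (a function returning a Promise) on rate-limit errors,
// waiting 2^attempt * baseMs plus up to baseMs of random jitter
// between attempts. Gives up after maxRetries retries.
async function withBackoff(call, maxRetries = 5, baseMs = 1000) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await call();
    } catch (err) {
      const rateLimited = err.code === 403 || err.code === 429;
      if (!rateLimited || attempt >= maxRetries) throw err;
      const delayMs = Math.pow(2, attempt) * baseMs + Math.random() * baseMs;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

You would then wrap each download, e.g. `withBackoff(() => somePromiseReturningDriveCall(file.id))`, so that a 403 pauses and retries that one file instead of failing it.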


Hilmi Hamdan

With the rate limits, the one that you have to watch out for is the per-user per-second limit.

I used a script that sends a batch of 10 requests every 1.3 seconds, and that script works fine.

Sending a batch of 100 requests over 13 seconds, however, starts getting 403 errors partway through.

So the conclusion: watch out for the per-second limit, not the per-100-seconds limit.
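That pattern can be sketched as a chunked dispatcher. This is an illustration only: `runInChunks` is a hypothetical helper, and the 10-per-chunk / 1.3-second numbers are simply the ones reported above.

```javascript
// Run an array of Promise-returning tasks in chunks of `chunkSize`,
// pausing `pauseMs` between chunks to stay under a per-second limit.
// Returns the results in the original task order.
async function runInChunks(tasks, chunkSize = 10, pauseMs = 1300) {
  const results = [];
  for (let i = 0; i < tasks.length; i += chunkSize) {
    const chunk = tasks.slice(i, i + chunkSize);
    results.push(...(await Promise.all(chunk.map((task) => task()))));
    if (i + chunkSize < tasks.length) {
      await new Promise((resolve) => setTimeout(resolve, pauseMs));
    }
  }
  return results;
}
```

For the original question, each element of `tasks` would be a closure that downloads one file, so 10 downloads run concurrently and the next 10 only start after the pause.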