Googleapi: Error 403: User Rate Limit Exceeded

Am I using too many threads? (Is chunk listing even multi-threaded?)

Simply re-running the same command sometimes works and sometimes doesn't.
This morning is the first time I got this error.
I am not running any other duplicacy commands against Google Drive in parallel.

Command: /usr/local/bin/duplicacy -log check -persist -r 32 -chunks -threads 32
2020-10-09 09:19:44.063 INFO STORAGE_SET Storage set to gcd://duplicacy-backups
2020-10-09 09:19:45.880 INFO SNAPSHOT_CHECK Listing all chunks
2020-10-09 09:21:25.843 INFO GCD_RETRY [0] Maximum number of retries reached (backoff: 64, attempts: 15)
2020-10-09 09:21:25.843 ERROR LIST_FILES Failed to list the directory chunks/: googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console:, userRateLimitExceeded
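The `GCD_RETRY` line suggests Duplicacy retried with capped exponential backoff before giving up (15 attempts, backoff capped at 64). Duplicacy's actual Go implementation isn't shown here, but as a rough illustration of what such a retry schedule looks like, here is a sketch in Python (the function name and jitter strategy are my own assumptions, not Duplicacy's code):

```python
import random

def backoff_delays(attempts=15, cap=64):
    """Illustrative capped exponential backoff schedule.

    Doubles the delay each attempt up to `cap` seconds and adds
    up to 1 second of random jitter, mirroring the numbers in the
    GCD_RETRY log line (backoff: 64, attempts: 15). This is a
    hypothetical sketch, not Duplicacy's real retry logic.
    """
    delays = []
    delay = 1
    for _ in range(attempts):
        delays.append(delay + random.random())  # jitter spreads out retries
        delay = min(delay * 2, cap)
    return delays

# 15 delays, roughly 1, 2, 4, ... then capped at ~64 seconds each
print(len(backoff_delays()))
```

Even with this backoff, 32 threads all retrying against the same per-user quota can keep tripping the 403 rather than recovering from it.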

Is this a G Suite account? If so, you can do what the error message advises.

If this is a consumer Google account, then this is just another reason why file-sharing and collaboration services are not suitable for the bulk-storage workload that backup tools generate.

It is a G Suite account, but the error message in question directs me to a project I do not own; it's the project owned by Duplicacy. I'm only a user of that application, so I have no control over that project's quotas in the developer console.

I’m not sure I understand. Is this not your backup destination? What do you mean by “owned by Duplicacy”? Is someone else managing storage for you? Can you ask them to increase the API quotas?

You can create “projects” in your account and configure a number of parameters on them, including quotas, as well as generate project-specific access credentials. (I did that only once, a long time ago, and I’m in no way an expert on this, but it does seem like the right approach here.)

When configuring gcd storage, we’re asked to go to Google Drive for Duplicacy to obtain a token; that uses project 243147021227, which is called “Duplicacy Backup Tool”.
The app is owned by duplicacy; gchen created the project in his Google dashboard.


The current per-user quota for Duplicacy is 2000 queries per 100 seconds. 32 threads may be too many; try -threads 16 instead. Currently this -threads option specifies both the number of listing threads and the number of downloading threads. We may need to separate it into two options.
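To see why the thread count matters against a quota of 2000 queries per 100 seconds, here is a minimal client-side token-bucket throttle. This is purely an illustrative sketch of the quota arithmetic (the class and its parameters are my own invention; Duplicacy does not expose such a knob, hence the advice to lower `-threads` instead):

```python
import time

class RateLimiter:
    """Token bucket: allow at most `rate` requests per `per` seconds.

    Hypothetical client-side throttle illustrating Google's
    2000-queries-per-100-seconds per-user quota. With 32 threads
    issuing requests as fast as possible, the bucket drains almost
    immediately and every thread ends up waiting (or, without a
    throttle, hitting 403 userRateLimitExceeded).
    """

    def __init__(self, rate=2000, per=100.0):
        self.capacity = float(rate)
        self.tokens = float(rate)
        self.fill = rate / per          # tokens refilled per second
        self.last = time.monotonic()

    def acquire(self):
        # Refill tokens according to elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.fill)
        self.last = now
        # If the bucket is empty, sleep until one token is available.
        if self.tokens < 1.0:
            time.sleep((1.0 - self.tokens) / self.fill)
            self.tokens = 1.0
        self.tokens -= 1.0
```

With the real quota numbers, 2000 requests per 100 seconds works out to 20 requests per second shared across all threads, so halving the thread count roughly halves the burst pressure on the quota.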


Hmm. Indeed, I missed that the tokens are generated from a single project, which makes sense because some users may not be using G Suite and therefore cannot create access credentials for their drive. And I guess those who do have G Suite might be better served by creating their own project.

What impact would a compromise of gchen’s account have on users’ access? I’d assume the worst that could happen is that the access credentials get revoked. Anything else? (I’m not very familiar with managing G Suite projects, really.)

This topic was automatically closed 10 days after the last reply. New replies are no longer allowed.