Rate limit exceeded with Google Drive, then backup completely starts over

With my backups, after multiple hours of uploading I get a "rate limit exceeded" error from Google Drive. After an hour the backup retries, but it always ends up with the same error. It also seems that Duplicacy uploads everything again from the start instead of remembering where it stopped.
Is there a way to make Duplicacy work with big, long backups?


Duplicacy should always continue from where it left off - or rather, it shouldn't re-upload chunks that were already uploaded. You should see a lot of skipped chunks in your logs.

Otherwise, since you're talking about Google Drive, keep in mind there's a 750GB per-day upload limit.

So if your bandwidth exceeds that, you may want to throttle uploads with -limit-rate (the value is in KB/s) until the initial backup is completed. 750GB spread over 24 hours works out to roughly 8700 KB/s, so something just under 9000 is about right.
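A quick sketch of the arithmetic and the resulting command (the repository path and snapshot setup are assumed; the rate value of 8500 is just a safe margin under the daily cap, not an official recommendation):

```shell
# 750 GB per day, spread evenly over 86400 seconds, expressed in KB/s:
echo $(( 750 * 1000 * 1000 / 86400 ))   # prints 8680

# So, from the repository directory, something like:
# duplicacy backup -stats -limit-rate 8500
```

Staying a little under the computed ceiling leaves headroom for protocol overhead and retried chunks, which also count against the quota.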