Google Drive 400 error

I can’t find anything about this error in the Duplicacy forums, nor in the Google forums. Any thoughts?
This is in debug mode, and it happens after a lot of other chunks were successfully verified and/or uploaded.

```
2020-02-17 09:49:41.847 ERROR UPLOAD_CHUNK Failed to upload the chunk 0d91cb8e8c75ac052ef3022ff3ada9a24a79570c54650e15465cfdb57c83bdd1: googleapi: Error 400: Bad Request, failedPrecondition
Failed to upload the chunk 0d91cb8e8c75ac052ef3022ff3ada9a24a79570c54650e15465cfdb57c83bdd1: googleapi: Error 400: Bad Request, failedPrecondition
```

I’m having the same problem. Please help!

I just tested and it worked for me. Does the problem still exist? Same chunk every time?

Not the same chunk every time. It usually goes for a while and then stops; typically it makes it through anywhere from 200-400 default-size chunks before the problem appears. If I run backup again, it will again run for a while before hitting the error. I’m using Duplicacy CLI 2.3.0 (504D07) on Ubuntu 18.04 x64.

Edit: And now it seems to be working fine. Maybe the problem was on Google’s end. I’ll update if problem shows up again, thanks.

Edit: It took longer than before but after about 1000 chunks I got the error again, so still an issue.

One odd symptom of the problem I’ve noticed: say the backup made it to chunk 1000 before stopping because of the “googleapi: Error 400: Bad Request, failedPrecondition” error. When I restart the backup, it won’t skip chunks all the way up to chunk 1000; it might only skip up to chunk 500 and then start uploading chunks from there.

Edit: And now it seems to be working again, sigh. Made it to 9,500 chunks without an error message.

I kept running into this on different chunks about 6 or 7 times. Finally, it finished successfully.
I’m unsure if this was a transient problem with GDrive, or maybe it was some new rate limiting. Next time I have a large set of chunks to upload, I’ll see whether it comes back.

This looks pretty similar to the error I was getting when I’d hit 750 GB uploaded in a day. There’s a (poorly) documented limit of 750 GB uploaded per day (as well as 10 TB downloaded per day), though it appears to be per account.

Are you hitting this around 750GB uploaded per day?

I may well have hit that. Does it make sense to rate-limit the backup so it comes in just short of that in a 24-hour period?

That may be the easiest way to avoid the problem. I just manually restarted my backups after midnight (my time) until I put a rate limit on them that kept them under the cap.
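
For rough sizing: spreading 750 GB evenly over a 24-hour day works out to a bit under 8.7 MB/s. This is only back-of-the-envelope arithmetic, not a number from Google:

```
# 750 GB spread evenly over one day (86,400 seconds), in bytes per second:
echo $(( 750 * 1000**3 / 86400 ))   # prints 8680555, i.e. roughly 8.68 MB/s or 8680 KB/s
```

Capping uploads a little below that should keep a single machine under the daily quota.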

This 750 GB limit is per user per day:

https://support.google.com/a/answer/172541?hl=en

I’ve never hit this limit, but Rclone users use the --bwlimit 8650k parameter to stay below it.

I think the equivalent parameter in Duplicacy is -limit-rate 8650
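
If that is the right flag, the invocation would look roughly like the line below. It’s just a sketch, assuming the repository is already initialized and that -limit-rate takes a value in KB/s (matching the Rclone figure above):

```
# cap the upload rate so a full day of continuous backing up stays around the 750 GB quota
duplicacy backup -limit-rate 8650
```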

I was constantly hitting it when I was doing my initial backup, usually mid-evening with a restart around midnight.

To add to this, when they say 750 GB (not Gb) per user, they mean per email address. Two computers backing up to the same spot will both count against this limit. Some people have gotten around this by paying for multiple email/Drive accounts. Apparently there’s another soft limit of around 150 TB stored across all users combined.