Initial backup - Google Drive and limit-rate

I am new to duplicacy and setting up my initial backups to my GSuite storage.

I have configured 6 backup jobs (a total of about 17.5 TB) that I am running individually; I plan to add them to a daily schedule once I get past this initial hurdle.

Only 2 of the backups are multi-terabyte. My initial attempts failed because I was hitting the Google Drive 750 GB/day upload cap; the upload was running at about 16-18 MB/sec. I did the math and figured that if I limited the upload to 8.68 MB/sec I would come in just under the daily limit and still get it done as fast as possible.
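
For reference, here is the math behind that number (a quick Python sketch; I'm assuming decimal units and that the limit-rate value is expressed in KB/sec):

```python
# The 750 GB/day figure is Google Drive's per-account daily upload cap;
# decimal units assumed (1 GB = 1000 MB, 1 MB = 1000 KB).
DAILY_CAP_GB = 750
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

max_rate_mb_s = DAILY_CAP_GB * 1000 / SECONDS_PER_DAY
print(f"{max_rate_mb_s:.2f} MB/sec")         # -> 8.68 MB/sec

# Expressed in KB/sec for the limit-rate setting:
print(f"{max_rate_mb_s * 1000:.0f} KB/sec")  # -> 8681, hence rounding down to 8600
```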

So I used a limit-rate of 8600 on the backup just to be safe. However, the backup is running at a very stable 5.58-5.59 MB/sec. Any ideas?

If you increase the rate limit to, say, 10000, does the upload speed increase proportionally?

I did that on another backup to test. The reported speeds went wide open on that job: 15-18 MB/sec.

I am running in a Docker container and the job was created in the UI.

Here is the other job

If the speed shown is greater than the rate limit, that is because it takes deduplication and compression into account. You can find out the actual network speed using an OS-specific utility, such as Task Manager on Windows or Activity Monitor on macOS.
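
To illustrate (a hypothetical sketch, assuming the displayed figure counts source bytes processed rather than bytes actually sent over the network):

```python
# Hypothetical illustration of why the displayed speed can exceed the rate limit.
# Assumption: the reported speed counts source bytes processed per second, while
# only new, compressed chunks are actually uploaded.

def network_rate(reported_rate_mb_s: float, upload_fraction: float) -> float:
    """Approximate on-the-wire rate, given the fraction of processed data that
    still has to be uploaded after deduplication and compression."""
    return reported_rate_mb_s * upload_fraction

# Example: a job showing 18 MB/sec where only ~45% of the data is new and
# compresses well can still stay under an 8.6 MB/sec network cap.
print(network_rate(18.0, 0.45))  # -> 8.1 MB/sec on the wire
```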

Understood for the over-speed case. It's the under-speed case that I am trying to solve. I am backing up a lot of data and this is off by 30%. Here is the resource monitor from the NAS itself, showing it is not running at 8600.

[screenshot: NAS resource monitor]

I have noticed, too, that if I set the rate limit to 8600, it's actually uploading around 30% slower than that. (I can see on Google Drive that the space used is increasing by less than 750 GB a day.)
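
To put rough numbers on that (same decimal-unit assumption as above):

```python
# Back-of-the-envelope check on the observed rate versus the 750 GB/day cap.
SECONDS_PER_DAY = 24 * 60 * 60

observed_mb_s = 5.58   # sustained rate shown by the NAS resource monitor
target_mb_s = 8.68     # rate that would just fill a 750 GB/day cap

print(observed_mb_s * SECONDS_PER_DAY / 1000)  # -> ~482 GB/day actually uploaded
print(1 - observed_mb_s / target_mb_s)         # -> ~0.36, i.e. roughly a third short
```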