Upload Speed Gets Progressively Worse

Hi there.
I love Duplicacy but sadly have one issue: my upload speed gets worse as the upload progresses, or more accurately, the more files are already backed up, since the slowdown persists across subsequent runs.

I am uploading to a GSuite account and using the following command: duplicacy backup -stats -vss -threads 8

The repository is about 4TB in size.

I noticed high memory usage. About a month ago I read that setting the ‘DUPLICACY_ATTRIBUTE_THRESHOLD’ environment variable to ‘1’ helps, so I did that, and now only 8 GB of the machine’s 16 GB of RAM is used in total (previously Duplicacy would use all available RAM), but I see no speed improvement.
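For reference, a minimal sketch of setting the variable before a run (POSIX syntax shown; on Windows cmd the equivalent would be set DUPLICACY_ATTRIBUTE_THRESHOLD=1, or setx DUPLICACY_ATTRIBUTE_THRESHOLD 1 to persist it):

```shell
# Set the attribute threshold before invoking the backup; per the forum
# advice mentioned above, a value of 1 reportedly reduces memory use by
# limiting how many file attributes Duplicacy keeps in memory.
export DUPLICACY_ATTRIBUTE_THRESHOLD=1
echo "$DUPLICACY_ATTRIBUTE_THRESHOLD"
```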

At the start of the upload the speed maxed out the connection, but towards the end it dropped to about 2 megabits per second, and it stays that slow on subsequent runs even now that the initial upload has completed. As a result, the initial upload took a month or two to complete. A speed test on the same computer returns a stable 35 megabits per second upload.
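For comparison with the per-chunk KB/s figures that Duplicacy prints, a quick unit conversion (1 Mbit/s = 125 KB/s):

```shell
# Convert the reported link speeds to KB/s so they can be compared
# directly against Duplicacy's per-chunk throughput numbers.
echo "$(( 2 * 125 )) KB/s"    # the ~2 Mbit/s observed towards the end
echo "$(( 35 * 125 )) KB/s"   # the 35 Mbit/s speedtest result
```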

I am running Windows Server 2019 with the latest updates installed, with Duplicacy CLI 2.7.1. I recently upgraded from 2.6.2, which had the same issue.

I feel like I’m making an obvious mistake or missing a switch somewhere I need to toggle. Any ideas what could be causing the speed issues?

Can you add the -d option to see if you’re hitting the rate limit?

duplicacy -d backup -stats -vss -threads 8

Just tried this while resuming an upload that disconnected last night, here’s the output:

Skipped chunk 8693aeb0f80540021d880c9e2a995ed276f9a666fe5cdfe5c47785d107e25edc in cache
Skipped chunk 18 size 2449286, 9.69MB/s 00:02:33 3.1%
Chunk 7a935c96df6e02cd426130f47d562cb81cf680145b878e5958a93fb0dac4ef7b already exists
Skipped chunk 7a935c96df6e02cd426130f47d562cb81cf680145b878e5958a93fb0dac4ef7b in cache
Skipped chunk 17 size 1751946, 10.02MB/s 00:02:28 3.2%
[1] HTTP status code 502; retrying after 1.10 seconds (backoff: 2, attempts: 1)
Chunk 81f229f41bcfea6b5b3b35e75b28b0063a2ff913799104134181e0d1f6c14265 has been uploaded
Uploaded chunk 22 size 4813839, 394KB/s 01:03:45 3.5%
Chunk b6fd7e8134c5c5e271f1ddfaf792352d65457272876508a6a0e77f468046bf57 has been uploaded
Uploaded chunk 16 size 5278232, 425KB/s 00:59:00 3.9%
Chunk 33285ff64c7986b33bc038665ee37d1e0ada0c8c816b3bad5dd88448a274a3b5 has been uploaded
Uploaded chunk 21 size 4964701, 452KB/s 00:55:15 4.2%
Chunk ce7c2983d8b93ad5f2f0ca1fdec1ca9fa24acb110ab66d833da0a6a79c0669bb has been uploaded
Uploaded chunk 23 size 1472943, 363KB/s 01:08:48 4.3%
Chunk e13ea220e92884277ddb3b0f49093ee4a039dee883fc9d5d73cbc5e86f2da70f has been uploaded
Uploaded chunk 25 size 1883133, 345KB/s 01:12:18 4.4%
Chunk d4cc9e77c9d5950d4026330de8061740998389b02019310ed5c2c5065f21a79d has been uploaded
Uploaded chunk 24 size 2025157, 348KB/s 01:11:35 4.5%
Chunk 366481adb111f59181feb5faed8cfb4c5f505ff1ae9378b916b8f01281d158a1 has been uploaded
Uploaded chunk 19 size 8497690, 339KB/s 01:13:06 5.0%
[3] HTTP status code 502; retrying after 2.99 seconds (backoff: 2, attempts: 1)
Chunk 7173b78bfd73e572a104075dd5e7204a082f9deb529a09d64b5ad0e69108c0f5 has been uploaded
Uploaded chunk 15 size 10541500, 315KB/s 01:17:57 5.7%
Chunk bc3ccb76c41ae02df3bcd30cb3e76e106d61473654f96859c8bd6b850b2a25b4 has been uploaded
Uploaded chunk 29 size 3517589, 278KB/s 01:28:03 5.9%
Chunk a21ac687abd613dead6c6ede9d878f7367b561903e4aba4c4127aa7cd61ee7ec has been uploaded
Uploaded chunk 20 size 12530828, 309KB/s 01:18:34 6.7%

The 502s are unusual…
Is there another part of the output I should look at to identify rate limiting that I haven’t included here?

@gchen Is there any other information I should provide?

I don’t know why Google returned 502, but that seems to be a temporary error. Do you still have this problem now? You can run the CLI command duplicacy benchmark to test the upload speeds.
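For example, run from inside the repository directory (the thread-count options here are an assumption; check duplicacy benchmark -h for the exact flags supported by your version):

```shell
# Benchmark the configured storage with the same concurrency
# as the backup runs (-threads 8 in the backup command above).
duplicacy benchmark -upload-threads 8 -download-threads 8
```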