I love Duplicacy but sadly have one issue: upload speed gets worse as the upload progresses. More accurately, it gets worse the more files have already been backed up, since the slowdown persists across subsequent runs.
I am backing up to a GSuite account using the following command: duplicacy backup -stats -vss -threads 8
The repository is about 4TB in size.
I noticed high memory usage. About a month ago I read that setting the ‘DUPLICACY_ATTRIBUTE_THRESHOLD’ environment variable to ‘1’ helps, so I did that, and now only 8 of the machine’s 16GB of RAM is used in total (previously Duplicacy would use all available RAM), but I see no speed improvement.
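For reference, this is how I set the variable (assuming the usual Windows way of persisting an environment variable; setx takes effect in new sessions only, so I restarted the console before re-running the backup):

```shell
:: Persist the variable for future sessions (Windows cmd)
setx DUPLICACY_ATTRIBUTE_THRESHOLD 1

:: Also set it for the current session before running the backup
set DUPLICACY_ATTRIBUTE_THRESHOLD=1
duplicacy backup -stats -vss -threads 8
```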
At the start of the initial upload the speed maxed out the connection, but towards the end I was getting about 2 megabits per second, and it stays that slow on subsequent runs even now that the initial upload has completed. As a result, the initial upload took a month or two to complete. A speedtest on the same computer shows a stable 35 megabits per second upload.
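For context, a back-of-the-envelope check of what those speeds mean for a 4TB upload (rough numbers, decimal terabytes, integer division):

```shell
# Days to upload 4 TB (4e12 bytes) at a given link speed:
# bytes * 8 bits / (bits per second) / (seconds per day)
echo $(( (4 * 10**12 * 8) / (2 * 10**6)  / 86400 ))   # at 2 Mbit/s  -> ~185 days
echo $(( (4 * 10**12 * 8) / (35 * 10**6) / 86400 ))   # at 35 Mbit/s -> ~10 days
```

So at the full 35 Mbit/s line speed the initial backup should have taken well under two weeks, not the month-plus it actually took once the slowdown set in.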
I am running Windows Server 2019 with the latest updates installed, with Duplicacy command line 2.7.1. I recently upgraded from 2.6.2, which had the same issue.
I feel like there’s an obvious mistake I’m making, or a switch somewhere I need to toggle. Any ideas what could be causing the speed issues?