Question - Deduplication across parallel backup sets

Hello,

I’ve used Duplicacy for a while and love it, but I’m starting a new set of backups.

I’ve got three backup sets going to the same destination with plenty of duplicates between them. If I hit start on them in parallel, will deduplication work for files that exist across sets?

Or should I do them one at a time in series?

Thanks!

You can run them in parallel, and deduplication will work as expected. Duplicacy deduplicates at the chunk level across every backup ID that shares the same storage, so files that exist in more than one set are only uploaded once.
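
If it helps, here is a minimal sketch of kicking off all three backups at once, assuming each set lives in its own already-initialized Duplicacy repository pointing at the same storage. The repository paths are hypothetical placeholders, and the `duplicacy` CLI is assumed to be on your PATH:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Hypothetical repository directories, one per backup set,
# each initialized against the same storage URL.
REPOS = ["/backups/set1", "/backups/set2", "/backups/set3"]

def run_backup(repo: str) -> int:
    # Duplicacy determines the repository from the working directory.
    result = subprocess.run(["duplicacy", "backup", "-stats"], cwd=repo)
    return result.returncode

# Run the three backups concurrently and collect their exit codes.
with ThreadPoolExecutor(max_workers=len(REPOS)) as pool:
    codes = list(pool.map(run_backup, REPOS))

print("exit codes:", codes)
```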

You might encounter issues with backends that are eventually consistent, such as Google Drive, but those issues can be fixed later (with rclone dedupe), and they are not specific to running multiple instances of duplicacy, or to duplicacy in the first place. They come from creating folders from multiple threads, which mostly happens on the initial backup: under certain conditions you may end up with duplicate folders.
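
For the Google Drive case, if duplicate folders do show up, something along these lines would merge them. The remote name and path are hypothetical; note that rclone dedupe merges duplicate directories regardless of which file-dedupe mode you pick:

```python
import subprocess

# Merge duplicate folders left behind by an eventually consistent backend.
# "gdrive:duplicacy-storage" is a hypothetical rclone remote and path.
# rclone dedupe always merges duplicate directories; the mode argument
# ("rename" here) only governs how duplicate *files* are handled.
subprocess.run(
    ["rclone", "dedupe", "rename", "gdrive:duplicacy-storage"],
    check=True,
)
```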

Otherwise it’s perfectly safe; Duplicacy was in fact designed to work concurrently. More here: Duplicacy paper accepted by IEEE Transactions on Cloud Computing

