File Deduplication, will it work?

Hi again!
I have a question about file deduplication.
I currently work with 2 HDs. There’s Archive, for older files, and “Live”, which has the files I’ve been working on for the past year.

I have set up a backup to B2 for the Archive and another for “Live”. Question is:
Will Duplicacy know that files from “Live” are the same when I move them to Archive?

Pretty much, yes.

Apart from maybe a handful of chunks at the start/end boundaries where these folders join in the folder structure, plus metadata chunks (which probably don’t amount to much in terms of size), plus maybe some rehashed chunks (equivalent to using the -hash option) if you’ve been moving lots of files around over several revisions - again, not much.

Basically, almost everything should be deduplicated, though you may notice it processing and then skipping these chunks; the reported speed in MB/s will probably increase, but it isn’t actually uploading the skipped chunks.
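The reason moved files deduplicate is that chunks are identified by a hash of their content, not by their path. Here’s a minimal Python sketch of the idea - note it uses tiny fixed-size chunks for illustration, whereas Duplicacy actually uses variable-size, content-defined chunking:

```python
import hashlib

CHUNK_SIZE = 4  # tiny for illustration; real chunk sizes are in the MB range

def chunk_ids(data: bytes) -> list[str]:
    """Split data into fixed-size chunks and return each chunk's content hash."""
    return [
        hashlib.sha256(data[i:i + CHUNK_SIZE]).hexdigest()
        for i in range(0, len(data), CHUNK_SIZE)
    ]

# The same file content under two different folders ("Live" vs "Archive")
live_file = b"project-footage-2023"
archive_file = b"project-footage-2023"  # moved: path changed, bytes identical

live_chunks = chunk_ids(live_file)
archive_chunks = chunk_ids(archive_file)

# Identical content -> identical chunk IDs -> nothing new to upload
already_stored = set(live_chunks)
new_to_upload = [c for c in archive_chunks if c not in already_stored]
print(len(new_to_upload))  # 0
```

Because the storage already contains chunks with those IDs from the “Live” backup, the Archive backup only has to reference them, not re-upload them.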


If you add -dry-run as an option, it should show you how much it would need to upload - without actually uploading anything. It will probably list all of the files as uploaded, since they’re modified; but if you compare the bytes uploaded vs new, you should get an idea of how well it’s deduplicating.


Thank you guys!

When the time comes I will test with -dry-run and see what happens.
This is great!
Will save me a lot of time… I produce around 1 TB/year and it takes days to upload everything from scratch.