Consolidating Subfolder Backups into One

Hi. I have backed up roughly 10 directories that are all part of the same parent directory. I’m thinking it would be much simpler to back up the parent folder as one job rather than 10. If I delete all the current subfolder backup jobs/IDs from Duplicacy and add the parent backup as a new job/ID, will all of my data have to be uploaded fresh? Any other considerations?

If this is not a problem, would it be better to add the parent directory job, back it up, and then delete the subs? Or would it make more sense to delete the sub jobs first?

I use the GUI btw.

Thank you!

I would add the parent and keep the old ones intact, only deleting the schedules; I would not delete the snapshot data, so the version history is preserved.
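For reference, the CLI equivalent would look roughly like this (a minimal sketch; `parent-backup`, `my-bucket`, and the path are placeholders, and the GUI does the same thing when you add a new backup pointing at the existing storage):

```sh
# In the parent directory, initialize it as a new repository
# pointing at the same B2 storage the sub-jobs already use.
cd /path/to/parent
duplicacy init parent-backup b2://my-bucket

# First backup under the new snapshot ID; chunks already uploaded
# by the old sub-jobs are deduplicated rather than re-uploaded.
duplicacy backup -stats
```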

Thanks for your reply. Taking your advice, I just initiated a backup for the parent directory, leaving the sub jobs in place. With almost no changes to the files since my last backup of the subs, I’m seeing an estimated 8 hours to back up at 105 MB/s, and there’s roughly 3 TB in total in the parent. That tells me it’s backing them up from scratch? Or is it some other process that would require so much time? If it is from scratch, can I expect to have duplicates at B2?

Thank you.

Most of those chunks shouldn’t get re-uploaded.

Because it’s a new backup, it has to hash the files locally, so it’s reprocessing the content and determining whether it already exists on the destination (which most of it does). Only a very small handful of metadata chunks will get uploaded.

It will take a bit longer than an incremental backup, but it shouldn’t impact B2’s transaction API costs too much.
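If you want to confirm that before letting the full job run, the CLI has a dry-run mode (a sketch; in the GUI these would go in the backup job’s options field):

```sh
# Hashes and chunks the files but uploads nothing, so the stats
# show how much data would actually be sent to B2.
duplicacy backup -dry-run -stats
```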

If you want to see more detail about what it’s doing, add -v as a global option for the backup command.
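In CLI terms that would be roughly the following (the GUI runs the CLI engine underneath, as far as I know, and just passes the options through):

```sh
# -v is a global option (it goes before the command), so it raises
# the logging level for the whole run, not just the backup step.
duplicacy -v backup -stats
```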
