I am quite new to duplicacy and still trying to set it up right.
In my case, I had a lot of different directories spread across different drives, computers, old backups etc. So the first thing I did was copy all the files into one large Backup directory. Luckily I had enough space for this.
Then I ran duplicacy to see how much deduplication would gain. As predicted - from more than 13TB of data, the final backup takes less than 6TB… What a mess I had on these drives… Dedupe was a great idea.
Then I started to add/delete some files in the source directory (BACKUP) to make it a little cleaner, and ran another duplicacy job each time I made some changes. That means the snapshots keep growing, and my destination now exceeds 6TB.
Next week I should be almost done with my cleanup, so here are my questions:
- should I delete the backup completely and start a new job, treating it as a clean baseline (it takes a long time to create, as I use a Celeron NAS for this);
- or can I run prune with settings that keep only the last snapshot? Will this release some space, so that the data in the duplicacy destination drops below 5TB (my prediction)?
Or:
How can I force duplicacy to delete data that is no longer in the source? I think there is no way other than prune with the right settings, correct? Something like the sketch below is what I have in mind.
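This is only a rough sketch based on my reading of the prune documentation; the repository path and the -keep policy are just assumptions, so please correct me if these options don't do what I think they do:

```sh
# Run from the repository root (the BACKUP directory) - path is just an example.
cd /volume1/BACKUP

# Keep only the most recent revisions: -keep 0:2 should delete all snapshots
# older than 2 days, and -a applies the policy to all snapshot IDs.
duplicacy prune -keep 0:2 -a

# Then have prune scan the storage for chunks that no snapshot references
# anymore. -exclusive removes them immediately instead of the two-step
# fossil collection, but only if no other backup is running at the same time.
duplicacy prune -exhaustive -exclusive
```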
And my last question, for my 3-2-1 backup plan:
Is there a way to divide the destination into 5 different spaces?
I have a family Office 365 subscription with 5x 1TB of OneDrive and could use this to copy my backup. That would be an ideal off-site plan for me, but can duplicacy do it somehow (one job, 5 different credentials and OneDrives), or should I manually divide the directories and copy them to the 5 accounts? If it has to be manual, I imagine something like the sketch below.
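This is only how I picture the manual split, assuming the `one://` storage URL is the right one for personal OneDrive and that each account gets its own token; the sub-directory and storage names are made up:

```sh
# Hypothetical manual split: one sub-directory of BACKUP per OneDrive account,
# each initialized as its own repository against a different OneDrive storage.
cd /volume1/BACKUP/photos
duplicacy init photos one://duplicacy-photos      # OneDrive account #1

cd /volume1/BACKUP/documents
duplicacy init documents one://duplicacy-docs     # OneDrive account #2

# ...and so on for the other three accounts, each with its own OneDrive
# token, so each 1TB account ends up holding a different part of the data.
```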