I’m trying out Duplicacy to possibly replace my current backup method.
I’m primarily backing up to B2 but decided to do a local backup as well since I had a spare external drive.
The first backup to both B2 and the local disk seemed to go fine.
I decided to do a second backup to the local disk just to see how long it would take compared to the first. After it ran for a LOT longer than I expected, I left it to go overnight. The next morning the external drive was full and the backup had stalled out.
Here’s a bit about my setup.
Running on an Unraid box, Dell R720XD with 24 cores (48 threads) and 128 GB of RAM.
Duplicacy Web 1.6.3 running in Docker.
Storage is to an external 2.5" 2 TB HDD over USB 3.
First backup was 1.5 TB total and used an exclusion list in the UI.
I expected the second backup to check things out, see that pretty much nothing had changed, and not take up much more room, if any, but it ran the disk out of space.
Settings for the storage:
- Password protected
- 5:2 erasure coding
- Copy-compatible with the B2 backup
- 210,529 chunks
I kept the same config as I set for the B2 backup, but I realize that the 5:2 erasure coding probably isn’t necessary. And yeah, I get that an old 2.5" external USB disk is not a great backup target; I’m not planning on relying on it, this was mainly a test.
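For context on why I suspect the 5:2 setting matters here, my rough math (assuming 5:2 means 5 data shards plus 2 parity shards per chunk, so parity adds 40% on top of whatever the chunks compress to):

```python
# Back-of-envelope storage estimate for 5:2 erasure coding.
# Assumption: "5:2" = 5 data shards + 2 parity shards, so each
# chunk is stored at (5+2)/5 = 1.4x its compressed size.
data_shards, parity_shards = 5, 2
overhead = (data_shards + parity_shards) / data_shards  # 1.4

source_tb = 1.5  # rough size of the first backup's chunk data
stored_tb = source_tb * overhead

print(f"~{stored_tb:.2f} TB on disk for {source_tb} TB of chunks")
# A nominal 2 TB drive is only ~1.8 TiB usable, so the parity
# overhead alone leaves almost no headroom before anything else
# (extra chunks, a re-upload, temp files) fills the disk.
```

If that assumption about the shard split is right, the first backup alone would have nearly filled the drive, which might explain why the second run tipped it over.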
Questions:
- Is this expected due to the way Duplicacy backs things up?
- Should the incremental backup have taken many hours?
- Did it not pick up the exclusions list maybe?
- Thoughts? Recommendations? Is this a bug or expected?
I’m a little scared to fire off an incremental to B2 as now I’m not sure what to expect.