I’ve had a long-term plan of 1) backing up all the computers in our household to my home server and then 2) backing that server up to the cloud.
In effect, I see the backup on the home server as covering accidental file deletion or a hard drive failure, whilst backing up the repo on the server to the cloud covers bigger disasters like theft or fire.
I finally seem to have part one sorted with Duplicacy (having had abortive attempts with urbackup, borgbackup, duplicati, and partial success with duplicity…) but am now wondering about part two: how to back up the repo on my home server to the cloud?
I could add a second storage for each machine, but my main backup already takes ~30 minutes and as far as I’m aware Duplicacy just repeats the whole backup for each storage. So I’m more inclined to copy the repo on the home server to the cloud, something that can run at night when no one needs the bandwidth or CPU cycles.
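For reference, my understanding is the per-machine approach would look something like this on each client (the storage name and B2 URL are just placeholders of mine):

```
# Add the cloud as a second, independent storage for this machine
duplicacy add cloud my-laptop b2://my-backup-bucket

# Each backup then has to run once per storage, doubling the work
duplicacy backup -storage default
duplicacy backup -storage cloud
```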
The repo is on a ZFS pool, so replicating it with ZFS send/receive would be cool, but cloud storage that can receive ZFS streams seems very expensive at the moment.
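If anyone has made that work cheaply, this is roughly what I have in mind, assuming a dataset called tank/backups and a remote host that can run zfs receive (names and snapshot labels are made up):

```
# Snapshot, then send only the delta since the previous snapshot
zfs snapshot tank/backups@2024-06-02
zfs send -i tank/backups@2024-06-01 tank/backups@2024-06-02 \
  | ssh user@cloudhost zfs receive -F remotepool/backups
```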
Using ‘duplicacy backup’ on the duplicacy repo seems nonsensical… so I’m wondering about ‘duplicacy copy’. Would it be bad to repeatedly run ‘copy’ to keep a cloud-based copy of my repo?
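If ‘copy’ is the right tool, I think the setup on the server would be roughly this (the storage names and bucket URL are placeholders, and I may have the flags slightly wrong):

```
# Add a copy-compatible cloud storage; -copy makes it share chunk
# parameters with "default" so 'duplicacy copy' can work between them
# (-bit-identical additionally keeps chunk files identical byte-for-byte)
duplicacy add -copy default -bit-identical cloud server-repo b2://my-offsite-bucket

# Then, nightly, push any new snapshots to the cloud, e.g. from cron:
# 0 3 * * * cd /path/to/repo && duplicacy copy -from default -to cloud
duplicacy copy -from default -to cloud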
Failing that, I may try a different tool, like Tarsnap or even simple rsync.
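The rsync version at least looks simple, since a Duplicacy storage is just a directory tree of chunk files (the paths here are made up):

```
# Mirror the local storage directory to a remote host over SSH;
# --delete removes chunks that 'duplicacy prune' deleted locally
rsync -az --delete /tank/duplicacy-storage/ user@cloudhost:/backups/duplicacy-storage/
```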
Interested to hear if anyone else is doing anything like this and whether ‘copy’ is suitable for the job.