Just purchased a couple of licenses for Duplicacy after testing it out and finding it to be a very robust solution. Now I’m working on getting my backups all ship-shape.
I want a backup process that has my various PCs/Macs backing up to two places - a Linux-based server that is only sometimes available (when they’re at my house, which they aren’t a lot of the time), and the cloud (I’ve chosen B2 on that side of things). The server itself also backs up its own files to the same two kinds of storage - locally (to a second HDD) and to B2.
I see two ways to approach this: either every machine backs up directly to both locations, or everything backs up to the server, which then copies its backups to B2. Both could work, but the former seems more efficient in a whole lot of ways.
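If I’ve got the CLI right, the two options would look roughly like this (storage names, URLs, and the “pc1-docs” snapshot ID are just placeholders for my setup):

```
# Option 1: each machine backs up directly to both storages.
duplicacy init -storage-name server pc1-docs sftp://me@myserver/backups
duplicacy add b2 pc1-docs b2://my-backup-bucket
duplicacy backup -storage server   # only possible while at home
duplicacy backup -storage b2       # possible from anywhere

# Option 2: machines back up to the server only, and the server
# copies everything (its own backups included) on to B2.
duplicacy copy -from default -to b2
```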
First off, is the copy approach even possible? An SFTP/network-drive storage uses a different structure from a cloud storage, so does “copy” even work between the two?
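From what I can tell in the docs (and I’d love confirmation), “copy” doesn’t care about the backend, as long as the destination storage was added as copy-compatible with the source:

```
# My reading of the docs: the backends don't have to match, but the
# destination has to be added with -copy so the chunk parameters line up.
# ("server-files" is just an example snapshot ID.)
duplicacy add -copy default b2 server-files b2://my-backup-bucket
duplicacy copy -from default -to b2
```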
I have also already seeded my backups to Backblaze (I pulled the server’s HDD and ran a normal backup of its own files). If I switch to backup-then-copy, do I need to re-seed with a “copy” command, or can I somehow get the existing backups back in sync in a copy-compatible fashion?
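For context, my seeded B2 storage was initialized on its own, not with “-copy”. If I understand correctly, a copy-friendly setup would have needed something like this at creation time (hypothetical, since I didn’t do it this way):

```
# What I apparently should have done: create the B2 storage as
# copy-compatible with the local one (and bit-identical, so directly
# seeded chunks match exactly).
duplicacy add -copy default -bit-identical b2 server-files b2://my-backup-bucket
```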
Are there differences in bandwidth usage or speed between a copy and a backup? My biggest limiter is bandwidth at home, where I have < 1 Mbit/s upstream.
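For what it’s worth, on a normal backup I was planning to throttle uploads like this so B2 doesn’t saturate the link (the 60 kB/s figure is just a guess for my sub-1-Mbit/s upstream); I don’t know whether “copy” offers an equivalent knob:

```
# Cap the B2 upload at ~60 kB/s; my upstream tops out under ~125 kB/s.
duplicacy backup -storage b2 -limit-rate 60
```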
Just looking to understand all the pieces and parts here.