Instead of specifying the same long list of exclusions in each backup task for the same source, create one exclusion list file and include it in both. That is, the exclusions in both tasks will look like this:
@/Users/james/exclusions.txt
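For illustration, the shared file might contain something like this (a sketch assuming Duplicacy-style filter syntax, where `#` starts a comment and `-` excludes a path; the patterns are made-up examples, so check the filter documentation for the exact wildcard semantics):

```
# exclusions.txt: patterns shared by both backup tasks (illustrative only)
-Library/Caches/
-.Trash/
-Downloads/
```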
Alternatively, you can keep the exclusion metadata alongside the data itself, using .nobackup files. Then you don’t need to worry about maintaining a separate exclusion file.
It would be ideal if I could define a single backup routine (i.e. backup these files from this folder, etc.) and define/schedule multiple locations to send that backup job to. If that’s possible then I have no issue with having multiple locations.
No, that is not possible. But the supported way is not much different.
Define two backup targets, and define two backup schedules, one to each target. The schedules can be the same or different; each one is literally just a time, a source, and a destination.
It’s not worth going the unsupported route just to save those few mouse clicks.
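For reference, this is roughly what the same setup looks like with the CLI; in the Web UI it is simply two storages plus two schedules. The storage name `b2`, the snapshot id `mybackup`, and the paths/URLs below are placeholders:

```
# One repository, two storages (the first one is named "default")
cd /path/to/source
duplicacy init mybackup /Volumes/NAS/backups
duplicacy add b2 mybackup b2://my-backup-bucket

# Each schedule then simply runs one of these
duplicacy backup -storage default
duplicacy backup -storage b2
```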
I sleep better at night with a copy of my data in cold storage.
Essentially, you trust a solitary hard drive with no redundancy more than a commercial datacenter. That may feel better, but it does not reflect reality.
Cloud + NAS is a great starting point
It’s a destination, actually.
but one ransomware attack could easily wipe out both of those
It’s impossible with the right setup. Server-side snapshots, immutable buckets with restricted keys, and other common techniques eliminate that risk entirely.
With your hard drives, though, who’s to say a directly connected disk will survive? It will be encrypted first, and you have no recourse like snapshots. All the ransomware needs to do is quietly encrypt a single config file on the destination and your backup turns into a pumpkin. If anything, connecting a data drive to an infected computer is a bad plan in itself, and you can’t really know when the computer is infected.
(I do have sufficient planning against ransomware but it’s still possible.) Having cold copies of my data in an offsite location provides an additional layer of protection if the worst happens.
It gives an illusion of safety without improving anything, which is arguably worse.
Think about it this way: will, say, a credit union or a bank mess with connecting hard drives to a server and moving them around, or will they back up to AWS with carefully configured credentials, snapshotting, and bucket immutability that eliminates the risk of losing data to such attacks? And they are arguably a higher-value target. On top of that, a commercial datacenter will keep your data safer, cheaper, and for longer than a bunch of loose disks and a chore.
In other words, in the absence of better reasoning, do what corporations do. And they definitely don’t move a bunch of drives around.
But a few people have given you quite a bit of reasoning here, so there is even less reason to stick with the “feel good” setup. Unless that is the goal in itself: a means to better sleep, with the understanding that it does not actually improve data safety.
Search this forum; there is a detailed explanation of how to configure an immutable backup to B2. Until Duplicacy supports Glacier storage, B2 is the most cost-effective online target.
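As a rough illustration of what the immutability piece involves (not a substitute for that guide), here is a minimal Python sketch using B2’s S3-compatible API via boto3. The endpoint, credentials, bucket name, and 30-day retention period are placeholder assumptions, and the bucket must already have File Lock (Object Lock) enabled:

```python
import boto3

# Sketch only: endpoint, key id, application key, and bucket name are placeholders.
s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.us-west-004.backblazeb2.com",  # your B2 region endpoint
    aws_access_key_id="KEY_ID",
    aws_secret_access_key="APPLICATION_KEY",
)

# Set a default retention rule: objects cannot be deleted or overwritten for
# 30 days even with valid credentials, which is what defeats server-side
# deletion by ransomware. File Lock must already be enabled on the bucket.
s3.put_object_lock_configuration(
    Bucket="my-backup-bucket",
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": 30}},
    },
)
```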