Go is pretty much a cross-platform language, and Duplicacy already has the OS-specific parts written. So you could compile it on a Mac, make your backups there, then compile it on Windows and restore from the same backup storage.
But really, the main gist is: you can compile it yourself if you ever want or need to - and the source code details exactly how those chunks are encoded and decoded - but just hanging onto the CLI executables is enough to restore your data, because they're standalone.
The well-known CrashPlan backup software/service also uses chunk-based backup storage. Except it's not open source, and it relies heavily on a database to properly 'index' the backup data.
When they ended their Home-tier service (including the free PC-to-PC option), bam - all that data became inaccessible, because they had tied everyone's backup storage to cloudy accounts and basically deactivated the clients. The problem wasn't the chunk-based storage; it was the proprietary data formats, the cloud dependency, and the total lack of source code. Had there been source code, local backups would still have been restorable.
Duplicacy gives you the source code, doesn't rely on cloudy accounts (except perhaps the Web Edition - though the backup engine certainly doesn't), and it's designed so it doesn't need a complicated database to index anything.
So long as the integrity of your backup storage is good, and you have the decryption passphrase, you can restore long-term backups without relying on anybody.
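To illustrate why no index database is needed, here's a minimal sketch of the general content-addressed idea - this is not Duplicacy's actual chunk format (real chunks are also compressed and encrypted), just the principle that naming each chunk after the hash of its contents makes the storage self-describing:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"os"
	"path/filepath"
)

// saveChunk stores a chunk under the hex SHA-256 of its contents and
// returns that name; the filename itself acts as the index entry.
func saveChunk(storeDir string, data []byte) (string, error) {
	sum := sha256.Sum256(data)
	name := hex.EncodeToString(sum[:])
	return name, os.WriteFile(filepath.Join(storeDir, name), data, 0o644)
}

// loadChunk reads a chunk back and re-verifies it against its name,
// which is how corruption in the storage gets noticed.
func loadChunk(storeDir, name string) ([]byte, error) {
	data, err := os.ReadFile(filepath.Join(storeDir, name))
	if err != nil {
		return nil, err
	}
	sum := sha256.Sum256(data)
	if hex.EncodeToString(sum[:]) != name {
		return nil, fmt.Errorf("chunk %s is corrupt", name)
	}
	return data, nil
}

func main() {
	store, err := os.MkdirTemp("", "chunks")
	if err != nil {
		panic(err)
	}
	// A "snapshot" is conceptually just an ordered list of chunk names.
	var snapshot []string
	for _, piece := range [][]byte{[]byte("file part 1"), []byte("file part 2")} {
		name, err := saveChunk(store, piece)
		if err != nil {
			panic(err)
		}
		snapshot = append(snapshot, name)
	}
	// Restoring needs nothing but the storage and that list of names.
	for _, name := range snapshot {
		data, err := loadChunk(store, name)
		if err != nil {
			panic(err)
		}
		fmt.Printf("%s... -> %q\n", name[:12], data)
	}
}
```

With that kind of layout, checking integrity is just re-hashing files and comparing against their names, which is essentially what keeps a plain pile of chunk files restorable years later.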
You don’t have to use ZFS but, as with any backup solution, you should test your backups. Duplicacy has pretty good integrity checks but storage isn’t infallible. Just don’t rely on a single NTFS/HFS+ drive as your only other copy.
Duplicacy lets you copy backup storage to multiple destinations, so it's good practice to have not only multiple copies but also multiple types (i.e. not just Duplicacy) - the classic 3-2-1 rule. By copying backups from one Duplicacy storage to another - say from local to cloud - Duplicacy effectively validates the data as it's decrypted and re-encrypted onto the destination. Set up email alerts etc., conduct regular test restores, and Duplicacy can be a very robust solution.
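As a rough illustration of why a storage-to-storage copy doubles as a verification pass - again a simplified sketch rather than Duplicacy's actual copy command (real chunks also go through decryption and re-encryption, and the directory names here are made-up placeholders) - copying a content-addressed chunk store can re-hash every chunk in transit and refuse to propagate anything that no longer matches its name:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"os"
	"path/filepath"
)

// copyStore copies every chunk file from src to dst, re-hashing each one
// in transit so silent corruption in src is caught instead of propagated.
func copyStore(src, dst string) error {
	if err := os.MkdirAll(dst, 0o755); err != nil {
		return err
	}
	entries, err := os.ReadDir(src)
	if err != nil {
		return err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		data, err := os.ReadFile(filepath.Join(src, e.Name()))
		if err != nil {
			return err
		}
		sum := sha256.Sum256(data)
		if hex.EncodeToString(sum[:]) != e.Name() {
			return fmt.Errorf("refusing to copy corrupt chunk %s", e.Name())
		}
		if err := os.WriteFile(filepath.Join(dst, e.Name()), data, 0o644); err != nil {
			return err
		}
	}
	return nil
}

func main() {
	// "local-chunks" and "offsite-chunks" are hypothetical example paths.
	if err := copyStore("local-chunks", "offsite-chunks"); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
```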
If you back up directly to your own cloud storage, you'll have less to worry about in terms of bit rot (in theory), but you should always have more than one copy.
This comes back to your initial worry about being locked into Duplicacy's chunk format, and why the 2 in 3-2-1 is really about having different methods - not just different media.
My advice is to employ another backup method in addition to Duplicacy, such as disk-image-style backups. I use Veeam Agent for Windows, for example. They have a client for Linux too, though it doesn't look like there's one for Mac at present; on Mac, however, there's Time Machine and things like Acronis True Image. It wouldn't be for long-term archival, but it could complement Duplicacy in case of emergency. Remember, Duplicacy is backup software, and backups don't necessarily make good archives. Consider selectively archiving relevant data using appropriate means. (For instance, I don't use Duplicacy to keep copies of large media files - they get Rclone-copied to the cloud and checksummed along with lots of metadata. Add as many remote copies as you can muster.)