I would go a step further and do just this:
`duplicacy_win_x64_2.7.2.exe check -fossils`, just to keep it simple. But generally, yes.
To elaborate: this will not validate the content of the files, and relies on you trusting Wasabi (or whatever other backend you use) not to corrupt them. There have been issues in the past with various backends (OneDrive kept a partial file after a failed upload, leaving a corrupted chunk; Backblaze returned bad data via the API; etc.). Those issues were the backends' fault and have since been fixed; you can search this forum for the references. Similar intermittent issues may appear with other backends in the future, and ultimately the only reliable test of whether you can restore your files is to actually restore them (which `check -chunks -files` comes very close to emulating).
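For reference, a rough sketch of the two extremes (run from inside an initialized repository; whether you also pass `-fossils` depends on your prune schedule):

```shell
# Cheap: only verifies that every chunk referenced by the snapshots
# exists on the storage. Downloads metadata only.
duplicacy check -fossils

# Exhaustive: -chunks downloads every chunk and verifies its integrity;
# -files additionally verifies that each file can be reassembled from
# its chunks. This effectively downloads the whole dataset.
duplicacy check -fossils -chunks -files
```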
But running that kind of test periodically is neither feasible nor productive (each run results in at least a full download of the entire dataset), and the only reason to do it anyway would be that the storage provider cannot be trusted. Why use such a provider at all? Switching to another one would be the saner choice.
The thing is, one has to draw the line somewhere: trust that RAM works, that the CPU computes correctly, that TCP does not corrupt packets, that the encryption is not broken, and that storage companies know what they are doing.
I would also trust duplicacy to work properly, but just checking for the presence of chunks is cheap, so we might as well do it. Personally, in many years of [ab]using Duplicacy, every issue the check command found for me was either self-inflicted or a false positive (for example, when prune deletes chunks but then fails to delete the snapshot file; that snapshot file will obviously fail the next check). This is why, IMO, duplicacy should delete the snapshot file first and only then delete the chunks. Maybe even make it a two-step process: fossilize the snapshot, have each subsequent prune delete chunks that belong only to fossilized snapshots, and delete the fossil once all of its chunks are gone. But that's a different conversation.
An alternative solution would be to never prune and never check: if you don't let duplicacy delete anything, nothing can get corrupted. Storage is cheap… but then again, all of this is covered by the simple and cheap check that only requires downloading a small amount of metadata.
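If you do keep pruning, the cheap check is trivial to schedule. A sketch, assuming the CLI is on PATH and the repository lives at `/backup/repo` (both paths hypothetical):

```shell
# crontab entry: run the metadata-only check weekly, Sunday at 03:00,
# appending results to a log for later review.
0 3 * * 0 cd /backup/repo && duplicacy check -fossils >> /var/log/duplicacy-check.log 2>&1
```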