But I really think this would not be a proper way to use the filters file.
Right, because the filters file is more of a permanent setting, whereas the feature I need is specific to every call to the backup command.
I think the best way would be to simply let Duplicacy scan the folders.
We already established that this can be very inefficient.
I am trying to get Duplicacy to do the same thing Time Machine does, i.e. back up changed files every 30 minutes, with a history, but to a remote location instead of to a local disk. Time Machine is a great tool for keeping automatic file history on a file system that's not auto-versioning. I've relied on TM for years, and if Duplicacy can be a replacement for TM, with the additional option to save to "Cloud" storage, it could be quite successful on the Mac.
Ever since I got to know Duplicacy about 2 or 3 years ago, I have been looking for something with the feature set that TM offers, but nothing comes close (Crashplan did, but it has since dropped support for non-business users). There's Arc for macOS, which is rather popular and, like Duplicacy, able to store to cloud servers, but it also lacks FSEvents support. So it is basically as inefficient as Duplicacy when backing up a folder tree with more than a few tens of thousands of files, though with a better user interface. To beat Arc, Duplicacy needs to be more efficient, and adding the feature I'm looking for would make a big difference. Sure, the UI is still not as user-friendly (i.e. not Mac-like), but with the web service about to come out, it'll look great. You could even wrap that into a Mac container app, thereby replacing the current Qt(?) app - I assume that's what you're planning anyway.
So, scanning a million files every 30 minutes is inefficient and unnecessary when I can provide the location of the changes.
In fact, to be more suitable for use with FSEvents, I'd need two modifiers for the backup command:
- I specify a set of dirs whose immediate contents have changed (shallow scan)
- I specify a set of dirs whose contents have changed at any depth (-> deep scan)
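To make the idea concrete, an invocation could look like this. The flag names here are purely illustrative - neither exists in Duplicacy today:

```shell
# Hypothetical flags, not part of the current backup command:
# -changed-dir:  scan only the immediate contents of this directory
# -changed-tree: scan this directory and everything below it
duplicacy backup -changed-dir "Documents/Projects" -changed-tree "Documents/Photos"
```

An FSEvents daemon would collect the changed paths since the last run and pass them in via these flags, so everything else in the repository is skipped.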
I'd expect the backup scanner to dive directly into these provided paths, skipping all others. That should be easy to add, as I'm sure there's already a function that scans a dir (and its subdirs). So all that needs adding is a parameter telling that function whether to descend into deeper dirs or not, and then it can be called with the provided paths.
If you don't want to do that yourself, would I be allowed to modify the source accordingly, and would you accept it into the main branch, provided I break nothing? Then I'll write the FSEvents daemon and we'll see how it performs.