Multiple backup sets from a common root/repository - or multiple prefs - or multiple filters :)

So far, if I understand everything correctly, we can only have one backup set per folder that a repository is initialized in?
We can either just run init and get the prefs stored in the .duplicacy folder at the root of the repository, or use the -pref-dir option to save the preferences/filters etc. for the backup to a location other than the root of the repository. But then we get a .duplicacy file pointing to the folder defined using -pref-dir. That's probably meant to allow keeping all the backup prefs close to each other, which I quite like.

But I think we’re still limited to a single backup configuration per repository root folder, or did I miss something? We can’t just add a -pref-dir to all the other commands and thereby have multiple backups/filter lists for the same root folder?

In my current situation, f.i., most of my relevant data is in a common/projects folder - but within this I have active folders I change multiple times a day and others that may be stable for weeks. Another case are the Outlook .pst files: even after a simple mail check it’s usually around 100MB that has to be uploaded to the storage, so a backup every hour is a bit demanding compared with the rest of the changed files, which might just be a few kB in a .txt file.

Please tell me if I missed something, or, if not, have you thought about adding the option to provide the -pref-dir for all commands?

Another option would be to just add a -filter option to the backup command, so instead of always using the same filters file it would be possible to use different filters to define the files included in the backup (which, I guess, is basically the same as multiple backup sets, just rooted in a common repository)?

I think the -filter option is the best. Can you create a github issue for this feature?

In general, it’s no problem to modify the filters before any given execution of a backup command in a repository?

This would mean, given I experiment mostly with the CLI version anyway, I could already fake this by creating a few filter files, like f.i. filters_hour, filters_day, filters_media … and, before starting a backup, I could just copy the chosen filters_* file to .duplicacy/filters and then start the backup?
Am I missing something? Any potential problems? Done in a batch file this would be as simple as the regular backup using the CLI - so nothing additional to think about after the initial setup.
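
Something like this, roughly (just a sketch to illustrate the idea - the drive, folder and filters_* names are made up, and it assumes the prefs live in the default .duplicacy folder at the repository root):

@echo off
rem backup_hourly.bat - hypothetical example, adjust paths to your own setup
cd /d C:\projects
rem overwrite the active filters file with the hourly variant
copy /Y .duplicacy\filters_hour .duplicacy\filters
duplicacy backup -stats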

You can do that, but it is too error prone. If there is an error in copying over the filters file then all of a sudden you’re backing up a completely different set of files. A -filters option for the backup command would be much safer.

Of course you’re right, any potential source of errors should be removed if possible. It’s just me still toying around with Duplicacy to see what’s possible already and how I’ll use it eventually. Also I don’t expect any feature request to be implemented at all and even less in a short timeframe - so right now there’s either this workaround or not use Duplicacy this way at all.

I’ll still add a feature request / issue on GitHub.

Was this option added? I also have a scenario where I would like to have different files backed up based on storage URL. For example, a different data-set for files backed up to the cloud vs files that are backed up to local storage. I do not see a “-filters” option for backups on the wiki page.

Kind of - please check the -repository option added with version 2.1.1.

You can find additional information in this thread: "repository" init and add options
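
For reference, the init call then looks roughly like this (only a sketch - the snapshot id, paths and storage URL are placeholders):

cd /path/to/prefs/folder
duplicacy init -repository /path/to/the/files/to/back/up my-snapshot-id <storage url>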

Thank you for the response. I am not totally sure I understand the use of the “-repository” flag, especially now that I have already run the init command and performed multiple backups.

In my case, I ran the init command from the “/” root directory and specified a “pref-dir”. In my root directory a .duplicacy file was created, and the contents of that file are a single line: the path that was configured with the “pref-dir” option.

Now in my “pref-dir” directory I have my preferences file and filters file.

Now, I would like to have two different backups of the “/” root file system depending on the storage. Larger files that are not totally critical could be excluded from cloud backups, but would be included when backing up to local NAS storage.

So my solution now, which is not great, is to have multiple filter files and to make the actual filters file a symbolic link to one of them. Before running the duplicacy command I point the filters link at the filter file I want to use, based on what I am trying to do.
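
In other words, something like this (a rough sketch with made-up names - it assumes the pref dir layout described above and that both storages were added under the names "cloud" and "local"):

cd /path/to/pref-dir
ln -sfn filters_cloud filters     # or filters_local, depending on the target
cd /
duplicacy backup -storage cloud -stats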

Now that everything is set up, I am not 100% sure how I add the “repository” option to my existing preferences file. Do I delete the .duplicacy file in the “/” root directory, add the “repository” option in the preferences file to contain the path of a unique directory, and then create a filters file in each of those directories?

I like the idea of either specifying the filters file on the command line or having a “filters” option in the preferences file.

Unfortunately I don’t know if it would work reliably to just modify your existing backup to use the -repository option. In theory I’d guess it should (i.e. by manually modifying the preferences file), but I did not try anything like that; I just set up new backups to use the option and kept running the old ones until the new ones had done a full run.

The folder setup I use, in the most simple form, looks like this:

backupprefs/
+full/
+activeprojects/

I cd into full/, run the init command with no -pref-dir but with the -repository option pointing to the root of my repository. Then the same for activeprojects/ with the same repository root.
The result is that in both folders a new .duplicacy folder with the preferences, filters etc. files is created, and the preferences file contains the path from the -repository option given during init.
Next step is to adjust the filters file in both of these new .duplicacy folders to my needs.

After this setup, all that’s required to run one of those backups is: cd into full/ and execute duplicacy backup.
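
In command form the whole setup looks roughly like this (a sketch only - the snapshot ids, the repository path and the storage URL are placeholders for my real ones):

cd backupprefs/full
duplicacy init -repository /path/to/repository-root full-backup <storage url>
# then adjust backupprefs/full/.duplicacy/filters

cd ../activeprojects
duplicacy init -repository /path/to/repository-root active-projects <storage url>
# then adjust backupprefs/activeprojects/.duplicacy/filters

# later, to run one of the backups:
cd backupprefs/full
duplicacy backup -stats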

I think it’s even better than using different filters at different times, as it’s more transparent what’s included in which backup.

As mentioned in the beginning, you might get away with just adding/modifying the “repository”: "" entry in the preferences file in your pref dir to point to your repository root (i.e. the content of the .duplicacy file in your repository root), but I have no idea if there’s anything that could potentially go wrong.

Thank You.

I believe I have successfully used the -repository option on a couple of other servers. It would appear to be functional, but having a -filters-file option would be more intuitive in my opinion.

I guess it comes down to personal preferences … even though the -filters request was initially written by me, I’m even more happy with fully separated backup setups.

It also may help if you need to restore from a backup, as you don’t have to think about which filter you may have used for which revision to find the relevant files you’re interested in.