Why does this filter block every backup, no matter what?

Hi, I wanted to do a really simple backup. A simple filter. Really simple.
So, before fully studying, digesting and understanding Improving the instructions for include/exclude patterns (filters), my assumption was that with +* either at the top or at the bottom, at least one of the two orders should back up something. But no: no files are backed up in either case. The filters destroy it. Why? I'm using NTFS symlinks because Duplicacy doesn't offer a way to organize scattered folders into one backup set.

Folders to back up:

c:\backup\duplicacy\definitions\MainBackup\C\Symlink1
c:\backup\duplicacy\definitions\MainBackup\C\Symlink2
c:\backup\duplicacy\definitions\MainBackup\M\Symlink1
c:\backup\duplicacy\definitions\MainBackup\M\Symlink2
c:\backup\duplicacy\definitions\MainBackup\S\Symlink1
c:\backup\duplicacy\definitions\MainBackup\S\Symlink2

I.e., the root backup folder is MainBackup.
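
For context, those symlinks are created with mklink /d, roughly like this (the target paths below are just placeholders; mine point to various data folders):

mkdir c:\backup\duplicacy\definitions\MainBackup\C
cd /d c:\backup\duplicacy\definitions\MainBackup\C
mklink /d Symlink1 D:\SomeDataFolder1
mklink /d Symlink2 D:\SomeDataFolder2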

My filters are:

+*
-*/ntuser.dat
-*/appdata/local/*

No files are backed up.

Another order:

-*/ntuser.dat
-*/appdata/local/*
+*

No files are backed up.

My only workaround is to not use filters at all. EDIT: that is not really a workaround either.

Additional question: with +* as the only filter line, it still backs up nothing. Could someone please help me demystify this "complex" filter? My backup now has 9 revisions and zero files. Here is the log of such a run:

Running backup command from C:\Users\Michel/.duplicacy-web/repositories/localhost/7 to back up C:/Disk/WD50/Backup/Definitions/Haupt
Options: [-log backup -storage GoogleOne -threads 3 -stats]
2019-09-07 13:24:19.736 INFO REPOSITORY_SET Repository set to C:/Disk/WD50/Backup/Definitions/Haupt
2019-09-07 13:24:19.766 INFO STORAGE_SET Storage set to gcd://Backup/Duplicacy
2019-09-07 13:24:23.224 INFO BACKUP_START Last backup at revision 8 found
2019-09-07 13:24:23.224 INFO BACKUP_INDEXING Indexing C:\Disk\WD50\Backup\Definitions\Haupt
2019-09-07 13:24:23.224 INFO SNAPSHOT_FILTER Parsing filter file \\?\C:\Users\Michel\.duplicacy-web\repositories\localhost\7\.duplicacy\filters
2019-09-07 13:24:23.225 INFO SNAPSHOT_FILTER Loaded 1 include/exclude pattern(s)
2019-09-07 13:24:24.234 INFO BACKUP_END Backup for C:\Disk\WD50\Backup\Definitions\Haupt at revision 9 completed
2019-09-07 13:24:24.234 INFO BACKUP_STATS Files: 0 total, 0 bytes; 0 new, 0 bytes
2019-09-07 13:24:24.234 INFO BACKUP_STATS File chunks: 0 total, 0 bytes; 0 new, 0 bytes, 0 bytes uploaded
2019-09-07 13:24:24.234 INFO BACKUP_STATS Metadata chunks: 3 total, 1K bytes; 0 new, 0 bytes, 0 bytes uploaded
2019-09-07 13:24:24.234 INFO BACKUP_STATS All chunks: 3 total, 1K bytes; 0 new, 0 bytes, 0 bytes uploaded
2019-09-07 13:24:24.234 INFO BACKUP_STATS Total running time: 00:00:02

This really stinks. I started the Duplicacy web service as admin; same result. But permissions aren't the problem here anyway. It simply refuses to back up and refuses to say why.

I deleted the backup's folder on the storage side under snapshots/. Now the indexing step takes longer, but it still backs up no files. I was so hoping not to have to invest pointless time into backup software :frowning: but here we are already. At least I don't get flooded with warnings as in Duplicati, and the restore was quick. Duplicati loves to start the restore only after hours of database work.

I found out: it does not look behind the symlinks, which is regrettable. I have similar backups where it did look behind the symlinks, and I can see no difference between them. They are all created via mklink /d.

Jeez, I found it: the symlinks need to be in the backup's root directory. I really hope it also follows symlinks that sit deeper in the folder structure, because I use them quite often.
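
What works now is creating the links directly in the repository root instead of one level down, along these lines (link names and target paths are placeholders again):

cd /d c:\backup\duplicacy\definitions\MainBackup
mklink /d C_Symlink1 D:\SomeDataFolder1
mklink /d M_Symlink1 D:\SomeOtherFolder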

Hi,
Duplicacy normally treats Windows symlinks as files.
(Except in the root of a repository, which is a special case.)
(Also, I think symlinks with UNC paths are followed, as mentioned in Version 2.1.1 has been released.)
(Note: I haven't tested symlinks myself.)

This isn't highlighted much in the documentation, but the Duplicacy User Guide has an article on how to Move duplicacy folder/Use symlink repository, which mentions this restriction rather briefly.

There are some good reasons for this too, but also differing opinions; see this topic:

One way to back up your files could be to back up the target data structure via a separate symlink in the repository root, or with a separate repository. One advantage of the second option is that you could back up that repository on a different schedule and with different prune settings, for instance. And of course your case could be too complicated for this if the symlinks point all over.
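
For example (just a sketch; the snapshot ids and retention numbers below are made up, adjust them to your setup), two repositories backing up to the same storage can be pruned with different -keep policies:

duplicacy prune -id MainBackup -keep 0:360 -keep 30:90 -keep 7:30 -keep 1:7
duplicacy prune -id SymlinkedData -keep 0:30 -keep 1:7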

Another thing to look out for is that filters for symlinks (especially in the root) might need special treatment, since they are files, not folders.
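
For instance (untested, and assuming I read the filters documentation correctly that patterns ending in / apply to directories while patterns without a trailing / apply to files), excluding a symlink that Duplicacy treats as a file would look different from excluding a real directory:

# exclude a symlink that is treated as a file (no trailing slash):
-Symlink2
# exclude an actual directory (trailing slash):
-SomeRealFolder/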
