Some newbie questions - backup organisation

I was expecting this to be the way to go in Duplicacy.

So does that mean that, at the moment I include (i.e. whitelist) a subfolder via a filter, all other subfolders are no longer included?
Or do I always have to explicitly include/whitelist subfolders?

Looks like I need to read a bit more :smiley:

Let’s say your folder structure looks something like:

folder/
├── music1/
├── folder-foo/
│   ├── file-a.txt
│   ├── file-b.txt
│   └── file-c.txt
├── music2/
├── folder-bar/
...

The filters file of this repository could be:

+music1/*
+music2/*
-*

The last line will EXclude everything else.
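To spell that out with made-up file names (paths are relative to the repository root), the patterns above read like this for individual paths:

music1/song.mp3 -> matches +music1/* -> included
folder-foo/file-a.txt -> matches neither include, falls through to -* -> excluded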

Take a look:


Ah yeah, I also saw that the web UI is quite helpful. But that is an unexpected order! I’d expect it to be read top down. I.e., without any prior knowledge, I’d use:

-* (Exclude all at first)
+include1/*
+include2/*

Your example:
+music1/*
+music2/*
-*

reads like: Include music1/music2, then exclude everything. I.e., do nothing.

And that’s the way it works!

This way you propose:

the first line would always be found first and the other lines would never be read.

I thought I now understood its way of thinking, but I was proven wrong, because it does nothing. I thought this was the right way of thinking:

Duplicacy goes through all files and folders. I have to think from each individual file/folder perspective. Say “C:\folder1\Iamafile.txt”. Now it looks into filter line 1. Say “+blah”. It does not match. Next filter line. “-*”. Match. File is excluded, no further processing. Next file.

Wrong way of thinking?

So now my situation is this:

M:\
M:\Folder1\
M:\Folder2\
M:\FolderFoo\

I want to include everything. The only thing I want to exclude is M:\FolderFoo. I added via the web UI:

-FolderFoo/
+*

Does nothing. I am disappointed. My whole thinking was wrong. Also I am too stupid to add line breaks here in code tags.

I hope my thinking was right. After failed attempts, I always deleted repositories via the web UI and also deleted storages, and there was a mess of filter files and other files left in .duplicacy-web\repositories\localhost…
That may have been my cause of struggling.

I’ve cleaned that up, restarted httpd, now it is working with my expected filters. Will see at restore time, if it included the folders I want.

In restore: are backups only listed after they are completely finished? It says “No previous backups for this backup ID”.

See Scripts and utilities index.

I find myself making more use of NTFS links (symlinks) in order to include stuff that is not at the same file system level. I’m OK with this. The average user could not do this. A more flexible, organisation-based include system would be nice.

I think I was similarly confused. Did you see this topic:

It’s because @Tortuosit was reading like this:

While the correct logic / reading is:

1. I (Duplicacy :grinning:) am backing up a file whose path matches music1/*. OK! I stop reading the filter patterns and back it up.
2. I’m backing up a file whose path matches music2/*. OK! I stop reading the filter patterns and back it up.
3. I’m backing up a file whose path matches neither music1/* nor music2/*. Ah! But there is a third pattern that matches everything (*), and it says to exclude, so I won’t do anything. Next file!
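The per-file, first-match-wins reading described above can be sketched in Python. This is a simplified model, not Duplicacy’s actual code: `fnmatch` stands in for Duplicacy’s real pattern syntax, the file names are made up, and the default for unmatched paths is an assumption.

```python
from fnmatch import fnmatch

def is_included(path, filters):
    """Decide whether `path` is backed up under first-match-wins filtering.

    `filters` is a list of (action, pattern) pairs read top to bottom.
    The FIRST pattern that matches decides; reading then stops.
    """
    for action, pattern in filters:
        if fnmatch(path, pattern):
            return action == "+"
    # Assumption for this sketch: paths matching no pattern are included.
    return True

# The filters file from the example above:
filters = [("+", "music1/*"), ("+", "music2/*"), ("-", "*")]

print(is_included("music1/song.mp3", filters))        # True: first line matches
print(is_included("folder-foo/file-a.txt", filters))  # False: only -* matches
```

Note that swapping `-*` to the top would make every path stop at the first line and nothing would ever be included, which matches the “the other lines would never be read” point earlier in the thread.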

And there is an important detail that I took a while to understand when I started using :d:, but it is essential to consider when you are creating your filtering logic:

This is a good example:


Sorry, I hadn’t noticed this question before.

Use four spaces

https://www.markdownguide.org/basic-syntax/#code-blocks

or fenced code blocks

https://www.markdownguide.org/extended-syntax/#fenced-code-blocks


Thanks for the clarification. Yes, I have meanwhile understood that I need to think from the individual file’s perspective, exactly as you wrote. The filtering order makes sense then. But I think it is unusual, though easy as well.

I came from a result list based thinking. Like:

  • At first exclude/blacklist all -> So we have an empty list as a start
  • Then do whitelisting, i.e., add all matching files to the list
  • Then we have the finished list: Start backing up what is in that list

Will read it, thx. Meanwhile I have understood the way filters work here.

Ok, just remember that you will have to place the include/whitelisting patterns before the “exclude/blacklist all” pattern you mentioned in the previous step.

So now it looks like I have not understood. sigh

Filters:
-folder1/
+*

This will not include anything, unless my testing is broken. But from my file’s perspective:

“I am file \bleh.txt”
“Do I match -folder1/ ? No - Continue”
“Do I match +* ? Yes. - Include me”

This backs up nothing, no matter whether +* is at the top or at the bottom. Will study later. The god of pain is arising…

-/ntuser.dat
-/appdata/local/
+*

No included files.

+*
-/ntuser.dat
-/appdata/local/

No included files.

Yes, I know it’s confusing…:wink:

But it’s because the first example was with full EXclusion at the end

and the last example was with full INclusion at the end:

Your understanding is completely correct.

Let me put it in a wider perspective:

Include or exclude parameters for specific folders/files must be placed before wildcard parameters.
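To make that concrete, here are two short sketches (folder names are hypothetical, assuming the standard Duplicacy pattern syntax used earlier in the thread). To back up everything except one folder, the specific exclusion goes first:

-FolderFoo/
+*

And to back up only selected folders, the includes go before the catch-all exclusion:

+Folder1/*
+Folder2/*
-*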

You might want to remove the / at the beginning of the paths, but that doesn’t explain “no included files”. Are you sure that no files are included? And if so, that there are actually files to be included?

Thanks for your reply. I wrote in some other thread: the cause was that I had symlinks to the real backup target NOT at my backup root level.