Backup fails when coming across some files already in use

Kind of. It should exclude every folder whose name ends in the word “Cache”. Note the space in the filter and the capitalization.
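For reference, a minimal sketch of what such a line could look like in the `.duplicacy/filters` file, assuming Duplicacy’s documented wildcard rules (`*` stops at directory separators, `**` crosses them, and `#` starts a comment):

```
# Exclude any folder, at any depth, whose name ends in the word "Cache".
# Note the leading space before "Cache" and the capital C: patterns are
# case-sensitive, so this will not match "cache" or "CACHE".
-** Cache/
```

The trailing slash makes the pattern match directories rather than files, so everything under a matching folder should be skipped as well.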

I had to keep adding lines again and again, and the backup still fails. Strangely, though, there are no errors in the log:

```
Running backup command from D:/Cache/localhost/0 to back up C:/Backup Symbolic Links
Options: [-log backup -storage Google_Cloud_admin_gdteam -vss -threads 16 -stats]
2020-12-17 00:00:06.554 INFO REPOSITORY_SET Repository set to C:/Backup Symbolic Links
2020-12-17 00:00:06.725 INFO STORAGE_SET Storage set to gcd://Duplicacy Backup
2020-12-17 00:01:06.584 INFO BACKUP_START Last backup at revision 7 found
2020-12-17 00:01:06.585 INFO VSS_CREATE Creating a shadow copy for C:
2020-12-17 00:01:10.984 INFO VSS_DONE Shadow copy {42E92F72-6FEA-41F1-8F9D-ECD0B17CC263} created
2020-12-17 00:01:11.022 INFO BACKUP_INDEXING Indexing C:\Backup Symbolic Links
2020-12-17 00:01:11.022 INFO SNAPSHOT_FILTER Parsing filter file \\?\D:\Cache\localhost\0\.duplicacy\filters
2020-12-17 00:01:11.022 INFO SNAPSHOT_FILTER Loaded 13 include/exclude pattern(s)
2020-12-17 00:13:00.080 INFO BACKUP_THREADS Use 16 uploading threads
```

That’s it. That’s the whole log. The backup menu simply says “failed x hours ago”, and clicking the failure message shows the log above.

If you want to exclude D:/Cache, the exclude pattern should be `-Cache/` or `-Cache/*`. Patterns are case-sensitive and are matched against paths relative to the repository root, so since Cache is a first-level child, `-/Cache/` (with a leading slash) would not work.
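As a concrete sketch, the relevant line in `.duplicacy/filters` might look like this (again assuming the standard syntax, where paths carry no leading slash and `#` starts a comment):

```
# Exclude the first-level Cache folder and everything under it.
# The path is relative to the repository root, so no leading slash.
-Cache/
```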