I think this may be a feature request, but I'm posting it under support in case there's already a way around this.
Duplicacy is failing for me due to some locked files. This happens even with the -vss option (though I'd prefer not to use it at all, so that a non-admin user can run the backup jobs). I know which files are locked: they're always process-specific memory-mapped files that I don't particularly care about backing up.
I do have the option of messing around with include/exclude patterns to ignore these, but I'd prefer a way to tell Duplicacy to simply ignore any file-read or locked-file errors and move on. Currently it terminates with exit code 100:
Storage set to \\srv07bck01\backup$\SRV12UTIL01\duplicacy
No previous backup found
Creating a shadow copy for C:\
Shadow copy {EF0B671F-3CEF-4AED-826E-EC046AE529AB} created
Indexing C:\Backups\BackupSources
Incomplete snapshot loaded from C:\Backups\BackupSources/.duplicacy/incomplete
Listing all chunks
Skipped 38127 files from previous incomplete backup
Skipped chunk 2 size 3507432, 3.34MB/s 02:02:11 0.0%
...
Skipped chunk 27 size 1861202, 124.47MB/s 00:03:17 0.5%
Failed to read 0 bytes: read
C:\Backups\BackupSources/.duplicacy\shadow\\Backups\BackupSources/repo-home/var/cache/Configs/repo/data077991.bin: The process cannot access the file because another process has locked a portion of the file.
The problem with include/exclude filters is that these filenames change, and the files may appear in different locations (locations that may also contain files we do want backed up). The other option for us is to do some dev work on the components that create these files in the first place, so they all go into the same folder and/or use a consistent filename.
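For what it's worth, the closest I can get with filters is a regex exclude in the repository's .duplicacy/filters file, matching the filename pattern anywhere in the tree. This assumes the locked files always look like data<digits>.bin (as in the log above), which I can't actually guarantee, so it's a fragile workaround rather than a fix:

```
# .duplicacy/filters
# e: marks a regex exclude pattern, matched against the file path
e:data[0-9]+\.bin$
```

Even if this catches today's files, any new component writing lock files under a different name would break the backup again, which is why I'd rather have an ignore-errors behavior.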
Thanks,
-Alex