Can we skip locked or failing files (without VSS)?

I think this may be a feature request, but I’m putting it under support in case there’s already a way around this.

Duplicacy is failing for me because of some locked files. This happens even with the -vss option (though I’d prefer not to use it at all, so that a non-admin user can run the backup jobs). I know which files are locked: they’re always process-specific memory-mapped files that I don’t particularly care about backing up.

I do have the option of messing around with include/exclude patterns to ignore these, but I’d prefer a way to tell Duplicacy to just ignore any read or lock errors on a file and move on. Currently it terminates with exit code 100:

Storage set to \\srv07bck01\backup$\SRV12UTIL01\duplicacy
No previous backup found
Creating a shadow copy for C:\
Shadow copy {EF0B671F-3CEF-4AED-826E-EC046AE529AB} created
Indexing C:\Backups\BackupSources
Incomplete snapshot loaded from C:\Backups\BackupSources/.duplicacy/incomplete
Listing all chunks
Skipped 38127 files from previous incomplete backup
Skipped chunk 2 size 3507432, 3.34MB/s 02:02:11 0.0%
...
Skipped chunk 27 size 1861202, 124.47MB/s 00:03:17 0.5%
Failed to read 0 bytes: read C:\Backups\BackupSources/.duplicacy\shadow\\Backups\BackupSources/repo-home/var/cache/Configs/repo/data077991.bin: The process cannot access the file because another process has locked a portion of the file.

The problem with include/exclude filters is that these filenames change and can appear in different locations (alongside files we do want backed up). The other option for us is to do some dev work on the components that create these files in the first place, so they all go into the same folder and/or use a consistent filename.
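
For reference, what we’d have to maintain in .duplicacy/filters would be something along these lines (the regex and path are only illustrative, based on the one file shown in the log above):

# illustrative only: exclude the process-specific cache files wherever they show up
# e: is an exclude-by-regex pattern; files matching no pattern stay included
e:(^|/)var/cache/Configs/repo/data[0-9]+\.bin$

and keeping a pattern like that in sync with whatever the components generate is exactly the maintenance we’d like to avoid.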

Thanks,
-Alex

related: Error: The process cannot access the file because another process has locked a portion of the file. · Issue #309 · gilbertchen/duplicacy · GitHub

Thanks, I’ll mark this as the answer since it’s a known issue…

What you should also do is comment on the GitHub issue and post the Duplicacy log from the OP.
Not sure if this would help fix the bug per se, but it would show that more people are having this problem.

Done, details added to issue.

I think it is certainly possible to skip an unreadable file when no bytes have been read. If the read error happens in the middle of reading, then Duplicacy would have no choice but to quit immediately.
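
Roughly, the distinction is the one in this simplified sketch (illustrative Go only, not the actual Duplicacy code; the chunk callback just stands in for the chunker):

// Simplified sketch only, not Duplicacy's actual code. It shows why a file
// whose very first read fails can be skipped, while a failure after some
// bytes have already been consumed forces the backup to stop: partial data
// may already have been fed into the chunk stream.
package main

import (
	"errors"
	"fmt"
	"io"
	"os"
)

var errAbortBackup = errors.New("read failed mid-file, aborting backup")

// backupFile reads path block by block. It returns skipped=true when the
// file cannot be opened or its first read fails, and errAbortBackup when a
// read fails after data has already been handed to the (stand-in) chunker.
func backupFile(path string, chunk func([]byte)) (skipped bool, err error) {
	f, err := os.Open(path)
	if err != nil {
		return true, nil // cannot even open: log and skip
	}
	defer f.Close()

	buf := make([]byte, 64*1024)
	total := 0
	for {
		n, readErr := f.Read(buf)
		if n > 0 {
			total += n
			chunk(buf[:n]) // partial data is now part of the chunk stream
		}
		if readErr == io.EOF {
			return false, nil // whole file read
		}
		if readErr != nil {
			if total == 0 {
				return true, nil // nothing consumed yet: safe to skip
			}
			return false, errAbortBackup // cannot un-consume partial data
		}
	}
}

func main() {
	for _, path := range os.Args[1:] {
		skipped, err := backupFile(path, func([]byte) { /* feed chunker */ })
		if err != nil {
			fmt.Fprintf(os.Stderr, "%s: %v\n", path, err)
			os.Exit(100)
		}
		if skipped {
			fmt.Printf("skipped unreadable file: %s\n", path)
		}
	}
}

Once any bytes from a file have entered the chunk stream there is no clean way to back them out, which is why only a failure on the very first read can be skipped safely.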

As we stated in the bug: I think Duplicacy should continue backing up all available files, and just note those which could not be read. That includes files which are only “half-read”: those should be added to the “cannot backup” log, but the backup itself should not be terminated.

+1. Has there been any work done on this yet?
