That’s correct. If the file was rewritten by a new export, I’d want Duplicacy to reflect that in the backup set so that it’s correct when restored.
The default behavior (detecting changes by size and timestamp) is fine here because even though the timestamp has changed, the contents still hash to the same file chunk. Duplicacy will see that the chunk already exists in the storage and won't upload a fresh copy. It will upload a new metadata chunk reflecting the new timestamp, but that amounts to less than a megabyte in aggregate even if the backup set includes a few thousand snapshots.
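Under the hood, chunk identity comes from hashing the contents, and a content hash doesn't care about mtimes. Here's a rough sketch of the principle using sha256sum as a stand-in (not the hash Duplicacy actually uses internally, but the idea is the same):

$ echo "foo" > a
$ sleep 1
// Same contents written a second later, so a different mtime:
$ echo "foo" > b
// Identical hashes, so a content-addressed store keeps only one copy:
$ sha256sum a b
b5bb9d8014a0f9b1d61e21e796d78dccdf1352f23cd32812f4850b878ae4944c  a
b5bb9d8014a0f9b1d61e21e796d78dccdf1352f23cd32812f4850b878ae4944c  b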
You can see this in action on a small scale using my Duplicacy Sandbox. See my comments, which begin with //:
// Get set up:
$ make build
mkdir -p storage/default
mkdir -p root
// Create a file and back it up.
$ echo "foo" > ./root/foo
$ make backup
#
# Backing up to default
#
Storage set to /tmp/duplicacy-sandbox/storage/default
Downloading latest revision for snapshot sandbox
Listing revisions for snapshot sandbox
No previous backup found
Indexing /tmp/duplicacy-sandbox/root
Parsing filter file /tmp/duplicacy-sandbox/root/.duplicacy/filters
Loaded 0 include/exclude pattern(s)
Packing foo
Packed foo (4)
// New file chunk, so it gets uploaded:
Uploaded chunk 1 size 4, 4B/s 00:00:01 100.0%
Uploaded foo (4)
Listing snapshots/
Listing snapshots/sandbox/
Listing chunks/
Backup for /tmp/duplicacy-sandbox/root at revision 1 completed
Files: 1 total, 4 bytes; 1 new, 4 bytes
// One new file chunk and three new metadata chunks (the snapshot's file, chunk, and length lists):
File chunks: 1 total, 4 bytes; 1 new, 4 bytes, 13 bytes uploaded
Metadata chunks: 3 total, 331 bytes; 3 new, 331 bytes, 359 bytes uploaded
All chunks: 4 total, 335 bytes; 4 new, 335 bytes, 372 bytes uploaded
Total running time: 00:00:01
// Rewrite the file with the same contents; the file now carries a new timestamp:
$ echo "foo" > ./root/foo
$ make backup
#
# Backing up to default
#
Storage set to /tmp/duplicacy-sandbox/storage/default
Downloading latest revision for snapshot sandbox
Listing revisions for snapshot sandbox
Last backup at revision 1 found
Indexing /tmp/duplicacy-sandbox/root
Parsing filter file /tmp/duplicacy-sandbox/root/.duplicacy/filters
Loaded 0 include/exclude pattern(s)
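// The timestamp changed, so Duplicacy re-reads and re-chunks the file: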
Packing foo
Packed foo (4)
// This chunk is already in the backup set, so no upload:
Skipped chunk 1 size 4, 4B/s 00:00:01 100.0%
// This line is a little misleading; the file was packed, but nothing new was uploaded:
Uploaded foo (4)
Listing snapshots/
Listing snapshots/sandbox/
Listing chunks/
Backup for /tmp/duplicacy-sandbox/root at revision 2 completed
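// The file still counts as "new" because its timestamp changed: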
Files: 1 total, 4 bytes; 1 new, 4 bytes
// No new file chunks were uploaded but one new metadata chunk was:
File chunks: 1 total, 4 bytes; 0 new, 0 bytes, 0 bytes uploaded
Metadata chunks: 3 total, 331 bytes; 1 new, 260 bytes, 269 bytes uploaded
All chunks: 4 total, 335 bytes; 1 new, 260 bytes, 269 bytes uploaded
Total running time: 00:00:01
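Net result: revision 2 cost 269 bytes of upload versus revision 1's 372, even though the whole file was rewritten. If you want to confirm the deduplication on disk, you can count the chunk files in the sandbox's storage (assuming the layout implied by the "Listing chunks/" line above, with chunk files nested under storage/default/chunks/):

// 1 file chunk + 3 metadata chunks from revision 1 + 1 new metadata
// chunk from revision 2 should leave 5 chunk files:
$ find storage/default/chunks -type f | wc -l
5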