Guys,
On my Windows 11 desktop I have about 2 TB of data that I back up regularly. Usually Duplicacy, as expected, scans the files, uploads the changes, and the backup finishes in a very short time.
A couple of days ago, I had the idea of rearranging the data in a more logical way, so I moved a few directories around, maybe 700-800 GB of data. Since then, Duplicacy has been detecting all the files as changed, and every backup tries to upload ALL the files, so it takes forever even though there are basically no changes and, in the end, Duplicacy only saves a few MB of data.
I am using the Duplicacy GUI and, in the end, I decided to stop the “Duplicacy Web Edition” service, delete everything in the .duplicacy-web directory (storage, backup configs, cache, logs, literally everything), and restart from a blank slate. Same story.
This is an example from a test backup that I created:
2026-03-26 07:37:00.997 INFO BACKUP_STATS Files: 5075 total, 3,874M bytes; 5069 new, 3,874M bytes
2026-03-26 07:37:00.997 INFO BACKUP_STATS File chunks: 3 total, 3,905M bytes; 0 new, 0 bytes, 0 bytes uploaded
2026-03-26 07:37:00.997 INFO BACKUP_STATS Metadata chunks: 3 total, 808K bytes; 0 new, 0 bytes, 0 bytes uploaded
2026-03-26 07:37:00.997 INFO BACKUP_STATS All chunks: 6 total, 3,906M bytes; 0 new, 0 bytes, 0 bytes uploaded
2026-03-26 07:37:00.997 INFO BACKUP_STATS Total running time: 00:00:14
This is only a few thousand files, so the whole process takes just 14 seconds; however, you can clearly see that Duplicacy considers 5069 of the 5075 files new, even though I never touched them.
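My understanding (an assumption on my part, not Duplicacy's actual code) is that incremental backup tools typically decide "changed vs. unchanged" by comparing each file's path plus size and timestamp against the previous snapshot, while deduplicating the actual uploads by content hash. If that's the model, a moved file counts as "new" (its old path is gone, its new path is unknown) even though its chunks are already in storage, which would explain logs like the one above. A minimal sketch:

```python
import hashlib

# Hypothetical sketch (not Duplicacy's real implementation): a backup
# that flags files as "new" by path + metadata, but deduplicates
# uploads by content hash.

def snapshot(files):
    # files: {path: (size, mtime, content)} -> metadata-only view
    return {path: (size, mtime) for path, (size, mtime, _) in files.items()}

def plan_backup(prev_meta, files, stored_chunks):
    new_files, uploads = [], []
    for path, (size, mtime, content) in files.items():
        if prev_meta.get(path) != (size, mtime):
            new_files.append(path)           # path unknown or metadata changed
            digest = hashlib.sha256(content).hexdigest()
            if digest not in stored_chunks:  # dedup: skip content we already have
                uploads.append(digest)
    return new_files, uploads

# Before the move: one file, already backed up once.
before = {"docs/a.txt": (5, 100, b"hello")}
stored = {hashlib.sha256(b"hello").hexdigest()}
meta = snapshot(before)

# After the move: same content and metadata, different path.
after = {"archive/a.txt": (5, 100, b"hello")}
new_files, uploads = plan_backup(meta, after, stored)
print(new_files, uploads)  # -> ['archive/a.txt'] [] : "new" file, nothing uploaded
```

That would match the stats exactly: thousands of "new" files but 0 new chunks and 0 bytes uploaded.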
Any ideas? In this case, I didn’t specify any flags for the backup.
Thanks.
ETA: I searched the forum and found others who had similar issues, but no solution.
