To restore a tiny file, Duplicacy downloads a huge amount of data

Hello, I just restored a 5 kB file using the Duplicacy CLI. It took several minutes, and Duplicacy downloaded over 500 MB to its local cache in the process.

The slow speed of restore has been discussed in several previous forum topics (Restoring is sloooow, Restore (list files) takes too long and others). However, I don’t see any explicit explanation for the huge amount of data needed to restore a single small file.

In one comment, @gchen said:

The bottleneck is in populating the file list, but fixing it is much harder than I thought. First, we should change the order files and folders are listed in the snapshot file, so that we don’t need to download the entire snapshot file to get the top-level files and folders.

Is this the reason for the huge download? Could you please explain as clearly as possible why 10,000x the file size might need to be downloaded to perform a restore?

Thanks for the great software!


To restore a single file, Duplicacy needs to download the entire snapshot, which consists mostly of metadata for all files in the backup. 500 MB of metadata isn't unusual if you have hundreds of thousands of files. This metadata is saved in the local cache, so the next time you restore another 5 kB file it will be much faster and won't need to download the 500 MB again.
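
To make the control flow concrete, here is a minimal sketch in Go. It is not Duplicacy's actual code or chunk format; the types and function names (FileEntry, downloadChunk, decodeFileList) are hypothetical. It only illustrates the point above: the snapshot's file list is itself stored as a sequence of metadata chunks, so all of them have to be fetched and decoded before the entry for the one target file can even be located.

```go
package main

import "fmt"

// FileEntry is a hypothetical record in the snapshot's file list.
type FileEntry struct {
	Path        string
	Size        int64
	ChunkHashes []string // hashes of the data chunks that make up this file
}

// downloadChunk stands in for a network fetch from the backup storage.
func downloadChunk(hash string) []byte {
	fmt.Println("downloading chunk", hash)
	return nil // placeholder
}

// decodeFileList stands in for deserializing the snapshot's file list.
func decodeFileList(encoded []byte) []FileEntry {
	return nil // placeholder
}

// restoreOneFile shows why the download is large: step 1 fetches every
// metadata chunk (the ~500 MB), regardless of how small the target is;
// only step 2 fetches the target file's own (few) data chunks.
func restoreOneFile(metadataChunkHashes []string, target string) {
	var encoded []byte
	for _, hash := range metadataChunkHashes {
		// Step 1: fetch ALL metadata chunks to reconstruct the file list.
		encoded = append(encoded, downloadChunk(hash)...)
	}

	entries := decodeFileList(encoded)

	// Step 2: locate the target and fetch only its data chunks.
	for _, entry := range entries {
		if entry.Path == target {
			for _, hash := range entry.ChunkHashes {
				downloadChunk(hash)
			}
			return
		}
	}
}

func main() {
	restoreOneFile([]string{"meta-1", "meta-2", "meta-3"}, "docs/note.txt")
}
```

The local cache helps precisely because step 1 is the expensive part: once the metadata chunks are cached, later restores from the same snapshot skip straight to step 2.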
