Hello, I just restored a 5 kB file using the CLI. It took several minutes, and Duplicacy downloaded over 500 MB to its local cache in the process.
The slow speed of restore has been discussed in several previous forum topics ("Restoring is sloooow", "Restore (list files) takes too long", and others). However, I don't see any explicit explanation for the huge amount of data needed to restore a single small file.
In one comment, @gchen said:
> The bottleneck is in populating the file list, but fixing it is much harder than I thought. First, we should change the order files and folders are listed in the snapshot file, so that we don't need to download the entire snapshot file to get the top-level files and folders.
Is this the reason for the huge download? Could you please explain as clearly as possible why 100,000x the file size might need to be downloaded to perform a restore?
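For what it's worth, here is my rough mental model of the restore path, based on that comment. This is only a sketch of my understanding, not Duplicacy's actual code: the names (`chunk`, `restoreOneFile`) and the chunk sizes are my own inventions, and the 500 MB / 5 kB figures are just the numbers from my case. Please correct me if this is wrong:

```go
package main

import "fmt"

// Hypothetical types illustrating my understanding of the restore path.
// None of these names come from the Duplicacy source; they are assumptions.

type chunk struct {
	id   string
	size int // bytes
}

// restoreOneFile sketches why restoring a single file can download far more
// than the file itself: the snapshot's file list is itself stored as a
// sequence of chunks, and (per the comment above) the whole list must be
// fetched before any individual entry can be located.
func restoreOneFile(fileListChunks []chunk, target string, fileSize int) {
	downloaded := 0
	// Step 1: download *every* chunk of the file list, because the entry
	// for `target` could be anywhere in it. These chunks also land in the
	// local cache, which would explain the 500 MB I observed.
	for _, c := range fileListChunks {
		downloaded += c.size
	}
	// Step 2: only now can the target file's data chunks be fetched.
	downloaded += fileSize
	fmt.Printf("restored %d B, downloaded %d B (%.0fx)\n",
		fileSize, downloaded, float64(downloaded)/float64(fileSize))
}

func main() {
	// Rough numbers matching my case: a ~500 MB file list in 4 MB chunks.
	var fileList []chunk
	for i := 0; i < 125; i++ {
		fileList = append(fileList, chunk{id: fmt.Sprintf("c%d", i), size: 4 << 20})
	}
	restoreOneFile(fileList, "small.txt", 5<<10) // a 5 kB file
}
```

If that picture is right, the download cost of restoring one file is dominated by the size of the snapshot's file list, not by the file itself, which would account for the roughly 100,000x blow-up.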
Thanks for the great software!