Trouble getting initial backup to complete

Hi,
Trying to back up folders from a Windows 2008 R2 server - there are 757GB in 283k files.

I’ve been trying to get the initial backup to complete for almost 2 weeks - Duplicacy runs for days and then eventually fails with an error of some kind; the most recent one was an “out of memory” error.

What’s more worrying is that although there is over 400GB now stored in my Backblaze bucket, when Duplicacy starts it says there is no existing backup. Does this mean it’s starting from scratch each time - and should I therefore empty the Backblaze bucket?

Thanks,
David

When the initial backup doesn’t finish, all chunks that have been uploaded will remain in the storage, even though there is no snapshot to reference them yet. On subsequent attempts, Duplicacy will scan all files again but skip chunks that have already been uploaded, so nothing is re-uploaded and there is no need to empty the bucket. The rescan isn’t super fast due to the overhead of the variable-size chunking algorithm, so it will take a while to get back to the ‘resume’ point.
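Conceptually the resume works something like the sketch below (a rough Go illustration of the idea only, not the actual Duplicacy code; the names are made up): each chunk gets a content-derived ID, and on a retry any chunk whose ID already exists in the storage is skipped, while new chunks are uploaded. The files still have to be read and re-chunked to recompute those IDs, which is where the time goes.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// knownChunks stands in for the set of chunk IDs already present in the
// storage (e.g. listed from the bucket). Hypothetical name, for illustration.
var knownChunks = map[string]bool{}

// chunkID derives a content-based ID, so the same data always maps to the
// same ID regardless of which backup attempt produced it.
func chunkID(data []byte) string {
	sum := sha256.Sum256(data)
	return hex.EncodeToString(sum[:])
}

// uploadIfMissing skips chunks whose IDs are already in storage; only new
// chunks cost upload bandwidth on a retry.
func uploadIfMissing(data []byte) {
	id := chunkID(data)
	if knownChunks[id] {
		fmt.Println("skip existing chunk", id[:8])
		return
	}
	// ... upload the chunk to the storage here ...
	knownChunks[id] = true
	fmt.Println("uploaded new chunk", id[:8])
}

func main() {
	uploadIfMissing([]byte("first chunk of data"))
	uploadIfMissing([]byte("first chunk of data")) // second attempt: skipped
}
```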

This week I’ll be working on a new feature that bypasses this re-scan phase. The idea is to save the list of files that have already been scanned to a local file, which will be loaded on retries. If you want to keep trying before this feature is available, you can exclude some subdirectories to reduce the size of the initial backup (see the example below) and then remove the excludes once the initial backup is done.
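For example, with the CLI you can list exclude patterns in the .duplicacy/filters file under the repository: a leading ‘-’ excludes a path, and a pattern ending with ‘/’ matches a directory, so nothing under it gets scanned (the GUI’s include/exclude setting should accept the same patterns). The folder names below are only placeholders - substitute whichever subdirectories are largest on your server:

```
-Archive/
-OldProjects/
```

Once the initial backup completes, delete those lines and the next backup will pick up the excluded folders incrementally.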

Hi,
Ok, great idea… I’ll give that a go, although at the moment it’s attempting the full backup again.

Will keep an eye out for the new version once it’s ready.

Thanks,
David