Insane memory/CPU usage with large repository

Here is a screenshot:

This makes me almost unable to perform basic tasks on my PC anymore.

I should also note that I gave Duplicacy the command to close after this happens, but it doesn't actually close; the task remains.

How much memory was it using? It is hard to tell from the screenshot.

And how many files are in the directory to be backed up?

It was using 12.3 GB of RAM in that screenshot.
It's backing up almost all the files on the PC except for temp/Windows files, so tens of thousands at least, but many of those are probably skipped since the initial backup has already completed.

This is a design flaw: the entire file list has to be loaded into memory to create the current snapshot and to compare it against the last snapshot, so memory usage grows with the number of files in the repository. I plan to fix this issue in the next major update.
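To illustrate the difference (a minimal sketch in Go, not Duplicacy's actual code; `FileEntry`, `iter`, and `compareStreaming` are hypothetical names): if both snapshots yield their entries in sorted path order, the comparison can be done as a streaming merge that holds only one entry per side in memory, rather than materializing the whole file list first.

```go
package main

import "fmt"

// FileEntry is a hypothetical stand-in for per-file snapshot metadata.
type FileEntry struct {
	Path string
	Hash string
}

// iter turns a path-sorted slice into a pull-style iterator, emulating a
// snapshot that is read incrementally instead of loaded whole into memory.
func iter(entries []FileEntry) func() (FileEntry, bool) {
	i := 0
	return func() (FileEntry, bool) {
		if i >= len(entries) {
			return FileEntry{}, false
		}
		e := entries[i]
		i++
		return e, true
	}
}

// compareStreaming merges two path-sorted streams and reports added, deleted,
// and modified files while holding O(1) entries in memory instead of O(n).
func compareStreaming(prev, curr func() (FileEntry, bool), report func(kind string, e FileEntry)) {
	p, pok := prev()
	c, cok := curr()
	for pok || cok {
		switch {
		case !cok || (pok && p.Path < c.Path):
			report("deleted", p)
			p, pok = prev()
		case !pok || c.Path < p.Path:
			report("added", c)
			c, cok = curr()
		default: // same path in both snapshots
			if p.Hash != c.Hash {
				report("modified", c)
			}
			p, pok = prev()
			c, cok = curr()
		}
	}
}

func main() {
	last := []FileEntry{{"a.txt", "h1"}, {"b.txt", "h2"}}
	now := []FileEntry{{"a.txt", "h1"}, {"b.txt", "h9"}, {"c.txt", "h3"}}
	compareStreaming(iter(last), iter(now), func(kind string, e FileEntry) {
		fmt.Println(kind, e.Path)
	})
	// Output:
	// modified b.txt
	// added c.txt
}
```

With the in-memory approach, peak RAM scales with the total file count; with a sorted merge like the above, it stays roughly constant regardless of repository size.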

I think I may be seeing this problem on a FreeNAS box. After a while, the machine hard-locks as a result of running Duplicacy. The symptoms resemble running out of memory, but the machine ends up in such a bad state that I can't do much to check RAM usage.

Is there a workaround?

One workaround is to split the repository into multiple smaller repositories. This should definitely reduce memory usage.
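For example (a hypothetical sketch; the bucket name, snapshot IDs, and paths are placeholders): initialize each smaller repository with its own snapshot ID against the same storage, then back them up one at a time, so peak memory scales with the largest sub-repository rather than the whole machine.

```shell
# Split one huge repository into two smaller ones, each with its own
# snapshot ID, both pointing at the same storage.
cd /path/to/documents
duplicacy init documents b2://my-bucket

cd /path/to/photos
duplicacy init photos b2://my-bucket

# Back up each repository separately.
(cd /path/to/documents && duplicacy backup -stats)
(cd /path/to/photos && duplicacy backup -stats)
```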

They can then all share the same B2 bucket, eh?

Do you have an ETA on the above-mentioned next major release?