High CPU usage until backup gets killed

Hey guys,

So far I’m loving Duplicacy in the web edition. I’ve got a couple of backups set up. Some are working fine, but with two of the backup sets, every time I start them the CPU usage of my NAS gets super high after a while, basically crashing almost all active Docker containers. After some time the backups get killed automatically. Anyone have an idea what’s going on?

I’ve tried to limit the CPU usage via Docker, which doesn’t seem to really work. Are there any other options to reduce Duplicacy’s CPU usage?
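For context, the kind of Docker limit I mean looks roughly like this (the container name `duplicacy` is just a placeholder; adjust to your setup):

```sh
# Hard-cap the running container at 1.5 CPU cores:
docker update --cpus="1.5" duplicacy

# Or pin it to specific cores (here cores 0 and 1):
docker update --cpuset-cpus="0,1" duplicacy
```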

Best regards.
Tom

Does this happen in the file listing phase (when the progress bar shows indexing) or in the chunk uploading phase (when the progress bar shows upload speeds, remaining time, etc.)?

In the second phase…

How many uploading threads do you use? By default, Duplicacy uses one thread to read and split files and another thread to encrypt and upload chunks, and this should not be CPU-intensive.
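If you want to experiment, the CLI’s backup command accepts a `-threads` option, and the same options can be added to a backup in the web edition; a sketch (the storage and repository are assumed to already be set up):

```sh
# Run a backup with a single upload thread and cap the upload
# rate at 10240 KB/s; both tend to reduce CPU pressure.
duplicacy backup -threads 1 -limit-rate 10240 -stats
```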

It’s set to the default. Is there an option in Duplicacy for more detailed backup logs? The log just says it was killed…

Sadly a process doesn’t get notified before being killed, so there isn’t a way to print more detailed logs. Maybe you can find something in the system log?
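If it’s the kernel’s OOM killer doing the killing, that usually leaves a trace in the kernel log; a quick way to check (assuming a Linux-based NAS with shell access):

```sh
# Look for OOM-killer activity in the kernel ring buffer:
dmesg | grep -i -E "out of memory|oom"

# Or, on systems running systemd, search the kernel journal:
journalctl -k | grep -i oom
```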

How does memory utilization look right before a process is killed?

How much RAM (plus swap) do you have there, and what is the dataset size and number of files selected for backup?
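To gather those numbers, something like this works on most Linux NAS boxes (`/path/to/repo` is a placeholder for your backup source):

```sh
# Total and used RAM plus swap:
free -h

# Size of the dataset and number of files selected for backup:
du -sh /path/to/repo
find /path/to/repo -type f | wc -l
```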

It definitely crashes a short time after uploading starts. I can’t see the reason, but all CPU cores suddenly hit super high usage. I tried disabling a few other Docker containers during the backup for testing… which did not help. While checking system usage with htop, all processors are firing at 100% during the uploading phase. Weird stuff. I also tried without the Docker resource limits. Still the same…

Interestingly, during scanning, which is supposed to be the more CPU-heavy part, usage stays below 50% and everything runs fine.