Duplicacy is causing HUGE server load when it backs up, and it is causing my server to crash. Is there a way to limit Duplicacy's resource usage? Thx
What kind of a server is this?
Huge load on what (e.g., disk I/O, CPU, network, memory)? Can you quantify what “HUGE” means?
What does this mean? Your server powers off? Exits with an error? If so, what kind of error?
Yes. I think the current main ones are:

- `-threads 1` should limit duplicacy to a single thread
- `-limit-rate <kB/s>` should limit disk I/O, CPU, and network (see the sketch below)
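For what it's worth, here's a minimal sketch of both flags on one backup run; the repository path and rate value are placeholders to tune for your setup:

```sh
# Run from the initialized repository directory (path is a placeholder).
cd /path/to/repository

# One worker thread, upload capped at 1024 kB/s (both values illustrative).
duplicacy backup -threads 1 -limit-rate 1024
```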
CPU load; the server exits forcibly with a bunch of complicated stack traces about memory.
What kind of hardware and OS is the server using? Like a Raspberry Pi or something with a single CPU core? I don’t think duplicacy uses many threads by default, but I suppose it could max out one or two threads with default settings under certain conditions.
If it’s just CPU load (and not memory), using `-limit-rate` with a small enough value (maybe 100 kB/s to start as a sanity check) might be enough to throttle CPU usage, since capping the upload rate should also pace the hashing and compression work.
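Something like this as a first test (100 kB/s is deliberately low, just to see whether load follows throughput):

```sh
# Hedged sanity check: if CPU load drops sharply at this rate,
# the load is probably tied to backup throughput rather than a bug.
duplicacy backup -threads 1 -limit-rate 100
```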
If it’s related to memory usage, I’m surprised the OS isn’t just killing the duplicacy process. Actually crashing the server sounds kind of like a hardware issue. If memory usage is an issue on your server for the number of files you’ve selected to back up, this comment gives me the impression that some memory optimizations are in the near future.
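If you want to rule the OOM killer in or out, here's a quick check (a sketch assuming a Linux server; the `journalctl` variant only applies on systemd-based systems):

```sh
# Look for out-of-memory kills in the kernel log.
dmesg | grep -iE "out of memory|oom"

# Equivalent on systemd-based systems:
journalctl -k | grep -i oom
```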