I am struggling to back up my new Synology NAS using duplicacy because of memory issues. I’ve read some of the threads here about this, including https://forum.duplicacy.com/t/duplicacy-check-out-of-memory/1041 and https://forum.duplicacy.com/t/memory-usage/623.
I have 8 GB of memory, and duplicacy backup gets killed by the kernel for exceeding that shortly after it finishes indexing files, early in the actual backup process.
There are about 13M files in the backup.
The only suggestions I could see that might make a difference are:
- Wait until the duplicacy engine is updated so it no longer holds all file names in memory (timing unknown; this was discussed back in 2017)
- Split the backup into several smaller ones
I looked at splitting up the backups, but it will be a mess: most of the files (>10M) live in a single top-level folder, “Archive”. To split the backup, I’d have to dig into the second level of the hierarchy and partition things there, and that’s just a mess to set up and maintain.
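For concreteness, this is roughly the kind of per-folder split I’d be facing — a sketch only, where the paths, snapshot IDs, and storage URL are placeholders and not my real setup:

```shell
#!/bin/sh
# Sketch: print the per-folder commands for splitting a huge "Archive" tree
# into one Duplicacy repository per second-level folder, so each backup only
# has to index a fraction of the 13M files.
# "/volume1/Archive/..." and "b2://my-bucket" are hypothetical placeholders.
for dir in /volume1/Archive/Projects /volume1/Archive/Photos; do
    id="archive-$(basename "$dir")"                  # one snapshot ID per folder
    echo "cd $dir && duplicacy init $id b2://my-bucket"
    echo "cd $dir && duplicacy backup -stats"
done
```

With dozens of second-level folders, that means dozens of repositories to initialize, schedule, and prune separately, which is why I’d rather avoid this route.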
I am looking for other solutions. If you have ideas, please advise.