Out of Memory on Backup, Revisited

I am struggling to back up my new Synology NAS with Duplicacy because of memory issues. I’ve read some of the threads here about this, including https://forum.duplicacy.com/t/duplicacy-check-out-of-memory/1041 and https://forum.duplicacy.com/t/memory-usage/623.

I have 8 GB of memory, and duplicacy backup gets killed by the kernel when its usage exceeds that amount, shortly after indexing files and early in the actual backup process.

There are about 13M files in the backup.

The only suggestions I could see that might make a difference are:

  1. Wait until the Duplicacy engine is updated so it no longer stores all file names in memory (timeline unknown; this was discussed in 2017)
  2. Split up my backups

I was looking at splitting up the backups, but this will be messy, since most of the files (> 10M) are in a single top-level folder “Archive”. To split the backup up, I’d have to dig in and divide things at the second level of the hierarchy, and that’s hard to maintain.

I am looking for other solutions. If you have ideas, please advise.


Did you try setting the environment variable DUPLICACY_ATTRIBUTE_THRESHOLD to 1 before running the backup command? This instructs Duplicacy not to load extended attributes into memory when constructing the file list, so it may help with memory usage.
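
On the NAS shell that would look something like this (the repository path here is just an example):

    # assume the repository root is the directory being backed up (example path)
    cd /volume1/Archive
    export DUPLICACY_ATTRIBUTE_THRESHOLD=1
    duplicacy backup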

I’m testing this now to see what difference it makes.

Unfortunately this still does not work - it ran out of memory and got killed. It seemed to reduce memory usage by a gigabyte or two, but that is not enough in this case; I think many of the files have no xattrs, so there was not much to save there.
I have found a subdirectory with ~6M files that I can exclude on the first run, which makes it work, but it is messy to have to do it this way.
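For reference, the exclusion is just a line in .duplicacy/filters along these lines (the subdirectory name below is made up):

    # exclude the ~6M-file subtree on the first run (placeholder name)
    -Archive/huge-subdir/*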
I hope you will consider fixing this. 12M files is not that many for a 40TB RAID.
Thanks

Hi @gchen or others - any further thoughts on this?

You can set up two repositories elsewhere and use the -repository option to point each one at the actual directory. Then you can use different filter patterns for each repository.
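
Roughly like this; the paths, snapshot IDs, and storage URL below are placeholders, and each repository’s .duplicacy/filters would cover a different part of the tree:

    # first repository, kept outside the data directory
    mkdir -p /volume1/duplicacy/archive-1
    cd /volume1/duplicacy/archive-1
    duplicacy init -repository /volume1/Archive archive-1 <storage-url>
    # edit .duplicacy/filters here to include only the first half of Archive
    duplicacy backup

    # second repository points at the same actual directory with different filters
    mkdir -p /volume1/duplicacy/archive-2
    cd /volume1/duplicacy/archive-2
    duplicacy init -repository /volume1/Archive archive-2 <storage-url>
    # edit .duplicacy/filters here to include only the other half
    duplicacy backup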

I’ll get to the memory optimization in about 2 months. This is perhaps the last major piece that Duplicacy needs.
