Memory Usage

Just FYI

  • I tried option #1 on my Windows machine with 16GB RAM, but still ran into the same issue.
  • Then I tried it on a MacBook Pro, also with 16GB RAM, and it ran smoothly. In both cases I used the CLI.

In both cases I closed all other apps, but there were surely services running in the background, resulting in different amounts of available RAM.

Has anyone observed significant memory usage differences between different OS platforms?


Linux/Unix seems to be more conservative with memory.

Hello, I just ran into this issue as well (7-9TB, 2+ million files, 4GB RAM, Debian). I am using the web UI. Where do I enter DUPLICACY_ATTRIBUTE_THRESHOLD? I tried adding it to the global and backup options, but this disables the backup process…

I am really happy with Duplicacy (it works perfectly on an 8GB RAM Debian machine)! Thank you very much!

There is no easy way to set the environment variable for the CLI from the web GUI. Your best option might be to divide the big backup job into several smaller ones by using a different set of filters for each job.
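For anyone running the standalone CLI from a shell (rather than the web GUI), the variable can be set like any other environment variable. A minimal sketch, where the repository path and the threshold value are placeholders:

```sh
# Set the attribute threshold for this shell session (value is illustrative)
# and run the backup from the repository directory.
cd /path/to/repository
export DUPLICACY_ATTRIBUTE_THRESHOLD=1
duplicacy backup -stats
```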


OK, thank you for your answer. So I have to set up 10 different backups for the subfolders to split it up? Does this reduce deduplication?

Is there any plan/timeline for fixing this? This problem has existed for over 4 years now. :neutral_face:

Thank you very much!

I’m planning a big rewrite of the backup engine and hope to get it done in 2 months.


Awesome! If you need testers feel free to send me a message! Thank you very much!

I too am having memory issues while backing up a large number of files, currently on Web Edition 1.5.0. I understand from a post a few years ago that you're working on changing the architecture:

There is no need to load the entire file list into memory at once. My plan is to construct the file list on the fly and upload file list chunks as soon as they have been generated.

Has the rewrite been released yet?

Update: this affects restore operations as well. If a backup took 64GB of RAM to run, it also seems to take 64GB of RAM to load the file list during a restore. Is this something that's still being actively worked on, or is the change that avoids loading the entire file list into memory already live?

I’ve finished all the code changes and am now working on the tests. Sorry for postponing the release multiple times. This summer has been really slow for me, but with the kids going back to school very soon, September will look much better. There should be enough time to finish testing and release the new version by the end of September.


In case I can add another use case: I’m trying to run Duplicacy as a Docker container on a Synology that unfortunately has only 1GB of RAM, and I haven’t been able to get past the indexing phase of a backup because it runs out of memory (I think; I can only see “signal: killed” in the logs).
So not keeping the list of files in memory would be a great improvement for me!
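If it helps to confirm the cause, one way to check whether the kernel’s OOM killer terminated the process (assuming you have shell access to the host) is to look at the kernel log; a rough sketch:

```sh
# OOM-killer events are logged by the kernel; the name and PID of the
# killed process appear in these messages.
dmesg | grep -iE "out of memory|oom"
```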

Looking forward to the new update and thanks for your effort!

That’s a bad use case though. 1GB is barely enough to run NAS services alone, much less other apps. Using up that RAM with a non-essential service like backup forces disk cache eviction, which completely murders any hope of getting any sort of responsiveness from the storage subsystem. I would strongly suggest getting another compute appliance to run applications, and leaving the RAM on the NAS for disk cache. There are many other reasons why it’s a bad idea to run third-party apps on a storage appliance without ECC RAM (row hammer comes to mind).

If you do, at least get rid of Docker. It’s completely unnecessary, and yet it eats up RAM for itself and for the whole, albeit small, usermode environment. Duplicacy can run natively on all but two Synology DiskStations (those exceptions are two PowerPC models). In other words, anything you can do to increase the amount of unused RAM will do wonders for filesystem performance.
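In case it’s useful, a rough sketch of running the CLI natively over SSH instead of through Docker; the release version, download URL, and architecture below are only examples, so adjust them for your model:

```sh
# Download a prebuilt CLI release for x86_64, make it executable, and check
# that it runs (version and URL are illustrative).
wget https://github.com/gilbertchen/duplicacy/releases/download/v2.7.2/duplicacy_linux_x64_2.7.2
chmod +x duplicacy_linux_x64_2.7.2
./duplicacy_linux_x64_2.7.2 -h
```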


Any idea when that will be available? I’m running into the memory issue on multiple systems. Is there a prebuilt beta available I could use instead?

The PR has been submitted: Memory usage optimization

If you need a binary I can build one for you.

That would be helpful, as one of my laptops has a tendency to eat all memory every couple of hours now :slight_smile:

Which OS are you running?

Linux x86_64 - various flavours of Ubuntu and Mint.

Hey @gchen, any idea when you’ll have a little time for that build? :slight_smile:

If you need a more specific flavour, my main workhorse is Mint Una ≃ Ubuntu Focal.

Sorry, I forgot about that. Here is the binary: https://acrosync.com/duplicacy-web/duplicacy_linux_x64_2.7.3

That PR has been merged and you can now build your own from the master branch if you want to.
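If anyone wants to build it themselves, here is a minimal sketch assuming a working Go toolchain; the package path shown reflects the repository layout at the time of writing and may differ:

```sh
# Clone the source and build the CLI with the Go toolchain (the main package
# lives under the duplicacy/ subdirectory in the current repository layout).
git clone https://github.com/gilbertchen/duplicacy.git
cd duplicacy
go build -o duplicacy_cli ./duplicacy
./duplicacy_cli -h
```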

Installed and running - and upgraded to web 1.6 as well :slight_smile:

Thank you for the build!

Thanks! Testing it out on my Synology NAS. I created a Docker image out of it: Docker Hub