I’ve been a personal user for a bit over a year now, and I’m very impressed.
At work we’re starting to talk about backing up a couple of key folders from every one of our users’ laptops/desktops (our servers, etc. are already covered in other ways) in case a ransomware attack manages to get through all of our other layers of security with a 0-day. There are obviously other scenarios such a backup would be helpful for, but ransomware recovery is the one that might get the budget approved…
I was wondering if anyone has experience using Duplicacy on 600-700 devices in a corporate setting? After reading the forums for the last few hours, it seems at least theoretically possible, but I couldn’t find any posts from anyone doing it, or even talking about it.
In my head, I’d use the CLI version and handle everything in the background so my users didn’t even know it was running. I’d use B2 storage (or maybe Azure, if I could get the permissions to work) with an application key that’s configured for writing files only, no read, so that no one who found the application key on the machine (my employees included) would be able to see the rest of the contents of my corporate backup. (From what I’ve read, I think this can be done.) Any command that requires read/delete access would be run from a server that IS controls and that I trust. If I could make this work in Azure, I’d do it from a VM in Azure to keep egress costs down.
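Roughly, the silent client-side runner I’m picturing would look something like this. This is just a sketch: the `DUPLICACY_B2_ID` / `DUPLICACY_B2_KEY` / `DUPLICACY_PASSWORD` environment variable names are from memory and should be checked against the CLI docs for whatever version gets deployed, and the write-only application key would still need whatever list capability `duplicacy backup` requires to check for existing chunks.

```python
import os
import subprocess

def build_backup_env(key_id: str, app_key: str, storage_password: str) -> dict:
    """Environment for a non-interactive Duplicacy run.

    These are (I believe) the variables Duplicacy checks so it never
    prompts; verify the exact names against the CLI docs.
    """
    env = dict(os.environ)
    env.update({
        "DUPLICACY_B2_ID": key_id,               # write-capable application key ID
        "DUPLICACY_B2_KEY": app_key,             # its secret
        "DUPLICACY_PASSWORD": storage_password,  # storage encryption password
    })
    return env

def backup_command(threads: int = 4) -> list:
    # Plain `duplicacy backup`; -stats prints a summary we could log centrally.
    return ["duplicacy", "backup", "-stats", "-threads", str(threads)]

def run_backup(repo_dir: str, env: dict) -> int:
    # Run from inside the initialized repository directory, fully unattended.
    result = subprocess.run(backup_command(), cwd=repo_dir, env=env,
                            capture_output=True, text=True)
    return result.returncode
```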
Ideally this would all be provisioned automatically by workflows in our ITSM tool (ServiceNow), and maybe we’d even build a workflow in ServiceNow that would let someone find and restore their own files if they lost their machine or lost a file they were working on. This might require wrapping a REST service around the CLI, but that shouldn’t be too hard.
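The REST wrapper could be pretty thin: basically translate a ServiceNow request into a `duplicacy restore` invocation on the trusted server. A minimal stdlib sketch of what I mean (the request fields are hypothetical, and a real version would obviously need authentication and input validation in front of it):

```python
import json
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

def restore_command(revision: int, patterns: list) -> list:
    """Translate a restore request into a Duplicacy CLI invocation.

    `duplicacy restore -r <revision> -- <pattern>...` restores only the
    files matching the given patterns from that snapshot revision.
    """
    cmd = ["duplicacy", "restore", "-r", str(revision)]
    if patterns:
        cmd.append("--")
        cmd.extend(patterns)
    return cmd

class RestoreHandler(BaseHTTPRequestHandler):
    # Hypothetical endpoint a ServiceNow workflow would call.
    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        cmd = restore_command(body["revision"], body.get("patterns", []))
        result = subprocess.run(cmd, cwd=body["repo_dir"],
                                capture_output=True, text=True)
        payload = json.dumps({"returncode": result.returncode,
                              "stderr": result.stderr[-2000:]}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

def serve(port: int = 8080):
    # Not started automatically; would run behind TLS + auth on the trusted server.
    HTTPServer(("127.0.0.1", port), RestoreHandler).serve_forever()
```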
I’d guess that on average I’d be looking at about 0.2 TB of data to back up per machine in the beginning, which might, with the pruning I have in mind, grow to about 0.5 TB of stored data for each user, or 350 TB total. With that in mind, I’d love it if someone could tell me they have storage set up somewhere that’s 2x or 3x that, so I’d have significant headroom for growth.
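The arithmetic behind those numbers, just to sanity-check the sizing:

```python
machines = 700       # upper end of the fleet estimate
initial_tb = 0.2     # data to back up per machine at the start
retained_tb = 0.5    # assumed steady-state per user after pruning

total_tb = machines * retained_tb              # 350.0 TB at steady state
headroom = {m: total_tb * m for m in (2, 3)}   # 2x/3x growth targets
```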
If a 700 TB bucket would be insane for Duplicacy, I could split the machines into groups of 10 or 25 or 50, etc., but I’d lose some of the cross-machine deduplication that I imagine would be fairly advantageous across the whole company.
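If sharding did turn out to be necessary, the bucket counts for each candidate group size are easy to work out:

```python
import math

machines = 700
# Number of buckets needed for each candidate group size.
buckets = {g: math.ceil(machines / g) for g in (10, 25, 50)}
# {10: 70, 25: 28, 50: 14}
```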
All of these numbers are subject to change as soon as I really start planning this, unless, of course, this absolutely won’t work.