Use Duplicacy in combination with Rclone?



Some background on my previous setup:

  1. All home computers back up / one-way sync to my Synology NAS (~3 TB, a mix of small and large files)
  2. The NAS also stores my static media files (~18 TB, mostly large files)
  3. A headless Windows Server backs up and encrypts all files stored on the NAS (via a network drive) to CrashPlan

The issue with this setup is CrashPlan: it just doesn’t want to work half the time, and the support sucks. CrashPlan used to be good, but ever since they moved to the Small Business model, the software just won’t work without constant re-installs and re-configuring.

Future plan #1 - All Duplicacy:

  1. Same as steps 1–3 from the first setup
  2. Use Duplicacy to back up and encrypt all files to Google Drive (I already have unlimited storage there)

This seems like the simplest option, but I am worried about being unable to recover single files from a large backup. If I did it this way I could create multiple backup sets, but it would still take a while since each set would range from 2–10 TB.
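For what plan #1 would involve, here's a minimal sketch using the Duplicacy CLI (the repository path, snapshot ID, and Google Drive folder name are all made-up examples, not a recommendation):

```shell
# Initialize the NAS folder as a Duplicacy repository with encryption (-e),
# storing chunks in a (hypothetical) folder on Google Drive.
cd /volume1/backups
duplicacy init -e mybackups gcd://duplicacy-storage

# Then run (and later schedule) the actual backup:
duplicacy backup -stats
```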

Future plan #2 - Duplicacy for computer backups, Rclone for media backups:

  1. Same as steps 1–3 from the first setup
  2. Use Duplicacy to back up and encrypt all files from step 1 of the first setup (a mixture of files, ~3 TB, multiple sources)
  3. Use Rclone to back up all files from step 2 of the first setup (relatively static media files, ~18 TB)

This sounds like the ideal option, but it is a little more complex to set up. If I use Rclone for the large backup set, I can recover a single file individually if necessary. In addition, I don’t need multiple versions or de-duplication for my media files.
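For the media half of plan #2, a hedged sketch of what the rclone side could look like (the `gdrive-crypt:` remote is an assumed crypt remote you would first create with `rclone config`; paths are examples):

```shell
# One-way sync of the media share to an encrypted Google Drive remote.
rclone sync /volume1/media gdrive-crypt:media --transfers 8 --progress

# Recovering a single file later doesn't touch the rest of the 18 TB:
rclone copy gdrive-crypt:media/movies/example.mkv /volume1/restore/
```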

I want to go with future plan #1 but I have a feeling I’ll need to do #2. I was just curious what other people do for large media file backups.


Go with #2.

I am, and was, in the exact same situation as you. Ex-CrashPain user, backing up local files and a network share (mklink trick), both regular backup files and media. Google Drive unlimited too, although that wasn’t until after I left PlaneCrash…

Duplicacy can back up your media, but the extra overhead of chopping files in the tens-of-GB range into small chunks, even if you increase the default chunk size, is kinda pointless. You’ll get no de-duplication for the extra hassle at recovery time. You can restore individual files from Duplicacy backups just fine, but listing those files is awkward, and mounting them for easy access is… not yet possible.
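To be fair, a single-file restore is doable from the CLI, just not as convenient as a mount; something along these lines (the revision number and file path are examples):

```shell
# Find the file in a snapshot, then restore only that one file.
duplicacy list -r 42 -files | grep example.mkv
duplicacy restore -r 42 -- 'movies/example.mkv'
```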

rclone is perfect for the job of handling media to cloud. Plus you can mount an encrypted repository.
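Such a mount is roughly this (the remote name and mount point are assumptions; the crypt remote decrypts files on the fly as you read them):

```shell
rclone mount gdrive-crypt:media /mnt/media --read-only --vfs-cache-mode minimal
```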

Duplicacy is perfect for the job of handling versioned backups to the cloud.

Plus they work wonderfully together (independently in fact). :slight_smile: Seriously, you want to go with plan #1 but after using Duplicacy for a bit you’ll eventually want #2.


Just to add some details about my setup…

  1. Local PC backups with Duplicacy are done to my Windows Server (NAS).
  2. Windows Server shares are backed up directly from the server with Duplicacy to GD.
  3. Local Duplicacy storage on server is copied to the same Duplicacy storage on GD.
  4. rclone copies media to a different location on GD, encrypted on-the-fly.

This provides localised and off-site backups.
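Step 3 relies on Duplicacy’s copy-compatible storages; roughly like this (storage names, snapshot ID, and the Google Drive path are examples for illustration):

```shell
# Add Google Drive as a second storage, copy-compatible with the local one,
# then replicate the existing local snapshots to it.
duplicacy add -copy default gcd mybackups gcd://duplicacy-storage
duplicacy copy -from default -to gcd
```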

Plus I have SnapRAID protecting against single drive failure on the server. And StableBit DrivePool to present it all as one drive.



I have exactly the same use case. For my folders that require versions I use Duplicacy. For my media library (music and movies) I use Rclone (with the --backup-dir option).

I agree with all the advantages and characteristics mentioned by @Droolio


How do you handle simple version history with rclone? Just the --backup-dir option, in case a media file accidentally gets deleted?


Exactly, I use --backup-dir.
I run Duplicacy on a daily schedule, but Rclone I run manually, right after adding a batch of music or movies.
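For example, something like this (remote name and paths are made up): anything deleted or overwritten in the destination gets moved into the dated archive folder instead of being discarded, so an accidental deletion stays recoverable:

```shell
rclone sync /volume1/media gdrive-crypt:media \
  --backup-dir gdrive-crypt:media-archive/$(date +%Y-%m-%d)
```

Note that the --backup-dir path has to be on the same remote as the destination, but outside it.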