Copying snapshots that are less than N days old

Here’s what I would like to implement:

  • Local storage and a cloud copy
  • Maintain snapshots in the local storage for, as an example, 180 days
  • Maintain snapshots in the cloud storage for 30 days (DR copy only, save $$)
  • Use a daily Copy job to update the cloud storage with newer snapshots and chunks
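
In Duplicacy terms, I imagine the two retention windows would be enforced with something like the prune/copy schedule below (a rough sketch; the cloud storage name "secondary" is just a placeholder):

# local storage: delete snapshots older than 180 days
duplicacy prune -keep 0:180
# cloud (DR) storage: delete snapshots older than 30 days
duplicacy prune -storage secondary -keep 0:30
# daily job: copy newer snapshots and chunks to the cloud storage
duplicacy copy -to secondary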

The problem appears to be that the Copy job will copy back snapshots that are older than the 30-day age criterion, i.e., ones already pruned from the cloud storage, correct?

Any creative ideas on how to handle that? Otherwise, my “suggestion box” entry would be to add an aging function to Copy that is similar to the Prune command.


I like this idea.

There was talk about Duplicacy adding support for something like -last <n> as an alternative to -r, but an age-related flag would also be useful.

In the meantime, I’m using a simple bash script (below) on a few Linux systems - just to copy the last revision (I don’t need all snapshots in this setup). On my Windows setups, I just copy all the snapshots.

#!/bin/bash
# Copy the latest revision of a repository to the secondary storage.
DUPLICACY_PATH='/home/duplicacy/duplicacy'   # directory containing the duplicacy binary

# Run from a dummy repository initialized against both storages.
cd /home/duplicacy/dummy || exit 1

cmd_copy() {
  REPO_ID="$1"
  # The last line of `duplicacy list` is the newest snapshot; field 4 is its revision number.
  REVISION="$("$DUPLICACY_PATH/duplicacy" list -id "$REPO_ID" | tail -1 | awk '{print $4}')"
  "$DUPLICACY_PATH/duplicacy" -v -log copy -to secondary -id "$REPO_ID" -r "$REVISION" -threads 4 \
    > "$DUPLICACY_PATH/copy-$REPO_ID.log"
}

cmd_copy "<repo-ID>"
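
If you ever need more than just the last revision, the same approach extends to an age filter. Below is a rough sketch, assuming GNU date and that `duplicacy list` prints lines of the form `Snapshot <id> revision <n> created at <date> <time>`; adjust the field positions if your output differs:

#!/bin/bash
# Sketch: copy only revisions created within the last N days.
DUPLICACY_PATH='/home/duplicacy/duplicacy'   # directory containing the duplicacy binary
REPO_ID='<repo-ID>'
MAX_AGE_DAYS=30

cd /home/duplicacy/dummy || exit 1
CUTOFF=$(date -d "$MAX_AGE_DAYS days ago" +%s)

"$DUPLICACY_PATH/duplicacy" list -id "$REPO_ID" | grep '^Snapshot ' |
while read -r _ _ _ rev _ _ day time; do
  # Skip lines whose date fields don't parse.
  CREATED=$(date -d "$day $time" +%s 2>/dev/null) || continue
  if [ "$CREATED" -ge "$CUTOFF" ]; then
    "$DUPLICACY_PATH/duplicacy" -v -log copy -to secondary -id "$REPO_ID" -r "$rev" -threads 4
  fi
done

Either version can then be scheduled as the daily copy job, e.g. from cron.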

Thanks. I’m still crawling/toddling with bash scripts but that looks like it should do the trick!