Restore single directory to clean state

Hi! I’ve been using Duplicacy for some time now, four years actually. It’s been a great tool with marvelous backup speed. However, I never had a true disaster-recovery use case until yesterday, and I wasn’t able to solve it to my liking. So I’m wondering whether there is a workflow that handles my use case elegantly.

My situation can be described as:

  • My duplicacy repository is my home directory
  • Inside my home dir, there is a folder called photos. It holds about 100GB of media files.
  • I was about to start a bulk renaming job that changes the naming schema of all .jpg files in this directory.

My order of action was this:

  1. I realized there was a lot of room for mistakes ahead, so I created a Duplicacy revision before embarking on my journey (revision 100).
  2. I then started the renaming job, but due to an unhandled edge case, about 20% of the photos received a bogus file name.
  3. I only noticed the error after creating a new revision, 101.
  4. So now I’d like to restore the photos directory to the clean revision 100 via `duplicacy restore -r 100 -delete -overwrite "photos/*"`.
  5. According to the documentation, the restore command ignores the `-delete` option when a filter pattern is given. That means Duplicacy simply restored the clean state on top of the faulty state, and 20% of my photos are now duplicated under different names.

What I was expecting is a workflow similar to `git reset --hard`: that my photos directory gets restored to the clean state of revision 100. How could I achieve this? Of course I could have wiped the photos directory completely beforehand, but then I’d have to download a whopping 100GB when restoring revision 100.

Any guidance appreciated!

I think it’s a bug.

Duplicacy should still honor the `-delete` flag within the scope of the filter: if a file on disk matches the filter, does not exist in the revision, and `-delete` is specified, it should be deleted.
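A sketch of that proposed rule (hypothetical shell, not Duplicacy’s actual code; `in_revision` stands in for a lookup against the revision’s file list):

```shell
# Proposed semantics: with -delete, a file on disk is removed when it
# matches the restore filter but does not exist in the target revision.
in_revision() {
  # Stub: in reality this would query revision 100's file list.
  case "$1" in
    photos/good.jpg) return 0 ;;  # present in the revision, keep it
    *) return 1 ;;                # e.g. a bogus rename, not in the revision
  esac
}

cleanup_extraneous() {
  for f in "$@"; do
    if ! in_revision "$f"; then
      echo "would delete: $f"
    fi
  done
}

cleanup_extraneous photos/good.jpg photos/bogus_0001.jpg
# prints: would delete: photos/bogus_0001.jpg
```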

Additionally, Duplicacy could limit all operations to the current directory: look up the repository configuration and credentials up the tree, but only change files in the current directory and below, as a convenient shortcut for the feature above.

To work around this today without redownloading gigabytes of data, and assuming photos make up the bulk of it, I would rclone everything except photos to a location outside the user home, run the restore, and then move that data back.
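Roughly like this (paths are examples, and you should double-check the flags against your rclone and duplicacy versions before running anything destructive):

```shell
# 1. Stash everything except photos (and the .duplicacy config) outside home.
rclone move /home/me /mnt/stash --exclude "photos/**" --exclude ".duplicacy/**"

# 2. Restore the whole clean revision without a filter pattern, so -delete
#    takes effect and removes the bogus-named files; files that already
#    match revision 100 are skipped rather than redownloaded.
cd /home/me && duplicacy restore -r 100 -overwrite -delete

# 3. Move the stashed data back over what the restore re-fetched.
rclone move /mnt/stash /home/me
```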

This may still result in a significant amount of data being downloaded, depending on how much was renamed. For that reason I would use local restore facilities instead. If you are on a Mac, do you have Time Machine with local snapshots enabled, or, on Windows, its similar albeit less usable volume-snapshotting counterpart? It would be much easier to restore from local storage than to fetch data over the internet; the remote backup is best treated as a disaster recovery plan, unless you have gigabit or faster internet.
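On macOS, for instance, you could check for a local APFS snapshot and copy the photos back from it (the snapshot name and paths below are examples; exact `mount_apfs` arguments vary by macOS version):

```shell
# List the local Time Machine snapshots of the root volume
tmutil listlocalsnapshots /

# Mount one read-only and copy the old photos back
mkdir -p /tmp/snap
mount_apfs -o ro -s com.apple.TimeMachine.2024-05-01-120000.local / /tmp/snap
cp -a /tmp/snap/Users/me/photos/. ~/photos/
umount /tmp/snap
```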
