Pre Command and Post Command Scripts

You can instruct Duplicacy to run a script before or after executing a command.

For example, if you create a bash script with the name pre-prune under the .duplicacy/scripts directory, this bash script will be run before the prune command starts. A script named post-prune will be run after the prune command finishes.

This rule applies to all commands except init.

On Windows these scripts should have the .bat extension, while on Linux they should have no extension.
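For example, a minimal pre-prune script on Linux could be just this (the log path is only an illustration; a non-zero exit status from the script should abort the command):

#!/bin/sh
# .duplicacy/scripts/pre-prune -- runs automatically before the prune command starts
echo "$(date): prune is about to run" >> "$HOME/duplicacy-prune.log"
exit 0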

Beware: with the web GUI edition, backup scripts need to be in ~/.duplicacy-web/repositories/localhost/n/.duplicacy/scripts, where n is the number of the backup. All others (pre-prune, post-copy, etc.) need to be in ~/.duplicacy-web/repositories/localhost/all/.duplicacy/scripts.

4 Likes

This is actually pretty easy to do on both Linux and Windows with a separate batch or sh script that runs all of your scripts in a row. On Windows this can also be done via the Task Scheduler to run one script after another at a certain time, and possibly with cron on Linux, but I haven't messed with cron jobs in a while.

1 Like

Ha. I didn't realize this was a how-to and not a feature request. Well, here's another option anyway.

1 Like

5 posts were split to a new topic: Will post-script run no matter what?

@james1 and I knocked up a simple pre-backup script that can check:

  • The machine is on AC power
  • Your storage server is reachable
  • You are connected to your home Wifi

The example script is here: A pre-backup script for using with Duplicacy backups · GitHub

The checks can be separately enabled / disabled, and if an enabled check fails the script exits with error code 1, which causes Duplicacy to abort the backup.
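As a rough illustration of the approach (the full script in the gist handles more than this; the host name, SSID and the on_ac_power / iwgetid utilities below are assumptions for the example):

#!/bin/sh
STORAGE_HOST="nas.local"     # placeholder for your storage server
HOME_SSID="MyHomeWifi"       # placeholder for your home Wifi name

# Abort the backup unless the machine is on AC power
on_ac_power || exit 1

# Abort unless the storage server answers a single ping within 5 seconds
ping -c 1 -W 5 "$STORAGE_HOST" > /dev/null 2>&1 || exit 1

# Abort unless the machine is connected to the home Wifi
[ "$(iwgetid -r)" = "$HOME_SSID" ] || exit 1

exit 0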

If you have several backup jobs (like me) and you want to use the same pre-backup script for each, you can create the pre-backup script once (I created mine in ~/.duplicacy-web/repositories/localhost/), and then symlink it into the individual backup script directories using this command:

sudo ln -s ~/.duplicacy-web/repositories/localhost/pre-backup ~/.duplicacy-web/repositories/localhost/0/.duplicacy/scripts/pre-backup

That way, if you want to change your pre-backup criteria, you only have to change it in the one script file.

I am using Duplicacy Web UI - if you're using the CLI / GUI version your script locations will vary.

7 Likes

The ~/.duplicacy-web/repositories/ folder is a temporary folder. It is generated based on the json file and the filters from the ~/.duplicacy-web/filters/ folder.

So, where could scripts be stored in the web version so that they persist? Something like a ~/.duplicacy-web/scripts/ folder? Or referenced in the json file?

1 Like

Right, you can put scripts under ~/.duplicacy-web/repositories/*/.duplicacy/scripts but they are not persistent. You'll need to save a copy somewhere else.

In the future the web GUI will support pre/post scripts per schedule.

1 Like

Maybe this area should be better documented (if per-schedule scripts won't be added to the Web UI soon).

It's hard to put the various bits and pieces together. Some threads (now locked) that rank higher in forum search results for pre/post scripts contain outdated info about script paths for Web UI users.

Elsewhere a "mega script" was suggested, but it can't run (or can it?) for Web UI jobs, so that'd mean having two sets of backup jobs (one for the CLI and another for the Web UI), or perhaps using the Web UI only for monitoring/checks.

Yet elsewhere a way to have pre- and post-scripts integrate better with Duplicacy was suggested but not committed (that PR was closed before it was merged).

What's the difference between "scripts for backup" and "all others" (pre-prune, etc.)? Does "all others" mean all scripts not related to a specific backup job/number (maybe to check for network connectivity, rather than anything related to the files, directories or targets involved in a backup job)?

Does that mean that on Linux any script file named pre-$STRING (no extension) would run before the corresponding command and any file named post-$STRING would run after it?

I created pre-test and post-test bash scripts, put them under both .duplicacy-web/repositories/localhost/{0,1}/.duplicacy/scripts/ and chmod'ed them to 777.
When I run them from a shell, they work (they echo the date to a text file). When I run jobs 0 and 1 from the Web UI, they don't get executed. In the job logs (available in the Web UI) I can see these jobs ran from the directories .duplicacy-web/repositories/localhost/{0,1}, but the scripts aren't mentioned in the logs, so judging by the (missing) script output and the backup logs, this doesn't appear to work.

3 Likes

Here's a wrapper script I'm using with the Web UI on a QNAP NAS to log messages for backup, copy, prune, check, and restore commands to the QNAP Notification Center and to healthchecks.io: DuplicacyLog

A wrapper script is currently easier to hook into the Web UI than pre/post scripts, and has access to the CLI command, output, and exit status.
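The linked script is QNAP-specific, but the general pattern is roughly this (it assumes the genuine binary has been renamed to duplicacy_real next to the wrapper; the log path and ping URL are placeholders):

#!/bin/sh
# Hypothetical wrapper installed under ~/.duplicacy-web/bin in place of the CLI,
# with the genuine binary renamed to duplicacy_real in the same directory.
REAL="$(dirname "$0")/duplicacy_real"

# Run the real CLI, capturing its combined output and exit status
OUTPUT="$("$REAL" "$@" 2>&1)"
STATUS=$?

# Pass the output back to the Web UI unchanged
printf '%s\n' "$OUTPUT"

# Log the invocation and result somewhere persistent
echo "$(date): duplicacy $* -> exit $STATUS" >> "$HOME/duplicacy-wrapper.log"

# Ping healthchecks.io, appending /fail on a non-zero exit status
PING_URL="https://hc-ping.com/your-uuid-here"
[ "$STATUS" -ne 0 ] && PING_URL="$PING_URL/fail"
curl -fsS -m 10 --retry 3 "$PING_URL" > /dev/null 2>&1

exit "$STATUS"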

1 Like

Is the "info" command the one that does not accept scripts?

When I try to restore data from the web interface, the "info" command that runs first does not load the pre-info script (required to access an additional storage through webdav-http):

Running /root/.duplicacy-web/bin/duplicacy_linux_x64_2.7.2 [-log -d info -repository /root/.duplicacy-web/repositories/localhost/all webdav-http://username@localhost:9092/]

This also happens with the CLI version when I run the info command.

I have tried all of the following script names:
pre
pre-
pre-info
pre-backup
pre-restore
pre-list
pre-check
pre-cat
pre-diff
pre-history
pre-prune
pre-password
pre-add
pre-set
pre-copy

The info command doesn't support pre/post scripts. Unlike other commands, it is not bound to a repository, so logically it won't look for those scripts.

1 Like

I am wondering if "info" could be made to run scripts, or maybe the "info" command could be replaced by "list" (which does trigger the scripts) within the web restore option. The fact that info does not run scripts clearly removes the possibility of restoring from setups that require a script to run (for instance, those using an additional third-party backend to access other storage, like in my case webdav-http storage on JottaCloud).

Keep improving this wonderful software. The web option is just amazing indeed :slight_smile:
I have been using this software for a while and I am always impressed by the amount of features you are adding. :+1:

You could use a wrapper script technique similar to the one described above:

Move the actual CLI executable from .../.duplicacy-web/bin to another directory and replace it with a custom script that checks the command line arguments for info and, if present, performs your pre-script actions, then execs* the actual CLI.

*On Linux, exec replaces the current process with the new executable, so the wrapper script does not need to handle output and exit status from, or signals to, the actual CLI. I'm not sure if there's an exec equivalent for Windows.
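A minimal sketch of that idea (the duplicacy_real name and the pre-info script path are placeholders, not anything Duplicacy prescribes):

#!/bin/sh
# Hypothetical replacement for the CLI under ~/.duplicacy-web/bin, with the
# genuine binary renamed to duplicacy_real in the same directory.
REAL="$(dirname "$0")/duplicacy_real"

# If the Web UI is about to run the info command, perform the pre-info actions
# first (the pre-info.sh path below is a placeholder for wherever you keep them)
for arg in "$@"; do
    if [ "$arg" = "info" ]; then
        /root/.duplicacy-web/pre-info.sh || exit 1
        break
    fi
done

# Replace this process with the real CLI; output, exit status and signals
# then pass straight through to the Web UI
exec "$REAL" "$@"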

2 Likes

Did you ever figure out how to properly use scripts with the Web UI? I am looking to set up a very minimal healthchecks.io bash script to run after each successful backup, to catch instances where the backups fail for some reason.

1 Like

You don't need a post-script for that... the Web UI has a 'Send report after completion' checkbox under each backup ID listed in the Backup tab - you can use this directly with healthchecks.io.

1 Like

Good point!

But I would then have to set up an email server (SMTP), which sounds a bit like shooting birds with a cannon. A one-line bash script with a single curl command is what I wanted to run.
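For example, something like this (the ping URL is a placeholder for my own check, and it assumes the post-backup script only runs when the backup succeeds):

#!/bin/sh
# Hypothetical post-backup script: ping healthchecks.io once the backup has finished
curl -fsS -m 10 --retry 3 https://hc-ping.com/your-uuid-here > /dev/null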

The option I'm talking about is the send report option under the Backup tab, not the send email option under the Schedule tab. Yeah, I know, it's not very obvious there's a difference. :slight_smile:

The former accepts a URL that you can use with healthchecks.io, and any schedule running those backup IDs will perform the ping.

[Screenshot: the 'Send report after completion' option in the Backup tab]

4 Likes

More solution than you need, and in this case unnecessary, but I set up a local mail server on a little Debian VM where each address corresponds to a forwarding script. It's handy when a particular service doesn't support my preferred notification method, since most support email.

If Duplicacy didn't support healthchecks.io, I'd set up healthcheck@debian.lan to trigger e.g. a Python script.

Have smartd send Discord notifications with an email to discord@debian.lan, etc.

It would be good if copy operations could also send healthchecks.io reports, as backup operations can, but I can't seem to find a way to enable that in the Web UI.

1 Like