Pre Command and Post Command Scripts

I would like a pre-backup script to fail if the backup is about to add > x MB or Yk files in a new snapshot. This is something I have to do in order to tweak and optimize the backup filters over time, because space on cloud storage is not free. Has anyone had any success checking, in an automated fashion, what's about to be added to the backup snapshots, either in terms of raw size or file count? Thanks

That sounds like a counterproductive idea: discarding data to fit a specific storage cost.

Data either needs to be backed up or it does not. The cost of cloud storage is an external factor that has no bearing on the value of the data.

In the grand scheme of things, in the context of data backup, the cost of storage is always negligible compared to the cost of the data it safeguards.

This is not true. Data that might be worth storing at $0.01/GB might not be worth storing at $100/GB. One size does not fit all.


That’s a straw man argument.

Both numbers you quoted are way too high for backup applications. In the real world we live in, the cost of backup storage is somewhere between $1 and $4 per TB per month, so the amount of data the majority of users have will cost well under $10. Splitting hairs to save a couple of bucks a month is highly, outrageously counterproductive, even at minimum wage.

If you are saying “but there may be some users who can’t decide whether they need to back up those 100TB they have lying around” — those users are a minority and they already know what to do; they definitely would not be asking for data management advice on the duplicacy forum.

And yes, one size definitely fits the vast majority. The outliers usually don’t need forum advice either: they store very little or they store petabytes, and at that entirely different scale everything changes.


To answer the question (or at least point you in the right direction)… you can’t do this directly with Duplicacy, but you could probably run a diff or backup -dry-run and parse the output.

You might not be able to cancel the job with a Duplicacy managed pre-script, but you could probably write a standalone script using these commands and abort the final run if it exceeds a threshold.
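A standalone wrapper along those lines could look like the sketch below. The `-dry-run` and `-stats` flags do exist in the Duplicacy CLI, but the `^Uploaded ` log-line prefix used for counting new files is an assumption — check the actual dry-run output of your CLI version and adjust the pattern (and the threshold) to match:

```shell
#!/bin/sh
# Count lines in a dry-run log that look like new-file uploads.
# The '^Uploaded ' prefix is an ASSUMED log format -- verify it
# against your CLI version's actual output before relying on it.
count_new_files() {
    grep -c '^Uploaded ' || true
}

run_guarded_backup() {
    max_new_files=$1
    log=$(mktemp)
    # Dry run first: nothing is uploaded, but the log shows what would be.
    duplicacy backup -dry-run -stats > "$log" 2>&1
    n=$(count_new_files < "$log")
    rm -f "$log"
    if [ "$n" -gt "$max_new_files" ]; then
        echo "Aborting: $n new files exceeds limit of $max_new_files" >&2
        return 1
    fi
    # Threshold respected: do the real backup.
    duplicacy backup -stats
}

# Only attempt the real run when the CLI is actually installed.
if command -v duplicacy >/dev/null 2>&1; then
    run_guarded_backup 1000
fi
```

A size threshold would work the same way, summing a byte count parsed from the stats summary instead of counting lines.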

4 posts were split to a new topic: Low cost / Archival storage discussion

How do you do this for backups in the WebUI?

I use this CLI wrapper script.

Thanks. Is there any equivalent for vanilla Linux or even Unraid?

Hi, could you please add this feature to Duplicacy Web 1.8.0? It would be wonderful.

Thanks.


This would be great for me too as I have remote storage.

Can we get an idea when this feature is planned to be added?


Hi, this will send the ping when the backup is complete, regardless of whether it completed with warnings, correct?

Healthcheck.io is designed to notify you when it doesn’t get pinged (e.g. computer is offline) or if the job fails.

Yes, my question was about when Duplicacy considers a job failed. From other discussions, it looks like a backup with warnings is still flagged as successful with exit code 0. So we receive a notification that everything is fine, until we take a closer look at the logs.

Yeah, to overcome this I had to use the job scheduler to send emails to Healthchecks and then parse the emails for failure or success.
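An alternative to email parsing is to drive the backup from your own wrapper and ping Healthchecks explicitly, treating WARN lines in the log as failures. A sketch, with two assumptions to adapt: the ` WARN ` match relies on the CLI's timestamped log format shown elsewhere in this thread, and `HC_URL` is a placeholder for your own check's ping URL (Healthchecks accepts a `/fail` suffix and a POSTed log body):

```shell
#!/bin/sh
# Decide which Healthchecks endpoint suffix to use: "/fail" when the
# exit status is non-zero OR the log contains a WARN line, "" otherwise.
ping_suffix() {
    status=$1
    logfile=$2
    if [ "$status" -ne 0 ] || grep -q ' WARN ' "$logfile"; then
        echo "/fail"
    else
        echo ""
    fi
}

# Placeholder -- substitute your own check's ping URL.
HC_URL="https://hc-ping.com/your-uuid-here"

if command -v duplicacy >/dev/null 2>&1; then
    log=$(mktemp)
    duplicacy backup -stats > "$log" 2>&1
    suffix=$(ping_suffix $? "$log")
    # Sending the log as the POST body makes the failure reason
    # visible on the check's detail page.
    curl -fsS -m 10 --data-binary @"$log" "$HC_URL$suffix" > /dev/null
    rm -f "$log"
fi
```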

6 years later and still no support for script configuration in the GUI.
I guess it’s safe to say it’s never coming.


Same issue here. I’ve tried various things, from placing them in /cache/localhost/all/scripts to placing them in every individual /cache/localhost/{0,4}/scripts folder. They are called “pre-backup” and “post-backup” with no extension and have chmod +x applied. Yet in the backup logs I only see this:

2025-02-07 11:36:29.567 INFO SNAPSHOT_FILTER Parsing filter file /cache/localhost/1/.duplicacy/filters
2025-02-07 11:36:29.567 INFO SNAPSHOT_FILTER Loaded 0 include/exclude pattern(s)

So it’s definitely looking in the correct directory, but it’s simply not looking for, nor executing, the scripts.
This is actually a pretty big dealbreaker. I moved away from Duplicati, but there the pre- and post-scripts did work from the UI…

Edit: OK, it seems I have figured it out:

The exact path is cache/localhost/{id}/.duplicacy/scripts, where {id} is the ID of the backup: 0, 1, 2,…
The names of the scripts are pre-backup and post-backup, without extension. And obviously should have chmod +x.

These scripts need to be in this directory for each and every backup. Putting them in cache/localhost/all/.duplicacy/scripts does NOT work (which I consider to be a bug). This approach works for the WebUI version.

It’s not a bug: all is where non-backup operations (prune, check, etc.) run from, while the numbered directories are where the backups run for each repository; the index of each can be found in duplicacy.json.
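Since the scripts have to be duplicated into every numbered backup directory, a small installer script saves the repetition. This is a sketch based on the paths described above; the /cache/localhost path is from this thread's Docker-style setup, so adjust it to wherever your web UI keeps its cache, and run it from the directory holding your pre-backup/post-backup files:

```shell
#!/bin/sh
# Copy pre-backup/post-backup from the current directory into the
# scripts folder of every numbered backup directory under the cache.
install_scripts() {
    cache=$1
    for dir in "$cache"/[0-9]*/.duplicacy; do
        [ -d "$dir" ] || continue   # skip when the glob matched nothing
        mkdir -p "$dir/scripts"
        cp pre-backup post-backup "$dir/scripts/"
        chmod +x "$dir/scripts/pre-backup" "$dir/scripts/post-backup"
        echo "installed into $dir/scripts"
    done
}

# Path from this thread; on a non-Docker install the cache may live
# elsewhere (e.g. under ~/.duplicacy-web).
install_scripts /cache/localhost
```

Re-run it after adding a new backup, since that creates a fresh numbered directory without the scripts.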


Still no support for pre/post backup scripts in the web GUI? I almost purchased the license, but I guess I’ll just have to use the free CLI version given how much gymnastics is required to set up the pre/post scripts when using the GUI.

It’s only free for personal use. See Duplicacy