Some thoughts on the WebUI

This is not as easy as it sounds.

In fact, literally just this second, I checked the Duplicacy web edition icon in the tray next to the clock. I hovered over the icon and it disappeared. Duplicacy was, in fact, not running.

This is an issue I’ve been seeing for the past several weeks but have been unable to put a finger on. My PC is on 24/7 and I leave it logged in. Every week or two I’ll notice Duplicacy isn’t running. Why? I have no idea! The Windows event logs don’t show any crashes. My last successful backup today was at 12:15, but the 14:15 (and 16:15 and 18:15) backups didn’t run according to the dashboard. Sometimes a day or two will pass before I notice it isn’t running. Luckily, today I only missed a few hours.

Anyway, the point I’m trying to make… Duplicacy itself can’t tell you when something like this goes wrong if it isn’t even running. :slight_smile:

Hence the purpose of an external service like healthchecks.io. I recently set up an alert for a customer’s system running Vertical Backup (Duplicacy’s brother for ESXi hosts), with a pre-script that pings a given URL. If it doesn’t run, I get an email notification. I may have to do the same on my personal machine.
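For anyone wanting to do the same, the pre-script can be as small as a single HTTP GET. A minimal sketch in Python, assuming a placeholder check UUID (healthchecks.io assigns each check its own hc-ping.com URL, and also accepts `/start` and `/fail` suffixes):

```python
import urllib.request

# Placeholder UUID -- substitute the ping URL of your own check.
PING_URL = "https://hc-ping.com/your-check-uuid"

def ping_url(kind=""):
    """Build the ping URL; kind may be "", "start", or "fail"."""
    return PING_URL + ("/" + kind if kind else "")

def ping(kind=""):
    """Fire the ping. A monitoring hiccup must never break the backup itself,
    so network errors are swallowed."""
    try:
        urllib.request.urlopen(ping_url(kind), timeout=10)
    except OSError:
        pass

if __name__ == "__main__":
    ping()  # plain success ping, e.g. from a post-backup script
```

Run `ping("start")` from a pre-script and `ping()` from a post-script; if the check goes quiet, the service mails you.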

3 Likes

This is not my point. If a program crashes, it obviously cannot alert the user. But if the backup medium is missing, the user should be alerted. There is an easy solution to this: a user-defined threshold. The check would look at the last successful backup, measure the time that has passed since then, and send an email if the value is too high.
This would also help if Duplicacy crashes entirely: at the next start, an email could be triggered noting that the last backup is too old.
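The proposed threshold check could be sketched roughly like this (the stamp file path, SMTP host, and addresses are all placeholders; the stamp file is assumed to be written by a post-backup script):

```python
from datetime import datetime, timedelta
import smtplib
from email.message import EmailMessage

THRESHOLD = timedelta(hours=6)        # user-defined maximum backup age
STAMP_FILE = "last_backup.txt"        # assumed content, e.g. "2019-06-11 12:15:00"

def is_overdue(last_backup, now, threshold=THRESHOLD):
    """True when more than `threshold` has passed since the last success."""
    return now - last_backup > threshold

def check_and_alert():
    # Read the time of the last successful backup and mail if it is too old.
    last = datetime.strptime(open(STAMP_FILE).read().strip(),
                             "%Y-%m-%d %H:%M:%S")
    if is_overdue(last, datetime.now()):
        msg = EmailMessage()
        msg["Subject"] = f"Backup overdue: last success at {last}"
        msg["From"], msg["To"] = "backup@example.com", "admin@example.com"
        msg.set_content("No successful backup within the configured threshold.")
        with smtplib.SMTP("smtp.example.com") as s:  # placeholder SMTP host
            s.send_message(msg)
```

Scheduling `check_and_alert()` independently of Duplicacy (cron, Task Scheduler) is what makes it catch a crashed Duplicacy as well.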

1 Like

I like this idea and I definitely think notifications could be finer grained.

1 Like

I could try to write a script, but for that I would need a timestamp format that can be compared against the current time. Any ideas how to do this?
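One way to put a timestamp in relation to the current time, assuming the `YYYY/MM/DD HH:MM:SS` format that duplicacy_web.log lines use:

```python
from datetime import datetime

def age_in_hours(stamp, now=None):
    """Hours elapsed between a log timestamp and now."""
    then = datetime.strptime(stamp, "%Y/%m/%d %H:%M:%S")
    now = now or datetime.now()
    return (now - then).total_seconds() / 3600

# e.g. age_in_hours("2019/06/11 12:16:23") evaluated at
# 2019/06/11 18:16:23 -> 6.0
```

From there, comparing the result against a threshold is a one-liner.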

Normally when the web GUI crashes it will leave a stack trace in ~/.duplicacy-web/logs/duplicacy_web.log. If you can’t find anything there, maybe you can try ProcDump? This page has instructions on how to do that for a Windows service, but it should work the same way for the web GUI:

2 posts were merged into an existing topic: How to pronounce duplicacy?

Thanks for that.

Looks like I’ll give ProcDump a go.

The duplicacy_web.log doesn’t show anything untoward, but the last section of the log, from when I last saw that Duplicacy had crashed, only showed this:

2019/06/11 12:16:23 Schedule Regular next run time: 2019-0611 14:15
2019/06/11 14:15:01 Starting schedule Regular at scheduled time 2019-0611 14:15
2019/06/11 14:15:01 Created log file C:\Users\Droolio/.duplicacy-web/logs/backup-20190611-141501.log
2019/06/11 14:15:01 Running C:\Users\Droolio/.duplicacy-web/bin/duplicacy_win_x64_2.2.1.exe [-log backup -storage NAS -vss -stats]
2019/06/11 14:15:01 Set current working directory to C:\Users\Droolio/.duplicacy-web/repositories/localhost/0
2019/06/11 18:53:38 Duplicacy CLI 2.2.1

So the ‘Created log file’, ‘Running’ and ‘Set current working directory’ entries for repo ‘0’ are missing, along with the same entries for repos 1, 2 and 3, plus a final ‘Schedule Regular next run time’. It seems to be crashing when trying to run the first job in the schedule.

I’ll try out ProcDump.

Instead of having to deal with https://healthchecks.io, it would be great if this were a service that Duplicacy offered as part of the yearly license fee: if Duplicacy has not received a backup ping in 1-3 days, they would send you an email. It wouldn’t need to support the CLI version, since that is free, but it should be supported out of the box for Web.

1 Like

Welcome to the forum, “2”!
Personally, I do not think it would be a good idea for the developer to split his resources to build what is essentially a completely new project.

But it might be possible to provide an option to send a report to one of the services that already exist. Some are free for personal use, and some might work better for commercial users.

We have already discussed https://healthchecks.io in other posts.
Another service is https://www.duplicati-monitoring.com, which as the name implies, is designed for Duplicati users, but they will not shun other users. (I use both)

I wanted to check the conditions for using this in my own (Duplicacy) scripts, so I took the liberty of contacting the vendor, mittelstandsoptimierer.de.

@gchen, I got this reply from the developers of https://www.duplicati-monitoring.com:

I don’t know how this fits in your plans for the GUI/CLI, but I imagine it could be possible to use this to provide backup monitoring and some statistics which some users might want.

This would be one way to achieve this without users having to script too much.

(And if CLI could have an option to write a JSON report, CLI users could also use it)

akvarius


From: support@mittelstandsoptimierer.de
Sent: 22 August 2019 12:42
Subject: About Monitoring service for Duplicati - Will it allow other backup products?

Hi,

thank you for your email.

1. Do Mittelstandsoptimierer have plans to create a similar site for Duplicacy? (and would it be free for personal use?)

There is no definitive plan but we may do so. Of course we would appreciate if you could help us. Do you know any options in Duplicacy to get the backup reports? HTTP? Mails?
I guess a monitoring service for Duplicacy would also be donation-based.

2. If not, could it be allowed for Duplicacy users to use Duplicati-Monitoring?
(I could possibly write a script to deliver Duplicacy data in Duplicati-like
format)

Yes, you can use our service with any backup software or script that is compatible. We see that some people already use other software and we have no intentions to stop them. For example, somebody seems to be using Arq backup (https://www.arqbackup.com/) with our service.
But of course we cannot guarantee that our service will keep working with other software that we don’t officially support.

3. If this is something you are considering, I could request the Duplicacy developer to add a feature for sending a http report for Duplicacy much like Duplicati does. (The format would be different but well defined)

That would be perfect; then it would not be a big deal to adjust our service.

[…]

If Duplicacy does not implement something Duplicati-compatible, I would prefer it to export the data as JSON over HTTP. What Duplicati does looks like JSON but is not exactly JSON, and thus is not that parser-friendly. Also, I would prefer time values to be UNIX timestamps.

Greetings,
Christopher

2 Likes

@akvarius I didn’t know duplicati-monitoring.com could support other tools. This is an interesting option and I’ll definitely look into how to generate the required JSON data.

3 Likes

Hey guys, I am the lead developer of duplicati-monitoring.com. If I can help you in any way, feel free to ask.

Greetings,
Christopher

5 Likes

@ChristopherK do you have a spec on what data Duplicacy should export after a backup is finished?

This is what I would propose:

After finishing a backup (or any other operation), the backup software (Duplicacy) sends a POST request over HTTP(S) to a URL that the user can configure in the backup software. This URL contains an ID that the backup monitoring service generated and uses to assign received reports to backup sets configured online. It also contains some secret which authenticates the backup software for posting backup reports for this backup set.

The POST request contains JSON-encoded data about the backup (or other operation) that has just run.

The JSON is an array or object with the following attributes, some of them mandatory, all others optional:

  • result (mandatory): Result of the backup run; one of the strings “Failed”, “Success”, “Warning”, “Error”

  • deletedFiles: Number of deleted files since last backup run (int)

  • deletedFolders: Number of deleted folders since last backup run (int)

  • modifiedFiles

  • modifiedFolders

  • examinedFiles

  • openedFiles

  • addedFiles

  • sizeOfModifiedFiles (mandatory): bytes modified (int)

  • sizeOfAddedFiles (mandatory): bytes added (int)

  • sizeOfExaminedFiles (mandatory) : total size of the backup set (on the source system) (int)

  • sizeOfOpenedFiles

  • notProcessedFiles

  • addedFolders

  • tooLargeFiles

  • filesWithError

  • modifiedSymlinks

  • addedSymlinks

  • deletedSymlinks

  • partialBackup: bool, i.e. “False” or “True”

  • dryrun: bool

  • mainOperation: what this report is about, e.g. “Backup”, “Repair”, “Check”, “Restore”

  • verboseOutput: bool

  • verboseErrors: bool

  • version: Name and version of Backup Software (String)

  • endTime (mandatory): Timestamp when the backup ended

  • beginTime (mandatory): Timestamp when the backup started

  • duration: how long did the backup take? Days.hours:minutes:seconds

  • messages: any kind of info messages

  • warnings: any kind of warning messages

  • errors: any kind of error messages

  • log: any kind of log, e.g. a stack trace in case of error

This is the list of fields we currently store for Duplicati backup reports. If there is anything else you think is important for Duplicacy, feel free to suggest additional fields. Useful things may be:

  • nextBackupTime: timestamp when the next planned backup should run

  • targetBackupSize: size in bytes at the target storage

  • backupVersions: how many different versions of the data does the backup currently hold

Feel free to propose whatever you think makes sense.
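As a sanity check on the proposal above, a minimal sender could look like this sketch. The endpoint URL (with its ID and secret) is a placeholder that the monitoring service would generate, and only the mandatory fields plus a few optional ones are filled in, using the UNIX timestamps preferred earlier in this thread:

```python
import json
import urllib.request

# Placeholder URL: the monitoring service generates the <id> and <secret> parts.
REPORT_URL = "https://monitoring.example.com/report/<id>/<secret>"

def build_report(result, begin, end, modified, added, examined):
    """Assemble a report dict with the mandatory fields of the proposed schema."""
    return {
        "result": result,                 # "Success", "Warning", "Error", "Failed"
        "beginTime": begin,               # UNIX timestamp, backup start
        "endTime": end,                   # UNIX timestamp, backup end
        "sizeOfModifiedFiles": modified,  # bytes modified
        "sizeOfAddedFiles": added,        # bytes added
        "sizeOfExaminedFiles": examined,  # total size of the backup set
        "version": "Duplicacy CLI 2.2.1", # name and version of backup software
        "mainOperation": "Backup",
    }

def send_report(report, url=REPORT_URL):
    """POST the JSON-encoded report to the configured URL."""
    req = urllib.request.Request(
        url,
        data=json.dumps(report).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.status
```

Since the secret rides in the URL, HTTPS would be a must in practice.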

Greetings!
Christopher

4 Likes

Hi!
Any updates on this? I am also very interested in something that triggers after a backup in duplicacy_web.

@gchen
Any progress on sending a POST request via HTTP(S) to a URL after a backup?
Do you plan to implement it soon?
I am very interested in setting up large-scale monitoring of all my future clients.
Thank you for being so collaborative.

I plan to release a new version next week that will include this feature.

3 Likes

Thank you
I’ll be waiting impatiently.

2 Likes

Chrome dark mode, btw. Nice?

1 Like

Is the next version still coming this week? Or has the ETA been bumped?