Some thoughts on the WebUI


Thanks for that.

Looks like I’ll give ProcDump a go.

The duplicacy_web.log doesn’t show anything untoward, but the last section of the log when I last saw Duplicacy had crashed, only showed this:

2019/06/11 12:16:23 Schedule Regular next run time: 2019-06-11 14:15
2019/06/11 14:15:01 Starting schedule Regular at scheduled time 2019-06-11 14:15
2019/06/11 14:15:01 Created log file C:\Users\Droolio/.duplicacy-web/logs/backup-20190611-141501.log
2019/06/11 14:15:01 Running C:\Users\Droolio/.duplicacy-web/bin/duplicacy_win_x64_2.2.1.exe [-log backup -storage NAS -vss -stats]
2019/06/11 14:15:01 Set current working directory to C:\Users\Droolio/.duplicacy-web/repositories/localhost/0
2019/06/11 18:53:38 Duplicacy CLI 2.2.1

So it’s missing the ‘Created log file’, ‘Running’, and ‘Set current working directory’ lines for repo ‘0’, plus the same for repos 1, 2 and 3, plus a final ‘Schedule Regular next run time’. It seems to be crashing when trying to run the first job in the schedule.

I’ll try out ProcDump.

Instead of having to deal with this ourselves, it would be great if this were a service that Duplicacy offered as part of the yearly license fee: if Duplicacy has not received a backup ping in 1-3 days, they would send you an email. It wouldn’t need to support the CLI version, since that is free, but it should be supported out of the box for the Web edition.


Welcome to the forum!
Personally I do not think it would be a good idea for the developer to split his resources to make what is essentially a completely new project.

But it might be possible to provide an option to send a report to one of the services that already exist. Some are free for personal use, and some might work better for commercial users.

We have already discussed this in other posts.
Another service is Duplicati-Monitoring, which, as the name implies, is designed for Duplicati users, but they will not shun other users. (I use both.)

I wanted to check the conditions for using this in my own (Duplicacy) scripts, so I took the liberty of contacting the vendor.

@gchen, I got this reply from the developers of Duplicati-Monitoring:

I don’t know how this fits in your plans for the GUI/CLI, but I imagine it could be possible to use this to provide backup monitoring and some statistics which some users might want.

This would be one way to achieve this without users having to script too much.

(And if CLI could have an option to write a JSON report, CLI users could also use it)


Sent: 22 August 2019 12:42
Subject: About Monitoring service for Duplicati - Will it allow other backup products?


Thank you for your email.

1. Does Mittelstandsoptimierer have plans to create a similar site for Duplicacy? (And would it be free for personal use?)

There is no definitive plan but we may do so. Of course we would appreciate if you could help us. Do you know any options in Duplicacy to get the backup reports? HTTP? Mails?
I guess a monitoring service for Duplicacy would also be donation-based.

2. If not, could Duplicacy users be allowed to use Duplicati-Monitoring?
(I could possibly write a script to deliver Duplicacy data in a Duplicati-like format.)

Yes, you can use our service with any backup software or script that is compatible. We see that some people already use other software and we have no intention of stopping them. For example, somebody seems to be using Arq backup with our service.
But of course we cannot guarantee that our service will keep working with other software that we don’t officially support.

3. If this is something you are considering, I could request the Duplicacy developer to add a feature for sending an HTTP report for Duplicacy, much like Duplicati does. (The format would be different but well defined.)

That would be perfect; then it would not be a big deal to adjust our service.


If Duplicacy does not implement something that is Duplicati-compatible, I would prefer if it would export the data as JSON over HTTP. What Duplicati does looks like JSON, but is not exactly JSON, and thus is not that parser-friendly. Also, I would prefer if time-values would be UNIX timestamps.



@akvarius I didn’t know Duplicati-Monitoring can support other tools. This is an interesting option and I’ll definitely look into how to generate the required JSON data.


Hey guys. I am the lead developer of Duplicati-Monitoring. If I can help you in any way, feel free to ask.



@ChristopherK do you have a spec on what data Duplicacy should export after a backup is finished?

This is what I would propose:

After finishing a backup (or any other operation), the backup software (Duplicacy) sends a POST request over HTTP(S) to a URL that the user can configure in the backup software. This URL contains an ID that the backup monitoring service generated and uses to assign received reports to backup sets configured online. It also contains some secret which authenticates the backup software for posting backup reports for this backup set.
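As a rough illustration of that flow (the URL shape, hostname, and function name here are hypothetical, not the actual service's), the report POST could be built like this:

```python
import json
import urllib.request

def build_report_request(report_url: str, payload: dict) -> urllib.request.Request:
    """Build a POST request carrying a JSON backup report.

    report_url is the user-configured URL; in this sketch it already embeds
    the backup-set ID and the authentication secret, e.g.
    https://monitor.example.com/report/<set-id>/<secret> (hypothetical).
    """
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        report_url,
        data=body,  # supplying data makes urllib issue a POST
        headers={"Content-Type": "application/json"},
    )

req = build_report_request(
    "https://monitor.example.com/report/1234/s3cret",  # hypothetical URL
    {"result": "Success"},
)
print(req.get_method())  # POST
```

Sending would then be a `urllib.request.urlopen(req)` call; the request object alone is enough to show the shape of the scheme.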

The POST Request contains JSON encoded data about the backup (or other operation) that has just been running.

The JSON is an array or object with the following attributes, some of them mandatory, all others optional:

  • result (mandatory): Result of the backup run; one of the strings “Failed”, “Success”, “Warning”, “Error”

  • deletedFiles: Number of deleted files since last backup run (int)

  • deletedFolders: Number of deleted folders since last backup run (int)

  • modifiedFiles

  • modifiedFolders

  • examinedFiles

  • openedFiles

  • addedFiles

  • sizeOfModifiedFiles (mandatory): bytes modified (int)

  • sizeOfAddedFiles (mandatory): bytes added (int)

  • sizeOfExaminedFiles (mandatory) : total size of the backup set (on the source system) (int)

  • sizeOfOpenedFiles

  • notProcessedFiles

  • addedFolders

  • tooLargeFiles

  • filesWithError

  • modifiedFolders

  • modifiedSymlinks

  • addedSymlinks

  • deletedSymlinks

  • partialBackup: bool, i.e. “False” or “True”

  • dryrun: bool

  • mainOperation: what this report is about, e.g. “Backup”, “Repair”, “Check”, “Restore”

  • verboseOutput: bool

  • verboseErrors: bool

  • version: Name and version of Backup Software (String)

  • endTime (mandatory): Timestamp when the backup ended

  • beginTime (mandatory): Timestamp when the backup started

  • duration: how long did the backup take? Days.hours:minutes:seconds

  • messages: any kind of info messages

  • warnings: any kind of warning messages

  • errors: any kind of error messages

  • log: any kind of log, e.g. a stack trace in case of error

This is the list of fields we currently store for Duplicati backup reports. If there is anything else you think is important for Duplicacy, feel free to suggest additional fields. Useful things may be:

  • nextBackupTime: timestamp when the next planned backup should run

  • targetBackupSize: size in bytes at the target storage

  • backupVersions: how many different versions of the data does the backup currently hold

Feel free to propose whatever you think makes sense.
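To make the proposal above concrete, here is a minimal sketch (the function name is illustrative; the field names are the ones from the list above) that assembles a report with just the mandatory fields and serializes it:

```python
import json
import time

# Mandatory fields from the proposed spec above.
MANDATORY = {
    "result",               # "Failed" | "Success" | "Warning" | "Error"
    "sizeOfModifiedFiles",  # bytes modified (int)
    "sizeOfAddedFiles",     # bytes added (int)
    "sizeOfExaminedFiles",  # total size of the backup set (int)
    "beginTime",            # timestamp when the backup started
    "endTime",              # timestamp when the backup ended
}

def make_report(result, modified, added, examined, begin, end, **optional):
    """Assemble a report dict with the mandatory fields plus any optional ones."""
    report = {
        "result": result,
        "sizeOfModifiedFiles": int(modified),
        "sizeOfAddedFiles": int(added),
        "sizeOfExaminedFiles": int(examined),
        "beginTime": int(begin),
        "endTime": int(end),
        **optional,
    }
    missing = MANDATORY - report.keys()
    if missing:
        raise ValueError(f"missing mandatory fields: {missing}")
    return report

now = int(time.time())
report = make_report("Success", 0, 4096, 123456, now - 60, now,
                     mainOperation="Backup", version="Duplicacy 2.2.1")
print(json.dumps(report, indent=2))
```

The resulting JSON object is what the POST body described earlier would carry.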



Any updates on this? I am also very interested in something to trigger after a backup for duplicacy_web.

Any progress on sending a POST request via HTTP(S) to a URL after a backup?
Do you plan to implement it soon?
I am very interested in setting up large-scale monitoring of all my future clients.
Thank you for being so collaborative.

I plan to release a new version next week that will include this feature.


Thank you
I’m eagerly waiting.


Chrome dark mode, btw. Nice?


Is the next version still coming this week? Or has the ETA been bumped?

Unfortunately it won’t be available this week. I got stuck on the HTTP reporting for a few days, and then there were small fixes here and there. I’ll try to wrap it up next week.


Sorry to revive an old post, but I tried to implement your suggestions in a manual JSON request and it did not work.

Does the endpoint already accept pure JSON or is it restricted to the Duplicati report format?


HTTP reporting is documented in this post: New Feature: JSON backup report

I’ve tried to create my own template, which is:

"result": "{{.result}}",
"sizeOfModifiedFiles": "0",
"sizeOfAddedFiles": {{.new_file_size}},
"sizeOfExaminedFiles": "0",
"endTime": {{.end_time}},
"beginTime": {{.start_time}}

with just the mandatory fields, and it looks to be sent successfully, but on the duplicati-monitoring side it’s still showing result: unknown. Is there anything obvious I’m missing?

2021/07/26 14:20:20 Sending report to {
"result": "Success",
"sizeOfModifiedFiles": "0",
"sizeOfAddedFiles": 0,
"sizeOfExaminedFiles": "0",
"endTime": 1627305620,
"beginTime": 1627305618
2021/07/26 14:20:21 Backup report has been posted to; response status: 201 Created
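One thing worth double-checking against the proposed spec earlier in this thread: the size fields in that payload are posted as strings ("0") where the spec lists them as ints. Whether that explains the unknown result is only a guess, but a quick local check like this sketch (not part of Duplicacy; the function name is made up) would catch the mismatch:

```python
import json

# Expected types for the mandatory fields in the proposed spec above.
EXPECTED = {
    "result": str,
    "sizeOfModifiedFiles": int,
    "sizeOfAddedFiles": int,
    "sizeOfExaminedFiles": int,
    "beginTime": int,
    "endTime": int,
}

def check_report(raw: str) -> list:
    """Return a list of problems found in a JSON report string."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"not valid JSON: {e}"]
    problems = []
    for field, typ in EXPECTED.items():
        if field not in data:
            problems.append(f"missing mandatory field: {field}")
        elif not isinstance(data[field], typ):
            problems.append(
                f"{field}: expected {typ.__name__}, got {type(data[field]).__name__}"
            )
    return problems

# The payload from the log above (note the quoted "0" values):
sent = '''{
  "result": "Success",
  "sizeOfModifiedFiles": "0",
  "sizeOfAddedFiles": 0,
  "sizeOfExaminedFiles": "0",
  "endTime": 1627305620,
  "beginTime": 1627305618
}'''
for problem in check_report(sent):
    print(problem)
# flags sizeOfModifiedFiles and sizeOfExaminedFiles as strings, not ints
```

If the service is strict about types, replacing `"0"` with `0` in the template would make those fields integers.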

Has anyone implemented reporting to Duplicati-Monitoring successfully?
I am migrating from Duplicati to Duplicacy :wink: and liked the daily report mails of my backups with the above service.

I found others asking the same questions in different posts too, so I thought linking them is a good idea: