Back up a directory to 2 storage locations with GUI

I have a simple situation: I want to back up my documents folder to my local NAS server and to B2 cloud storage.
The problem is that all the settings are stored in that .duplicacy folder in the source location. That means when I create a second job with the same source, the first job is completely overwritten.

Am I missing something? Is there no way to back up to two different locations?
Preferably I would like to use the GUI since that is what I bought…

Have you looked at the documentation?

I’ve read all the documentation available for the GUI, which is practically non-existent.
Have you read the links you provided? Because they don’t answer my question at all.

What I did as a workaround is to create a “proxy” directory as a repository in the GUI as a second job. Then I manually modified the properties file to point the source to where I actually want to back up (i.e. where the first job is already set up). In this way I can trick the GUI into backing up the files to a second location via the second job.
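For reference, the settings the workaround edits live in a small JSON file (the CLI keeps the equivalent in `.duplicacy/preferences`, an array with one entry per storage). A rough sketch of one such entry, with all IDs and URLs invented:

```json
[
    {
        "name": "default",
        "id": "mydocs",
        "storage": "sftp://user@nas/duplicacy",
        "encrypted": false,
        "no_backup": false,
        "no_restore": false,
        "no_save_password": false,
        "keys": null
    }
]
```

Note that with the CLI, `duplicacy add` appends a second entry to this array rather than overwriting the first, which is why the overwrite problem is specific to the old GUI.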

This is a terribly hacky and crude way to accomplish something as simple as a two-location backup (local/cloud). That should be a native feature, not a workaround.

Don’t suggest that I just use the CLI exclusively, I’ve paid for the GUI and I expect a functioning product for my money.

Well, I’m a CLI user, and I have very little experience with the GUI version, but maybe what you are looking for is this option to right click the tab bar:

Since version 2.1.0, Duplicacy supports multiple backup jobs. You can right click the tab bar to activate the job management menu to create new jobs or delete existing jobs. Alternatively, you can also click the Duplicacy icon in the menu bar (for macOS) or right click the Duplicacy icon in the system tray (for Windows) to access the same menu.

Source: Duplicacy User Guide

I don’t understand; there is good content on the pages linked above.

You don’t understand the question. I’m not asking how to make a second job, that is obvious.

  • The GUI stores the .duplicacy folder in the repository location.
  • If you make a second job with the same repository location (i.e. to back up to a second location) it OVERWRITES the .duplicacy folder, breaking the first job.

Not sure if that’s true (as I am not using the GUI).
But I think what you want to do is to create a copy job, which just copies from the first storage to the second. Does that work?

Note that copy is what I use on the CLI.
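In case it helps, here is a rough sketch of what that looks like in the CLI (all storage names, snapshot IDs, and URLs are made up; the `-copy` flag makes the second storage copy-compatible with the first):

```shell
# One-time setup: initialize the repository against the primary (NAS) storage
cd ~/Documents
duplicacy init mydocs sftp://user@nas/duplicacy

# Add B2 as a second, copy-compatible storage named "b2"
duplicacy add -copy default b2 mydocs b2://my-bucket

# Back up to the NAS, then replicate the existing revisions to B2
duplicacy backup -storage default
duplicacy copy -from default -to b2
```

The point of `-copy` is that both storages share the same chunking configuration, so `copy` can move revisions between them without re-chunking.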

I can assure you that it is true, the second job changes the preferences file stored in the repository, making the first created job have the same destination as the second.

A copy job would be an option, if that was implemented in the GUI.

But that still wouldn’t be an ideal option. Due to bandwidth considerations I would want to implement tiered backups: local storage every day, remote cloud storage every few days. For scheduling purposes, I would have to set up 2 different jobs in the GUI. I’m not even sure if it would be possible with the CLI version…
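With the CLI this kind of tiering is just two scheduled entries. A hypothetical crontab, assuming two storages named `default` (NAS) and `b2` (cloud) were set up with `duplicacy add`:

```shell
# Daily at 02:00: back up to the local NAS storage
0 2 * * * cd /home/me/Documents && duplicacy backup -storage default

# Every third day at 03:00: replicate existing revisions to the cloud storage
0 3 */3 * * cd /home/me/Documents && duplicacy copy -from default -to b2
```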

I think what you want to do is possible in the web gui. It’s still in beta as more and more things are added, but you may want to give that a try and see if it does the job.

Maybe I’m too new around here…where does one get the web GUI?


You use the search button, of course :stuck_out_tongue:

Silly me, expecting to find it on the main Duplicacy website or download page…

I think I’ll wait until the 1.0 release. I came to Duplicacy from Duplicati to try and get away from all the perpetual beta bugs.


You can start using the command-line duplicacy, and then perhaps realize that you don’t actually need a UI in the first place. All you need is to configure a bunch of backups and then schedule periodic runs with the task scheduler on your OS. There is literally nothing else to it.
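For example, on Windows a scheduled run can be created with Task Scheduler from the command line; a sketch, with the path invented:

```shell
# Run "duplicacy backup" nightly at 02:00 via the Windows Task Scheduler
schtasks /Create /SC DAILY /ST 02:00 /TN "Duplicacy backup" ^
  /TR "cmd /c cd /d C:\Users\me\Documents && duplicacy backup"
```

On Linux or macOS the equivalent is a crontab or launchd entry pointing at the same command.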

A few years back I failed to find a backup tool (to replace CrashPlan) that does not fail at actually doing backups, so while a GUI is nice to have, it is not a priority, and there turned out to be only one backup tool that works in the first place: this one.

I can badmouth every single backup tool in existence, backed with specific horrific failure examples, including (and especially) the Duplicati you mentioned, but I don’t think that would be an ethical thing to do on this specific forum.

That said, the Duplicacy CLI is straightforward and flexible, and I personally don’t feel the need for a GUI at all. Seriously, I can’t think of any reason for it other than to avoid spending 5 minutes reading the manual about the two commands you need to use once.

Note that the GUIs (both the current basic one and the beta web GUI) execute the command-line version to actually perform backups. The command-line version did not change; it’s stable and mature, so there is no reason to wait.

You should then be able to import your configuration into the stable GUI when it is released, but I’m sure you will be just fine with the command line, since by then it will all just work.


Backing up one directory to 2 storage locations isn’t supported by the old GUI, but it is possible with the new web-based GUI; see Duplicacy Web Edition 0.2.10 Beta is now available


While I could have written more or less exactly the same post (though you seem to have tried even more backup tools than I have), I don’t think the above sentence is fair, or even correct. It’s not a matter of five minutes. And not 50 either. Perhaps five hours, if you resist the temptation to experiment and optimise things (or to understand how duplicacy works), and if you’re either good at scripting or happy with the basic features of duplicacy (e.g. no logs, except pruning logs). I’d still say those five hours are probably worth it, though.

Can you confirm this, @gchen? I was under a different impression but would welcome this, of course.

Thank you, finally a useful answer.
Three questions:

  1. Is there an ETA on the release of the WebGUI?
  2. Will it support scheduling the local vs remote at different intervals?
  3. Right now I have two separate backup jobs working (1 local, 1 cloud) by using a proxy directory as noted above. Will the web GUI be able to adopt the cloud storage into a single backup job, or will I have to re-upload? It takes ~4 days for my initial backup to the cloud, so I don’t want to have to repeat that later.

The web-ui version is even more closely based on the CLI version than the GUI version was. Basically, the web-ui version is “only” a fancy front-end for telling the CLI version what to do. So, the simple answer is: whatever you can do with the CLI version, you can do with the web-ui version. - Except, of course, that you may have to resort to the command line if the web-ui doesn’t yet support a particular setup. I don’t know what will or will not be supported by the first web-ui release but you might also find some answers in previous discussions about #multi-storage setups and #web-ui.

@gchen, perhaps you could create a matrix with all CLI commands and options, indicating whether they will be available in the first release of the web-ui?

Yes, not only that, but if you set up duplicacy on a different computer, it will also be able to backup to the same storage. Or am I misunderstanding your question?

No, I don’t plan to implement this feature for the first release. Again, since this is of one-time use I would assign a low priority to it.


I think you are misunderstanding the question. Right now I have 2 backup jobs acting on the same source, one to a local server and one to the cloud. To save the week of uploading time already invested, I wanted to know if the “copy” command (either in the CLI or the web GUI) would reuse all the chunks already in the cloud, adopting them without having to re-upload the entire job.

That’s exactly what copy does, so you’re safe here.

So if I have a full backup in the cloud, from a different job, copy will recognize that the files are the same and not re-upload everything from the local backup?

I think I’m going to have to test that out. I don’t understand the way chunks are broken up and created, but would two jobs actually create the exact same chunks?
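For what it’s worth, as I understand it: chunk boundaries and chunk names depend on the chunking parameters and keys stored in each storage’s config, so two independently initialized storages generally will not produce the same chunks even for identical files. Only a storage added as copy-compatible shares that config, which is what lets `copy` skip chunks the destination already has. A sketch of that setup, names and URLs invented:

```shell
# Add the second storage as copy-compatible with the first;
# -bit-identical additionally makes the chunk files byte-identical
duplicacy add -copy default -bit-identical b2 mydocs b2://my-bucket

# copy then uploads only the chunks the destination is missing
duplicacy copy -from default -to b2
```

So whether your existing cloud backup can be adopted likely hinges on whether the two storages happen to be copy-compatible; testing it as you suggest is the right call.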