Reinstall duplicacy web, how to restore jobs

I had to reinstall the operating system, and duplicacy web edition. I have the original .duplicacy-web folder. How do I restore all the configurations and jobs without breaking it?

There are a few threads about this, but each one recommends something a little different from the others.

One of the threads says to simply recreate the storage and the backup through the web UI, making sure to use the same snapshot ID; then you don’t need the original folder. But that would mean you also have to recreate the jobs, and I don’t remember the settings I used for pruning.

Another thread said that after reinstalling, you recreate the storage and backup, and then copy the contents of the original folder to the new installation.

Another thread mentioned that if encryption was enabled for the web UI, you go into the JSON and delete the encryption field.
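For what it’s worth, that edit could be scripted as a small sketch. Note this is an assumption-heavy example: the exact field name (taken literally as "encryption" here, per that thread) and the file it lives in may differ in your version, so check the JSON before deleting anything.

```python
import json
from pathlib import Path

def drop_field(path: Path, field: str = "encryption") -> None:
    """Remove a top-level field from a JSON settings file, if present.

    The field name "encryption" is an assumption based on the advice in
    that thread -- verify against your actual file first.
    """
    cfg = json.loads(path.read_text())
    cfg.pop(field, None)  # no error if the field is absent
    path.write_text(json.dumps(cfg, indent=4))
```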

What’s the benefit or disadvantage of simply letting duplicacy rebuild the folder for you on the first run after you have recreated the storage and backup, versus copying the contents of the original folder to the new installation?

Was there another folder that I needed to back up in order to keep the jobs? Something like %localappdata%\DuplicacyWebEdition, which I don’t have?

Copy duplicacy.json and settings.json from the old folder to the new one. If you want to preserve stats, also copy the stats folder. Restart duplicacy and [re]activate it.
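The copy step above could be scripted roughly like this. The paths are assumptions (on Linux/macOS the folder is typically ~/.duplicacy-web); adjust them to wherever your old and new folders actually live.

```python
import shutil
from pathlib import Path

def restore_config(old: Path, new: Path) -> None:
    """Copy the configuration from an old .duplicacy-web folder into a new one.

    Copies duplicacy.json and settings.json (which hold the storages, jobs,
    and schedules) and, optionally, the stats folder to preserve statistics.
    """
    for name in ("duplicacy.json", "settings.json"):
        shutil.copy2(old / name, new / name)
    stats = old / "stats"
    if stats.is_dir():
        shutil.copytree(stats, new / "stats", dirs_exist_ok=True)

# Example call (paths are hypothetical -- adjust for your system):
# restore_config(Path("D:/old-backup/.duplicacy-web"), Path.home() / ".duplicacy-web")
```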

You may need to delete and re-add storage locations; use the same storage names as before so that the schedules continue working.

I’m not sure about the benefit, but the clear disadvantage is needing to configure things again that you have already configured.

Everything is contained in those two json files.

Thank you. Should I also copy repositories\localhost from the original folder to the new installation, or let it rebuild itself? Is there any difference in integrity, speed, or charges for retrieving the data from the B2 storage?

I’m wondering whether the contents of the chunks folder are recreated every time you run the backup, or whether the same data is reused if the files referenced by the chunks have not changed.

So if the chunks folder does not exist when you run the backup, will a copy of the chunks be retrieved from the remote storage, or will they be recreated locally?

I just let it rebuild those folders and it seems fine now.

When running the backups to the same storage for the first time after the reinstall, and without copying over the old cache folders, should I have run each backup one at a time? I ran some of the backups in parallel without waiting for the earlier ones to finish. Or does that not make a difference to the risk of integrity issues?

The local cache is disposable; you can delete it at any time. It just helps save egress from the target, but the target is always the source of truth.
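Since the cache is disposable, clearing it is just a directory delete. A sketch of that, assuming the repositories\localhost layout mentioned earlier in the thread (the exact cache path is an assumption; verify it on your install before deleting anything):

```python
import shutil
from pathlib import Path

def clear_cache(duplicacy_web: Path) -> None:
    """Delete the disposable local caches under a .duplicacy-web folder.

    Duplicacy re-fetches any chunks it needs from the storage, which
    remains the source of truth. The glob below assumes caches live at
    repositories/localhost/<n>/.duplicacy/cache -- check before running.
    """
    for cache in duplicacy_web.glob("repositories/localhost/*/.duplicacy/cache"):
        shutil.rmtree(cache)
```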

You can run everything concurrently. That’s one of the benefits of duplicacy: a fully concurrent, lockless design.

Thank you for the explanation. Just for reference for other people with the same question, here is what I did:

  1. Reinstall the web edition and run it
  2. Create a local encryption password (it doesn’t need to be the same as before)
  3. Create the storage using the same location/bucket and name, and the same storage encryption password if one was set when it was originally created
  4. Enter your license (click on the trial link to get to the dialog box)
  5. Close duplicacy-web
  6. Optionally create a symlink
  7. Open the new duplicacy.json and copy the repositories and schedules fields over from the old file
  8. Open the new settings.json and copy over anything you might need from the old file; if the new file doesn’t exist yet, run duplicacy web, go to the settings tab, and click save to create it
  9. Run duplicacy web and run the backups for the first time
  10. Run checks and other schedules if desired
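Step 7 above could be sketched like this. It assumes duplicacy.json is plain JSON on disk and that the top-level fields are literally named "repositories" and "schedules", as described in the steps; only those fields are copied, so the freshly generated fields (such as the new encryption keys) stay untouched.

```python
import json
from pathlib import Path

def merge_jobs(old_file: Path, new_file: Path,
               fields=("repositories", "schedules")) -> None:
    """Copy selected top-level fields from the old duplicacy.json into the
    new one, leaving every other field of the new file as generated."""
    old_cfg = json.loads(old_file.read_text())
    new_cfg = json.loads(new_file.read_text())
    for field in fields:
        if field in old_cfg:
            new_cfg[field] = old_cfg[field]
    new_file.write_text(json.dumps(new_cfg, indent=4))
```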

I’m not sure what happens if you simply paste the old duplicacy.json file directly into the folder, since it contains fields with the old encryption keys.