How to migrate (copy) backups to a new storage using Web UI?

I’m trying to migrate from one storage provider to another (Google Drive to Storj).

How would you accomplish this in the Web UI? We have these original instructions for the CLI:
https://forum.duplicacy.com/t/back-up-to-multiple-storages/1075

But I’m getting stuck with how to translate those CLI commands into the Web UI.
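For context, the CLI procedure in that thread boils down to roughly the following (the storage names, snapshot ID, and Storj URL here are placeholders of mine, not values from the thread):

```shell
# Add a second storage that is copy-compatible with the existing one.
# -bit-identical makes chunks byte-for-byte identical between storages,
# so they can later be copied verbatim by a third-party tool.
duplicacy add -copy default -bit-identical new my-snapshot-id \
    storj://satellite.example/bucket/duplicacy

# Copy existing revisions from the old storage to the new one.
duplicacy copy -from default -to new
```

The Web UI question below is essentially how to express these two commands through the GUI.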

I started down this path but I keep getting tripped up because some of the terminology doesn’t translate precisely from the CLI commands.

I’m pretty sure this is wrong, but this is what I have so far:

  1. Have your original storage and backups already up and running (ID: old)
  2. Create a new storage at the new cloud provider (ID: new) and make it copy-compatible with old
  3. Create a second new storage, using the exact same details as #2, with the only difference being that it has a different ID (ID: new_dummy)
  4. Go to Schedule. In the existing schedule for old, add a new job: copy from old to new, with the additional option -bit-identical. I don’t think I want this… won’t it cause a full download from old to my machine, which then gets repackaged and re-uploaded to new? I don’t want that to happen, per the linked thread above.
  5. Run that schedule for old and wait for it to finish
  6. Create a new schedule for new_dummy. Give it a backup job and wait for it to finish.
  7. Create another new schedule and give it a copy job, from old to new, with -bit-identical. This is the exact same as Step #4, right? This seems obviously wrong.

Basically I think I have Steps #1-3 correct, but then I get stuck in loops in my head after that.

Anyone able to translate the correct procedure to the Web UI?

This is too complicated.

I would do this:

  1. Copy (sync forward) the Duplicacy data using any other tool, such as Transmit, Cyberduck, or Rclone. This will take a while, but backups can continue to the original location in the meantime. You can even do this from a cloud instance to avoid downloading and uploading the data through your own connection; Oracle’s Always Free compute tier provides 10TB of free bandwidth.
  2. Stop backups, and disable schedule.
  3. Do another sync forward to pick up the chunks Duplicacy added since the copy in step 1 started.
  4. Delete the old storage location in the Web GUI.
  5. Add a new storage location in the Web GUI, with the same name, pointing at the new provider.
  6. Re-enable the schedule.

Done.
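Step 1 above can be sketched with Rclone like this (the remote names and path are my assumptions; configure the remotes first with `rclone config`):

```shell
# Bulk transfer: make the Storj copy match the Google Drive source.
# "gdrive:" and "storj:" are example remote names; adjust the path
# to wherever your Duplicacy storage actually lives.
rclone sync gdrive:duplicacy-backups storj:duplicacy-backups --progress
```

Backups can keep running against Google Drive during this pass; the later catch-up sync picks up whatever Duplicacy adds in the meantime.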

If I understand correctly - I spin up a free-tier Oracle compute node, install one of those apps (probably Rclone in this instance), use it to connect to both GDrive and Storj, and then copy the entire Duplicacy directory over from GDrive?

Yep, this will transfer the bulk of the data without pumping it through your ISP line.

Then the last “catch-up” sync, done after stopping all backups, can be run from anywhere, since it will only be a small amount of data.
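Assuming the same hypothetical remote names as the bulk transfer, the catch-up pass is just the same sync run again; rclone only transfers files added or changed since the first pass. A verification step is optional:

```shell
# With backups stopped, re-run the sync to pick up new chunks.
rclone sync gdrive:duplicacy-backups storj:duplicacy-backups --progress

# Optional: confirm source and destination now match.
rclone check gdrive:duplicacy-backups storj:duplicacy-backups
```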