Use of rsync/clone for the first copy of a backup

Hi,

I have been running Duplicacy for a while with GDrive as a storage. I have now added a new, remotely located SFTP storage and would like to use it as a copy. I understand the normal procedure would be to use the copy functionality, but that is not ideal in my case: my upload is quite limited, and using copy means going through that upload bottleneck again. Ideally, I would like a direct transfer from GDrive to the SFTP storage using rclone, without going through my home network. Is that possible?

I created both storages from the web GUI.

You can use rclone to copy from GDrive to SFTP. That would give you two identical storages. With a Duplicacy copy job, on the other hand, you can encrypt the two storages with different passwords, or encrypt one while leaving the other unencrypted.
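
For example, assuming you've already set up rclone remotes for both ends (the remote names and paths here are placeholders for whatever you configured):

```
# Placeholder remote names; both must already exist in rclone's config.
# This clones the files as-is, so the result is a byte-for-byte copy of the GDrive storage:
rclone copy gdrive:duplicacy-storage pi-sftp:/mnt/backup/duplicacy-storage --progress
```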

You won’t be able to copy Google Drive directly to SFTP without using your own internet connection. Rclone will do it, but it’ll be transferring the data down and then back up. The same goes for Duplicacy’s copy command.
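
For reference, the Duplicacy equivalent would be something like this (storage names are placeholders for whatever you called them in the GUI):

```
# Data still flows GDrive -> your machine -> SFTP with this approach:
duplicacy copy -from gdrive -to sftp -threads 4
```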

Your only real alternative would be to employ another intermediary computer - say, a VPS or even a (Google) cloud server - to do the copying.
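
A rough sketch of that approach, assuming rclone is installed and configured on the intermediary (the hostname and remote names are placeholders):

```
# Run the copy on the VPS itself, so traffic flows GDrive -> VPS -> SFTP
# and never touches your home connection:
ssh user@my-vps 'rclone copy gdrive:duplicacy-storage pi-sftp:/mnt/backup/duplicacy-storage'
```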

(Honestly, I haven’t looked too deeply into the cloud server options, but I’ve heard of people using free-tier Google servers when interfacing only with Google storage. Do some research if you want to explore this route, because getting it wrong could mean massive unexpected bills for bandwidth usage. 🙂 )

Personally, I’d always advise keeping at least a local backup, so perhaps scrap the SFTP and buy a sufficiently large external HDD(?). Then you could back up to the HDD -> copy to Google Drive. Or you could go the cheap NAS route.

Thanks, I’ve started the transfer. I can run it directly from Google Drive to the SFTP storage, since the SFTP server is just a Raspberry Pi with a hard drive at another location.
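
For the record, I’m running something like this on the Pi itself (the remote name and path are my placeholders here):

```
# Running on the Pi, so the data goes straight from Google Drive to the Pi's disk:
rclone copy gdrive:duplicacy-storage /mnt/backup/duplicacy-storage --progress
```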

I decided not to keep a local backup: what I am backing up is my NAS, so I am trading cost against speed of recovery. A partial restore would not be a problem, and I defined my backups according to how urgently I would need to recover the data. Even the least urgent would take roughly a week to download, or a day if I were to travel and fetch the Pi from where it is. Since my data is on RAID 6, I decided the risk of ever needing a full restore is really low.

Ah yes, in that case, with remote access to the Pi, you can just pull the data down from Google Drive - as you seem to be doing…

Another way to speed things up would be to ‘pre-populate’ the storage attached to the Pi: bring the Pi to your local computer, perform backups to the Pi’s (now copy-compatible) storage using a dummy backup ID, return the Pi, and then continue copying from Google Drive to the Pi.
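
With the CLI that would look roughly like this (the storage names, dummy ID, and paths are all placeholders; I believe the web GUI exposes the same copy-compatible option when adding a storage):

```
# While the Pi is on your LAN, add its storage as copy-compatible with the GDrive one:
duplicacy add -copy gdrive pi-sftp dummy-id sftp://pi@pi.local//mnt/backup/duplicacy-storage
# Seed the chunks over the LAN under the throwaway backup ID:
duplicacy backup -storage pi-sftp
# After returning the Pi to its remote site, only the missing chunks get transferred:
duplicacy copy -from gdrive -to pi-sftp
```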