Hi,
after spending the last few weeks reading about alternatives to my longtime Duplicati backup setup, I have decided to migrate to Duplicacy.
I had a look at a few other solutions:
- borgbackup - only supports SSH backends
- restic - I like it, but there is/was no web UI, and autorestic did not work for me as a good CLI alternative to a GUI
- relica - a fork of restic with a very nice GUI, but the author wants to sell it, which does not sound very promising
- kopia - very promising, but the web UI just "sucks"; it could get better, but who knows when
So here I am, and after testing everything I will use and pay for the GUI version. I like the idea of always having the open-source CLI version on hand to restore or even back up.
Anyhow, I am moving away from Duplicati because of the same issue I have hit multiple times already; it has made me rethink my backup jobs a few times. I have been running Duplicati for almost 5 years now, quite a long time; at the beginning I used it simply for PC backups.
The problem was always the same: for whatever reason (mostly a flaky connection, a reboot, or a service crash) the backup was interrupted while running. This breaks the local database; sometimes a repair is possible, but sometimes it simply does not work. I face this problem quite often, right now again with a 2TB backup set (out of my 4TB total), and rebuilding the database from the chunks on the destination takes forever: 2TB with bi-daily backups for more than a year, a database of around 7GB, and an Internet connection that is not the fastest.
Anyhow, are these things avoidable with Duplicacy? In other words, if I or something else stops a backup mid-run, can that break the backup? I have read a lot in this forum over the last few days, and there are not many threads about broken backups (compared to Duplicati at least). I also found one how-to on fixing a broken backup by redoing the backup with a different ID so that the missing chunks come back, or by deleting chunks manually. This sounds quite promising, or at least better than what I experienced with Duplicati.
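If I understood that how-to correctly, the recovery boils down to a check plus one of two fixes. Here is a very rough sketch of how I picture it (the repository path, snapshot ID "alice", and revision 42 are made up, and I may well have the procedure wrong, so corrections welcome):

```python
import subprocess

repo = "/srv/nextcloud-backup/alice"   # made-up repository path

# Let check report which snapshots reference chunks that are missing on the storage.
result = subprocess.run(["duplicacy", "check", "-all"], cwd=repo)

if result.returncode != 0:
    # Fix 1: back up the same files again from a repository initialised with a
    # *different* snapshot ID against the same storage, so the missing chunks get
    # uploaded once more (that second repository is not shown here).
    #
    # Fix 2: if the data is gone for good, drop the broken revision instead
    # (revision 42 of snapshot ID "alice" is only a placeholder):
    subprocess.run(["duplicacy", "prune", "-id", "alice", "-r", "42", "-exclusive"],
                   cwd=repo, check=True)
```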
Secondly, I have split my backup sets into multiple parts, for a few reasons:
- I back up my Nextcloud, and because of the multiple users I try to isolate the backups of my friends from the backups of my family members.
- If a smaller backup set breaks, it does not break the others (Duplicati style).
- I back up the data to a friend's home and to B2.
- I can be more flexible with timing, because Duplicati cannot run backups in parallel (which I can do with Duplicacy).
Anyhow, we are talking about 15 different users, which leads to 30 backup jobs with Duplicacy: every user gets backed up to an SFTP destination and to a B2 destination.
This is going to create a loooooong list of backups; is that a problem?
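To make it concrete, this is roughly how I imagine scripting the setup: one repository per user, with the SFTP storage added at init time and B2 attached as a second storage. Every name in here (users, paths, host, bucket, the storage labels "sftp" and "b2") is made up for illustration, so please treat it as a sketch, not a plan I am sure about:

```python
#!/usr/bin/env python3
"""Setup sketch: one Duplicacy repository per user, each pointing at two storages."""
import subprocess
from pathlib import Path

USERS = ["alice", "bob"]                     # ... up to all 15 users
REPO_ROOT = Path("/srv/nextcloud-backup")    # made-up: one repository directory per user
SFTP_URL = "sftp://backup@friends-box.example/duplicacy-storage"  # made-up SFTP storage
B2_URL = "b2://my-backup-bucket"                                  # made-up B2 bucket

def run(cmd, cwd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, cwd=cwd, check=True)

for user in USERS:
    repo = REPO_ROOT / user
    # First storage: initialise the repository; the snapshot ID is the user name,
    # -e turns on encryption, -storage-name labels this storage "sftp".
    # (duplicacy will prompt for the storage password and credentials here.)
    run(["duplicacy", "init", "-e", "-storage-name", "sftp", user, SFTP_URL], cwd=repo)
    # Second storage: attach B2 to the same repository under the label "b2".
    # (Adding "-copy", "sftp", "-bit-identical" should make the two storages
    # copy-compatible, if I read the docs right.)
    run(["duplicacy", "add", "-e", "b2", user, B2_URL], cwd=repo)
```

One thing I am hoping for with a single shared storage per destination is deduplication across users; if I read the docs correctly, identical files from different snapshot IDs should only be stored once.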
Anyhow, that is some background. To be honest, I would be happy about any advice: how to organize the backups, the prunes, the checks, and so on.
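For the backup/prune/check side, this is what I have sketched so far: a nightly driver that loops over all users and both storages. The retention numbers are pure placeholders, not a recommendation, and the layout matches the made-up setup sketch above:

```python
#!/usr/bin/env python3
"""Nightly driver sketch: back up every user to both storages, then prune and check."""
import subprocess
from pathlib import Path

USERS = ["alice", "bob"]                    # ... all 15 users
REPO_ROOT = Path("/srv/nextcloud-backup")   # same made-up layout as the setup sketch
STORAGES = ["sftp", "b2"]
# Placeholder retention: drop everything older than a year, keep one revision per
# 30 days beyond 90 days, per 7 days beyond 30 days, per day beyond 7 days.
KEEP = ["-keep", "0:360", "-keep", "30:90", "-keep", "7:30", "-keep", "1:7"]

def duplicacy(repo, *args):
    subprocess.run(["duplicacy", *args], cwd=repo, check=True)

for user in USERS:
    repo = REPO_ROOT / user
    for storage in STORAGES:
        # Back up this user's repository to the given storage.
        duplicacy(repo, "backup", "-storage", storage, "-threads", "4")
        # Thin out old revisions according to the -keep rules above.
        duplicacy(repo, "prune", "-storage", storage, *KEEP)
        # Verify that every chunk referenced by the remaining snapshots still exists
        # (-chunks would do a deeper download-and-verify pass, but is much slower).
        duplicacy(repo, "check", "-storage", storage)
```

Does the order (backup, then prune, then check, per storage) make sense, or would you schedule prune and check separately, e.g. weekly?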