Backup methodology - Have I got this right?

Hello all,

Firstly, I wanted to thank you all. Whenever I needed information on Duplicacy, this forum has been such a fantastic resource. It’s rare to find such consistently good info in a forum, so thank you to all of you Duplicacy gurus!

Just a bit of background on my setup -

My setup is the GUI version of Duplicacy running in a Docker container on an Unraid server.
That instance of Duplicacy points at a share on the local Unraid server, takes a nightly backup of that data, and uploads it to Google Drive. At the end of the week, a prune command deletes all snapshots apart from the one taken the day before the prune.

The reason I have done this is that I have a 5TB limit on my Google Drive, and I want to ensure that as the data changes I don’t hit that cap.
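For reference, my understanding is that the GUI schedule boils down to roughly the following CLI calls (just a sketch; the exact -keep value is my assumption about how my "keep only yesterday's snapshot" setting maps onto the CLI):

```
# Nightly job: back up the share to the Google Drive storage
duplicacy backup -stats

# Weekly job: -keep 0:1 removes every snapshot older than 1 day,
# which is how I end up with only the snapshot taken the day before the prune
duplicacy prune -keep 0:1
```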

My recent thoughts and fears -

The data I’m backing up is for my partner’s business. The business is fairly new, and I need to ensure the data is safe until the projects are complete. We sit consistently at close to the 2TB mark of backup data on Google Drive.

I have been reading a lot about bitrot over the past few days. I am aware that Unraid does not have a way to combat bitrot, since it uses XFS for its filesystem.

My fear is that I’m backing up successfully every day and testing those backups, but if a file on the share suffers bitrot I’ll simply back up the corrupted version, and once all those precious older snapshots are pruned I’ve got nothing “safe” left to restore from.

My thoughts on a solution -

There is a Dynamix File Integrity plugin for Unraid that will scan and alert if corruption is found. I plan to run it once a week on the share as a scheduled task. This should give me enough time to react and restore data from a known good backup if the worst should happen.

I am then thinking I should change my schedule so that I back up daily for a month and only prune those snapshots once a month. By doing this I feel I give myself the best chance of reacting to, and resolving, any issues that may occur.
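To make that concrete, I think the change would amount to something like this on the CLI (again just a sketch, assuming the GUI exposes the same prune options):

```
# Daily backup stays exactly as it is now
duplicacy backup -stats

# Monthly prune instead of weekly: -keep 0:30 removes snapshots older than 30 days,
# so I always have up to a month of history to fall back on
duplicacy prune -keep 0:30

# Periodic check that every snapshot still references valid chunks in the storage
duplicacy check
```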

Things I’m not sure about -

Is this really the best way of doing this?
Is there a more efficient and safer way to ensure I don’t hit the 5TB limit on Google Drive while still having copies to restore from in the event of bitrot?

Should I be keeping the snapshots from the month just gone as a failsafe, in case something happens right at the point of the prune and I have nothing older to restore from?

If I should be keeping more snapshots, how do I avoid accumulating data beyond the hard 5TB cap I have on Google Drive?

Final thoughts -

Thank you for reading this all the way through, and sorry if it’s rambly and unclear. I have a tendency to overthink things, but I just want to ensure my partner’s business data is not lost.