So what about unlimited storage?

Continuing the discussion from Which folders in ProgramData and AppData should be backed up?:

Whaat?


So, uhm, you’re talking about this:

[screenshot]

Right?

So does that mean I just need to find five buddies to share a G Suite account, and :boom: we all get unlimited storage?

Well, I don’t want to brag or anything, buuuuuuut…

[screenshot]

You need to have a “business”/organisation (basically a domain) and 5 licensed users for that “business”, which costs 96 EUR/license/year (+VAT), as per: Compare G Suite billing plans - G Suite Administrator Help

Unsure how unlimited “unlimited” is, since I’ve never used more than 7 TB (currently at 3.5), but I’ve not heard people complaining about it, and in 2+ years of usage I haven’t had any problems with someone from Google saying “oi mate, you’re wastin’ my drives”.

References:
- ACD
- MST
- and always this

Yes, so to put the full figure on the table, we are talking about roughly 600 EUR per year, or 50 EUR per month. I don’t really want to pay more than 10 EUR per month for my backups, hence: ~~five~~ four buddies and myself. (Of course, G Suite gives you more than just storage, but I’m not sure I’m interested in anything else.)
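For anyone checking the maths, here is a rough back-of-the-envelope; the VAT rate is an assumed placeholder and varies by country:

```python
# Cost split for a shared G Suite Business subscription, using the figures
# from this thread. The VAT rate is an assumption, not an official number.
licence_eur_year = 96   # per licence per year, excl. VAT
licences = 5            # minimum users for unlimited storage
vat = 0.21              # hypothetical VAT rate

total = licences * licence_eur_year * (1 + vat)
print(f"total: ~{total:.0f} EUR/year (~{total / 12:.0f} EUR/month)")
print(f"per person: ~{total / licences / 12:.2f} EUR/month")
# -> total: ~581 EUR/year (~48 EUR/month), per person: ~9.68 EUR/month
```

So with five people splitting the bill, each lands just under the 10 EUR/month target.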

Nice. I wasn’t aware of that option.

But given that all other unlimited storage plans (OneDrive, Amazon, …) did not survive long (or are very slow, like Jottacloud), I wonder how long this one is going to last… Maybe their advantage is that it’s not attractive to individual consumers.

Also nice: you can even choose where your data is stored: G Suite Updates Blog: Choose the regions where your data is stored

I read somewhere that the “catch” is the upload rate: you cannot upload more than a certain amount of data per day. I think it is 750 GB/day, but I’m not 100% sure; something around that amount, though. That is likely not a problem for an individual user.
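If that figure is right, whether the cap matters mostly depends on your upstream bandwidth. A quick sanity check (the 750 GB/day number is hearsay from this thread, and the bandwidth figures are just example values):

```python
# How much data a connection can push per day at full tilt,
# versus a ~750 GB/day upload cap.
DAILY_CAP_GB = 750

def max_daily_upload_gb(upstream_mbit_s: float) -> float:
    # Mbit/s -> MB/s -> MB/day -> GB/day
    return upstream_mbit_s / 8 * 86_400 / 1_000

for mbit in (18, 100, 1_000):
    gb = max_daily_upload_gb(mbit)
    note = "cap is the bottleneck" if gb > DAILY_CAP_GB else "line is the bottleneck"
    print(f"{mbit:>5} Mbit/s -> ~{gb:,.0f} GB/day ({note})")
# 18 Mbit/s -> ~194 GB/day; only from roughly 70 Mbit/s upward does the cap bite
```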

The problem, however, would be actually utilizing that storage effectively for backup purposes. While I haven’t tried Duplicacy with Google Drive per se, most other backup tools, notably Synology’s HyperBackup, struggle with Google Drive and OneDrive (perhaps due to API limitations when enumerating large numbers of objects). The reliability and speed do not inspire confidence. For contrast, I did not have any issues with e.g. Amazon Drive.
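To put a rough number on that enumeration hunch: Drive’s listing API is paginated (files.list in the v3 API caps out at 1000 items per page), so a store with millions of chunk files needs thousands of sequential round-trips just to list everything. The latency and chunk count below are made-up illustrative figures:

```python
# Back-of-the-envelope cost of enumerating a large chunk store
# over a paginated API. All figures are assumptions, not measurements.
chunk_files = 2_000_000   # hypothetical number of objects in the store
page_size = 1_000         # Drive API v3 files.list maximum pageSize
latency_s = 0.4           # assumed round-trip time per list request

requests = chunk_files / page_size
print(f"{requests:,.0f} sequential requests -> "
      f"~{requests * latency_s / 60:.0f} min just to list")
# -> 2,000 sequential requests -> ~13 min just to list
```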


Just to chime in with my experience… G Suite Business is working excellently for me with Duplicacy and about 1 TB of data combined. This is directly with the API, no file-streaming software.

I back up some repos to my local NAS and then have Duplicacy copy that to G Suite, along with several other direct NAS -> G Suite Duplicacy backups to the same storage. The speed is fine (since the NAS does it in the background), and I’m only on 18 Mbit/s upstream anyway. Also using 7 TB total, with some rclone’d stuff on there.

Probably shouldn’t say this (first rule of data club an’ all), but screw it… they may or may not enforce the 5-user requirement. Regarding the domain name requirement: if you already have a domain name, you can just set up a subdomain for this; it’s really just to set up Gmail for the primary user account.
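For anyone wanting to replicate that layout, here is a minimal sketch of the flow, driving the Duplicacy CLI from a script. The snapshot ID, paths, and gcd:// URL are placeholders, and the one-time init/add setup is shown as comments; double-check the flags against your Duplicacy version:

```python
# Sketch of "backup to local NAS, then duplicacy copy to G Suite".
# Assumes a repository already initialised along these lines:
#   duplicacy init my-docs /mnt/nas/duplicacy                  # local NAS storage
#   duplicacy add -copy default offsite my-docs gcd://backups  # copy-compatible GDrive storage
import subprocess

def duplicacy(*args: str) -> None:
    subprocess.run(["duplicacy", *args], check=True)

duplicacy("backup", "-stats")                      # repo -> NAS
duplicacy("copy", "-from", "default",
          "-to", "offsite", "-threads", "4")       # NAS -> Google Drive
```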


My experience with the Google API is mostly good; the only bad things I could say are:

  1. The rate limiting hits hard (e.g. on a 1 Gbps connection).
  2. Their API responses are a tad slow (prune deletes about 2 items per second; see the quick calculation below).
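To put item 2 in perspective (the chunk count is a made-up example figure):

```python
# What ~2 deletions/second means for pruning a big storage.
chunks_to_delete = 100_000   # hypothetical number of unreferenced chunks
rate_per_s = 2               # deletion rate observed above
hours = chunks_to_delete / rate_per_s / 3600
print(f"~{hours:.0f} hours to prune {chunks_to_delete:,} chunks")  # -> ~14 hours
```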

Besides that, I have had almost no problems with the service, and the support folks were quite fast (quite a few emails and a few phone calls), albeit less helpful with my specific problem.


Yeah, it’s the same deal with rclone: according to the docs, it’s throttled to less than 2 files per second there as well, which I presume means any file operation. (So I’m not sure how a multi-threaded prune can help there, as the limit is on Google’s side, and it makes sense to self-throttle to avoid being temporarily locked out.)

What I imagine is that the throttling is per connection.
Since Duplicacy opens a connection per thread, having multiple threads (and hence multiple connections) should help.
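A toy model of that hypothesis: if each connection is independently capped at ~2 operations/second, N parallel connections should manage roughly 2*N operations/second in aggregate. This is purely an illustration of the reasoning, not a claim about how Google’s limiter actually behaves:

```python
# Toy simulation: each "connection" self-throttles to a per-connection cap,
# so the aggregate rate scales with the number of connections (threads).
import threading, time

OPS_PER_SEC_PER_CONN = 2     # assumed per-connection cap
CONNECTIONS = 4
DURATION_S = 3
counter_lock = threading.Lock()
ops_done = 0

def connection_worker() -> None:
    global ops_done
    deadline = time.monotonic() + DURATION_S
    while time.monotonic() < deadline:
        time.sleep(1 / OPS_PER_SEC_PER_CONN)   # self-throttle to the cap
        with counter_lock:
            ops_done += 1                      # one simulated API operation

threads = [threading.Thread(target=connection_worker) for _ in range(CONNECTIONS)]
for t in threads: t.start()
for t in threads: t.join()
print(f"{ops_done} ops in {DURATION_S}s across {CONNECTIONS} connections "
      f"(~{ops_done / DURATION_S:.1f}/s vs {OPS_PER_SEC_PER_CONN}/s single-threaded)")
```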