Google drive with service account

I am currently backing up my repositories to a local HDD. I’d now like to use the duplicacy copy command to copy the snapshots to Google Drive. However, using the duplicacy-owned Google app to control access is a total non-starter for me, so I’d like to use service account credentials from my own Google Cloud project.

I’ve already checked here

I’ve also already checked here

In both of those posts you mention adding the service account email to the share list, but I’m running into the same problem as the user in the first post: the snapshots get uploaded to the service account’s Google Drive, not mine. Here is what I did to set things up:

  1. Created my own Google application and service account by going to the link in the storage-backends article. (I can only post two links as a new user, otherwise I’d include it here.)
  2. Downloaded the service account JSON.
  3. In my own Google Drive account (my personal, non-G Suite account), I created a folder named “DuplicacyBackups”.
  4. I shared this DuplicacyBackups folder with the service account email, in this case duplicacy@duplicacy-backups.iam.gserviceaccount.com. This works just fine and Google finds the service account.
  5. I initialized this storage by running “duplicacy add -copy default google-drive testsnapshot gcd://DuplicacyBackups”.
  6. When prompted for the JSON file, I entered the path to the service account file I downloaded earlier. It was accepted without errors.
  7. I ran the backup using “duplicacy backup -storage google-drive”.

The backup runs successfully, and if I run “duplicacy list -storage google-drive” it returns the expected snapshot data.

BUT…

There are absolutely zero files in my “DuplicacyBackups” folder on my personal drive. I can restore snapshots from Google Drive and back up more revisions; it all works, but nothing shows up in my own Google Drive. After looking around, it appears all the backup files were created in the service account’s Google Drive. Even though I shared my personal “DuplicacyBackups” folder with the service account, the gcd://DuplicacyBackups path did not refer to the folder I created in my personal drive, but to a folder that was automatically created in the service account’s drive.

Note that Google Drive changed the way it handles shared folders, in mid-2020 I believe. If someone shares a folder with me (or with a service account), that folder no longer gets added to my drive as a regular folder. It now appears as a shortcut, a subtle but real difference from how it used to work.

I’m wondering if anyone has recently used a service account in a non-G Suite/Workspace environment. Is it possible these new “shortcut” folder shares have broken Duplicacy’s ability to use a service account without the domain-wide delegation that only Workspace users have, or am I just doing something obviously wrong?


I have come to the conclusion that you cannot use a service account to back up to Google Drive unless you are a G Suite/Workspace user who can use the domain-wide delegation feature. Domain-wide delegation allows a service account to impersonate users and put files in their drives, but only for users within your organization, which requires Google Workspace. This process is described here

If you don’t have Workspace and try to use a service account, the files get put in the service account’s own Google Drive (accessible only via the API, with no web GUI) rather than your own. It’s possible that sharing a folder with the service account used to work, but I couldn’t get it to. I wrote my own .NET console app to do some testing and couldn’t get it to work there either. Maybe it can be done; I just don’t want to fiddle with it right now.

Since the duplicacy CLI reads Google credentials from a JSON file, and I’m already doing some PowerShell scripting to orchestrate my backup process, I’m thinking of writing my own code to refresh access tokens via PowerShell. That would let me use the normal OAuth flow with my own account and my own Google app, without relying on the duplicacy-owned Google app for refreshing tokens. Every time before the backup process runs, I would read that same JSON file, call Google’s APIs to refresh the tokens, and update the file. When duplicacy then runs, it would see fresh tokens and wouldn’t need to refresh anything.
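The refresh call itself is just a form POST to Google’s standard token endpoint. Sketched here in Go to match the rest of the thread (my actual script would be PowerShell); the parameter names are the stock OAuth 2.0 refresh_token grant, and the client id/secret values are placeholders:

```go
package main

import (
	"fmt"
	"net/url"
)

// buildRefreshForm builds the form body for Google's token endpoint
// (https://oauth2.googleapis.com/token). clientID and clientSecret come
// from your own Google Cloud app, so no duplicacy.com endpoint is involved.
func buildRefreshForm(clientID, clientSecret, refreshToken string) url.Values {
	return url.Values{
		"grant_type":    {"refresh_token"},
		"client_id":     {clientID},
		"client_secret": {clientSecret},
		"refresh_token": {refreshToken},
	}
}

func main() {
	form := buildRefreshForm("my-client-id", "my-secret", "my-refresh-token")
	fmt.Println(form.Get("grant_type"))
	// POSTing form.Encode() with Content-Type
	// application/x-www-form-urlencoded returns a JSON body containing
	// access_token and expires_in; write those back into the credential
	// file before invoking duplicacy.
}
```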

I’m not sure if that will work; I’ll have to experiment. That functionality could be integrated directly into the CLI, i.e. using your own Google app and keys, but I’m a .NET guy so it would take me a while to get the Go code right.

If I get everything working I will write a longer article about my experience, with how-tos, in case it’s helpful for others.


I can confirm that sharing the directory with the service account email doesn’t work anymore. Instead, a new directory in the service account’s own drive space is created.

But it looks like the shared directory is still there – it just doesn’t have the root directory as the parent so when Duplicacy lists the root directory no shared directories are returned.

I’ll make a simple tweak to fix it.


That would be awesome, thank you! Now I wish I had fiddled a bit longer to find a solution instead of making you do the work. I appreciate the help!

Here is the fix: Find the storage path in shared folders first when connecting to Goog… · gilbertchen/duplicacy@e43e848 · GitHub

Hmm, I grabbed the master branch, rebuilt the exe, repeated my steps, and I still don’t see any files in the shared folder I created. To be fair, I’ve never built a Go project before, and I did it using an Azure DevOps pipeline so I didn’t need to install anything locally. That said, the pipeline outputs the source code used during the build, and these new changes were included there. The entire backup process also completed successfully, so I think the build was fine; I just still don’t see anything in the shared folder.

I’m happy to provide more details to help reproduce this, such as the service account JSON file and screenshots of my share settings, if that would be useful.

BTW, I did find a different workaround for this. The JSON token Duplicacy generates for Google Drive doesn’t include the client secret, and its end_point parameters point to a duplicacy.com address; your code sets the client secret anyway. I found that I can create my own Google app with a desktop client type instead of a web application, update the JSON token with the client id and secret from that desktop client, set the end_point URLs to Google’s well-known OAuth endpoints, and use a PowerShell script to generate my own initial refresh token. Everything seems to work that way. This lets me use my own Google Cloud app without any external dependency for refreshing tokens. I’d still prefer a service account, but this is a good compromise.

You can add a log line to print the id of the storage path in src/duplicacy_gcdstorage.go like this:

    if isServiceAccount && !strings.Contains(storagePath, "/") {
        storagePathID, err = storage.findSharedFolder(0, storagePath)
        if err != nil {
            LOG_WARN("GCD_STORAGE", "Failed to check if %s is a shared folder: %v", storagePath, err)
        }
        // Print the id for debugging
        LOG_INFO("debug", "storage path id: %s", storagePathID)
    }

Then go to the Google Drive website and check the id in the URL to see if they match.

Thanks for that tip! I finally got my Azure DevOps build pipeline running (I just created a how-to post about it in case people are interested), so I should be able to test a new build easily. I will try that and let you know what happens!

This topic was automatically closed 10 days after the last reply. New replies are no longer allowed.