Authenticated calls to cloud functions with Python

Over the past few weeks, I developed and deployed a cloud function that is supposed to be called only by authorized users/service accounts, and the truth is that the documentation I found wasn’t really helpful.

First, I created a service account and gave it the roles/cloudfunctions.invoker role. If you’re dealing with an exported service account key file, the code is pretty simple:

```python
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

keypath = '.../my_sa_key.json'
base_url = ''  # the cloud function URL
creds = service_account.IDTokenCredentials.from_service_account_file(
    keypath, target_audience=base_url)
authed_session = AuthorizedSession(creds)
# make authenticated request and print the response, status_code
resp = authed_session.get(base_url)
```

Attempt 1

But in my case, the call to the cloud function is going to happen from within a Google service (Dataflow), and I don’t have access to a service account key file. So I tried to find out how I could use target_audience with the default credentials. According to the documentation, we need to generate a token and use it in the headers:

```python
import urllib.request

import google.auth.transport.requests
import google.oauth2.id_token


def make_authorized_get_request(service_url):
    """make_authorized_get_request makes a GET request to the specified HTTP
    endpoint in service_url (must be a complete URL) by authenticating with
    the ID token obtained from the google-auth client library.
    """
    req = urllib.request.Request(service_url)
    auth_req = google.auth.transport.requests.Request()
    id_token = google.oauth2.id_token.fetch_id_token(auth_req, service_url)
    req.add_header("Authorization", f"Bearer {id_token}")
    response = urllib.request.urlopen(req)
    return response
```

That seemed like a good idea, since fetch_id_token will try to find the token from various sources, but in my case it didn’t work. Calling it from Dataflow gave me: module ‘google.oauth2’ has no attribute ‘id_token’.

Attempt 2

So I searched some more, and another solution came from a post on Stack Overflow:

```python
import google.auth
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

base_url = ''  # the cloud function URL
IAM_SCOPE = ''
OAUTH_TOKEN_URI = 'https://www.googleapis.com/oauth2/v4/token'

my_credentials, _ = google.auth.default(scopes=[IAM_SCOPE])
signer_email = my_credentials.service_account_email
signer = my_credentials.signer
creds = service_account.IDTokenCredentials(
    signer, signer_email, token_uri=OAUTH_TOKEN_URI,
    target_audience=base_url)
authed_session = AuthorizedSession(creds)
resp = authed_session.get(base_url)
```

That, too, worked when I ran it on my local dev machine, but after deploying it to Dataflow I was getting more errors, even though my_credentials had the correct signer, email, etc.

Back to Google then…

Attempt 3 (Solution)

I accidentally found this article explaining how to authenticate from service to service on GCP. And even though I don’t like the idea, it worked!

I created a function to get the JWT token, and I used cachetools to cache it.
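A minimal sketch of what that function might look like: the metadata-server identity endpoint is the mechanism the service-to-service article describes, but the TTL value and the exact function shape here are my assumptions, not the post’s original code.

```python
import requests
from cachetools import cached, TTLCache

# Metadata server endpoint that mints an identity (JWT) token for the
# service account the Dataflow worker runs as.
METADATA_URL = ('http://metadata.google.internal/computeMetadata/v1/'
                'instance/service-accounts/default/identity')


@cached(cache=TTLCache(maxsize=1, ttl=3000))  # tokens expire after ~1h; refresh a bit earlier
def get_token(audience):
    # audience must be the URL of the cloud function we want to call
    resp = requests.get(
        METADATA_URL,
        params={'audience': audience},
        headers={'Metadata-Flavor': 'Google'},
    )
    resp.raise_for_status()
    return resp.text
```

The TTLCache means each worker process only hits the metadata server once per TTL window instead of once per element.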

Then I used the token in the request headers.
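Roughly like this — get_token is stubbed out below so the snippet is self-contained, and the commented-out post call is illustrative:

```python
import requests

base_url = ''  # the cloud function URL


def get_token(audience):
    # stand-in for the cached metadata-server helper described above
    return 'dummy-token'


headers = {'Authorization': f'Bearer {get_token(base_url)}'}
# resp = requests.post(base_url, headers=headers, json={'key': 'value'})
```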

Full Dataflow Code

```python
with beam.Pipeline(options=pipeline_options) as pipeline:
    (
        pipeline
        | "Read PubSub Messages" >> .....
        | .....
        | "Write to CF" >> beam.ParDo(WriteToCloudFunction())
    )
```
  • Note: for the caching to work, the get_token function needs to be defined outside the beam.DoFn class.

In times like this, I’m super thankful for Stack Overflow and all the people who contribute there!

I hope this article will help people like me who are frustrated with similar issues.

Let me know if you have another (or better) way to make authenticated calls to cloud functions, and I can add it here!

Data Engineer @ League Inc.