First, I created a service account and gave it the roles/cloudfunctions.invoker role. If you're working with an exported service account key, the code is pretty simple:
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

keypath = '.../my_sa_key.json'
base_url = 'https://us-central1-my_project.cloudfunctions.net/my_function'
creds = service_account.IDTokenCredentials.from_service_account_file(
    keypath, target_audience=base_url)
authed_session = AuthorizedSession(creds)
# Invoke the function; the session attaches an ID token scoped to base_url
response = authed_session.get(base_url)
In my previous post, I explained how to stream data from Salesforce to PubSub in real time. The next logical step would be to store the data somewhere, right?
One option could be, for example, to batch the data and write it to files in GCS. That's a good start and…
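To make the batching idea concrete, here is a minimal sketch of the two pure-Python pieces: splitting incoming messages into fixed-size batches, and serializing a batch as newline-delimited JSON, a GCS-friendly format. The function names and the batch size are my own placeholders, not anything from the post, and the actual upload to GCS is left out.

```python
import json

def batch_messages(messages, batch_size):
    """Split a list of messages into fixed-size batches (last one may be short)."""
    for i in range(0, len(messages), batch_size):
        yield messages[i:i + batch_size]

def batch_to_jsonl(batch):
    """Serialize one batch of dicts as newline-delimited JSON."""
    return "\n".join(json.dumps(m, sort_keys=True) for m in batch)

# Each resulting string could then be written as one object in GCS,
# e.g. one file per batch.
```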
Okay, so we've written our Dataflow template in Python; now what? We want to schedule it to run daily, and we're going to use Airflow for that.
The first thing we want, for security reasons, is to keep service accounts separate. In the previous post, we created a service account…
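For a sense of what the daily schedule looks like, here is a config sketch of an Airflow DAG that launches a Dataflow template, using the `DataflowTemplatedJobStartOperator` from the Google provider package. The DAG id, template path, project, region, and the `gcp_conn_id` pointing at a dedicated connection are all placeholders I'm assuming, not values from the post.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataflow import (
    DataflowTemplatedJobStartOperator,
)

with DAG(
    dag_id="daily_dataflow_job",
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    run_template = DataflowTemplatedJobStartOperator(
        task_id="run_template",
        template="gs://my-bucket/templates/my_template",
        project_id="my-project",
        location="us-central1",
        # A dedicated Airflow connection whose credentials are the
        # Dataflow-only service account, kept separate from the
        # default GCP connection for security.
        gcp_conn_id="dataflow_sa",
    )
```

Wiring the separate service account through its own Airflow connection (rather than the default `google_cloud_default`) is what keeps the scheduling credentials and the Dataflow credentials isolated from each other.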
In this article, we will transform a JSON file into a CSV file using Dataflow and Python.
First, we'll need a service account: give it the "Dataflow Worker" role, and don't forget to export the key as a JSON file at the end so we can use it later.
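Before getting into the pipeline itself, here is a minimal sketch of the record-level transform, assuming flat JSON objects with a known set of fields. In the actual Dataflow job this logic would live inside a Beam transform; the function name and field list are my own illustration.

```python
import csv
import io
import json

def json_lines_to_csv(json_lines, fieldnames):
    """Convert newline-delimited JSON records into one CSV string."""
    out = io.StringIO()
    # extrasaction="ignore" drops any fields not in the declared schema
    writer = csv.DictWriter(out, fieldnames=fieldnames, extrasaction="ignore")
    writer.writeheader()
    for line in json_lines:
        writer.writerow(json.loads(line))
    return out.getvalue()
```

For example, `json_lines_to_csv(['{"name": "a", "n": 1}'], ["name", "n"])` yields a CSV with a `name,n` header row followed by `a,1`.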