Reports API Integration with Google Data Studio
This tutorial will walk you through how to integrate and use the Broadsign Ads Reports API with Google Data Studio.
The goal of this tutorial is to visualize a daily report in Google Data Studio. If you are using another reporting service, the steps will be similar.
This guide is written in Node.js, but you are free to use any language you like. The file structure will be the following:
reportsApi.js
getLatestExport.js
getCreds.js
uploadFile.js
index.js
.env
package.json
your_keyfile.json
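These files rely on a few NPM packages that you can install up front. A suggested setup (the Functions Framework is only needed for local testing, as described later in this tutorial):

npm install follow-redirects dotenv @google-cloud/storage
npm install --save-dev @google-cloud/functions-framework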
Prerequisites
Google Cloud
To automate this workflow, we will be using the Google Cloud suite of products in combination with the Broadsign Ads Reports API.
You must first create an account in Google Cloud and enable billing. You must then enable the following services:
- Cloud Storage – This is where we will save the CSV files.
- Cloud Functions – This will be used to execute functions on the cloud.
- Cloud Scheduler – This will allow us to run our function on a daily basis.
If you need information on how to create an account and enable the above services, refer to the Google Cloud documentation.
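If you already use the gcloud CLI, you can also enable these services from the command line. A minimal sketch, assuming the CLI is installed and a project is selected:

gcloud services enable storage.googleapis.com cloudfunctions.googleapis.com cloudscheduler.googleapis.com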
Data Studio
Data Studio has a built-in integration with Cloud Storage. Therefore, once we link the bucket as a data source, it will keep ingesting the new files into your Data Studio report.
Scheduled Report
Before we start writing code, you must create a scheduled report in Broadsign Ads, as described in Create a Scheduled Report via the UI.
Reports API Access
Make sure that API access is enabled (as described in Enable the Reporting API), or contact your account manager to have it enabled.
The following example shows an integration with the Reports API:
// reportsApi.js
const https = require('follow-redirects').https;

// Base request options shared by all Reports API calls.
const options = {
  hostname: 'api.buy.broadsign.com',
  headers: {
    'Content-Type': 'application/json',
  }
};

// Exchange the user's credentials for an authentication token.
exports.getAuthenticationToken = async (body) => {
  const creds = JSON.stringify({
    Username: body.username,
    Password: body.password
  })
  options.method = "POST"
  options.path = "/v1/auth"
  const res = await getApiResponse(creds)
  return res?.token
}

// List the organizations the authenticated user has access to.
exports.getOrganizations = async (token) => {
  options.method = "GET"
  options.headers.Authorization = token
  options.path = "/v1/organizations"
  return await getApiResponse()
}

// List the reports of an organization.
exports.getReports = async (token, organizationId) => {
  options.method = "GET"
  options.headers.Authorization = token
  options.path = "/v1/organizations/" + organizationId + "/reports"
  return await getApiResponse()
}

// List the exports generated for a report.
exports.getReportExports = async (token, organizationId, reportId) => {
  options.method = "GET"
  options.headers.Authorization = token
  options.path = "/v1/organizations/" + organizationId + "/reports/" + reportId + "/exports"
  const res = await getApiResponse()
  return res?.exports
}

// Download the .csv file of a given export.
exports.downloadReportExport = async (token, organizationId, reportId, exportId) => {
  options.method = "GET"
  options.headers.Authorization = token
  options.path = "/v1/organizations/" + organizationId + "/reports/" + reportId + "/exports/" + exportId + "/file"
  return await getApiResponse(undefined, false)
}

// Perform the HTTPS request and resolve with the (optionally parsed) response body.
const getApiResponse = (body, shouldParse = true) => {
  return new Promise((resolve, reject) => {
    const req = https.request(options, (res) => {
      let data = '';
      if (res.statusCode === 404) {
        reject(res.statusCode + " not found");
      }
      res.on('data', (chunk) => {
        data += chunk;
      });
      res.on('end', () => {
        try {
          const response = shouldParse ? JSON.parse(data) : data
          resolve(response)
        } catch (err) {
          reject(err)
        }
      });
    }).on("error", (err) => {
      reject(err);
    });
    if (body) {
      req.write(body);
    }
    req.end();
  })
}
Once you have the integration, you can build a small function to get the .csv file of the latest export. For this tutorial, we will assume that you know your organization ID and report ID, but you can get the list of your organizations and reports with the API (see List Organizations and List Reports), as sketched below.
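For reference, a quick throwaway script that prints those IDs could look like the following. This is only a sketch: the file name is hypothetical, and it assumes that getOrganizations and getReports resolve to arrays of objects exposing id and name fields; check the API reference for the exact response shape.

// listIds.js (hypothetical helper, not part of the deployed function)
const { getAuthenticationToken, getOrganizations, getReports } = require("./reportsApi")
const { getCreds } = require("./getCreds")

const listIds = async () => {
  const token = await getAuthenticationToken(await getCreds())
  const organizations = await getOrganizations(token)
  for (const org of organizations) {
    // Print every organization, then the reports it contains.
    console.log(`Organization ${org.id}: ${org.name}`)
    const reports = await getReports(token, org.id)
    reports.forEach((report) => console.log(`  Report ${report.id}: ${report.name}`))
  }
}

listIds().catch(console.error)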
First, we need to generate a token using the /auth endpoint, then use this token to get the list of exports and the .csv file of the last export.
// getLatestExport.js
const {
  getAuthenticationToken,
  getReportExports,
  downloadReportExport
} = require("./reportsApi")
const { getCreds } = require("./getCreds")

exports.getLatestExport = async (organizationId, reportId) => {
  const token = await getAuthenticationToken(await getCreds())
  if (token) {
    const exports = await getReportExports(token, organizationId, reportId)
    if (exports.length) {
      const lastExport = exports[exports.length - 1]
      const csv = await downloadReportExport(token, organizationId, reportId, lastExport.id)
      // Name the file after the export ID so that each daily run adds a new file to the bucket.
      return { csv, name: `export_${lastExport.id}` }
    } else return null
  } else {
    throw new Error("Error getting authorization token")
  }
}
The getCreds function retrieves the username and password of the user who created the report. Whether these credentials are stored in environment variables or fetched from a service such as Google Secret Manager is for you to decide. We will use environment variables in this tutorial.
// getCreds.js
require('dotenv').config()

const creds = {
  username: process.env.USER_EMAIL,
  password: process.env.USER_PASSWORD
}

exports.getCreds = async () => {
  return new Promise(resolve => {
    resolve(creds)
  })
}
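With the environment-variable approach, USER_EMAIL and USER_PASSWORD live in the .env file locally and are set on the deployed function (see the --set-env-vars flag later in this tutorial). If you would rather use Secret Manager, a minimal variant of getCreds could look like the sketch below; the project ID and secret names are placeholders, and it assumes the @google-cloud/secret-manager package is installed.

// getCreds.js (Secret Manager variant; project ID and secret names are placeholders)
const { SecretManagerServiceClient } = require('@google-cloud/secret-manager')

const client = new SecretManagerServiceClient()

// Read the latest version of a secret and return its payload as a string.
const readSecret = async (name) => {
  const [version] = await client.accessSecretVersion({
    name: `projects/your_project_id/secrets/${name}/versions/latest`
  })
  return version.payload.data.toString('utf8')
}

exports.getCreds = async () => ({
  username: await readSecret('broadsign-user-email'),
  password: await readSecret('broadsign-user-password')
})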
The uploadFile function takes the passed .csv content and saves it in your bucket.
// uploadFile.js
const path = require('path')
const fs = require('fs/promises')
const os = require('os')
const { Storage } = require('@google-cloud/storage')

const tempDir = os.tmpdir()

exports.uploadFile = async (csv, name) => {
  const storage = new Storage({
    projectId: 'your_project_id',
    keyFilename: path.join(__dirname, './your_keyfile.json')
  })
  const bucketName = 'your_bucket_name'
  const options = {
    destination: `${name}.csv`,
    // Only upload if an object with this name does not already exist.
    preconditionOpts: { ifGenerationMatch: 0 }
  }
  // Write the CSV to a temporary file, then upload that file to the bucket.
  const filePath = `${tempDir}/${name}.csv`
  await fs.writeFile(filePath, csv, 'utf8')
  return storage.bucket(bucketName).upload(filePath, options)
}
We now put everything together. The Cloud Function is defined as follows:
// index.js
const { getLatestExport } = require('./getLatestExport')
const { uploadFile } = require('./uploadFile')

exports.saveLatestExport = async (req, res) => {
  try {
    const organizationId = parseInt(req.query?.organizationId)
    const reportId = parseInt(req.query?.reportId)
    if (!organizationId) throw new Error('no organization ID passed')
    if (!reportId) throw new Error('no report ID passed')
    const { csv, name } = await getLatestExport(organizationId, reportId)
    await uploadFile(csv, name)
    // End the request explicitly so the function does not hang.
    res.status(200).send('OK')
  } catch (e) {
    console.error(e)
    res.status(500).send(e.message)
  }
}
You will need the NPM package @google-cloud/functions-framework to test your function locally. We recommend adding the following script to your package.json:
"scripts": {
"start": "npx functions-framework --target=saveLatestExport --port=5000"
},
Running npm start from the root of your project will start the Functions Framework and serve the function locally (on port 5000 with the script above).
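You can then trigger the function locally, for example with curl (the IDs below are placeholders):

curl "http://localhost:5000/?organizationId=123&reportId=456"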
Once you have tested your function, you can deploy it by using the following command:
gcloud functions deploy saveLatestExport --entry-point saveLatestExport --runtime nodejs16 --trigger-http --project <project-name>
You can also pass environment variables in the command above by adding --set-env-vars FOO=bar,BAZ=boo. For more information, see Using Environment Variables.
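For example, the credentials read by getCreds.js could be supplied at deploy time (the values below are placeholders):

gcloud functions deploy saveLatestExport --entry-point saveLatestExport --runtime nodejs16 --trigger-http --project <project-name> --set-env-vars USER_EMAIL=user@example.com,USER_PASSWORD=your_password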
You will be able to see your function in the Cloud Functions dashboard. By clicking on it, you will see all of the function's details, from source code and trigger to metrics and logs.
The last step is to schedule this function.
To schedule the function:
- Locate the trigger URL of your function.
- Go to Cloud Scheduler to create your daily job, using the trigger URL as the target.
- Click CREATE.
You will be taken to the jobs dashboard and see the new job you just created.
Notice that we are passing organizationId and reportId as parameters. These could instead be declared inside the function, deduced dynamically in the code, or predefined in environment variables; implement whichever approach best fits your business requirements. Once this step is completed, the function will be executed every day at 7:00 a.m. EST and will add the latest report to your Cloud Storage bucket. The same job can also be created from the command line, as sketched below.
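A sketch of the equivalent gcloud command; the trigger URL and IDs are placeholders, and the cron expression matches the 7:00 a.m. schedule in the America/New_York time zone:

gcloud scheduler jobs create http save-latest-export \
  --schedule="0 7 * * *" \
  --time-zone="America/New_York" \
  --uri="https://<region>-<project-name>.cloudfunctions.net/saveLatestExport?organizationId=123&reportId=456" \
  --http-method=GET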
You can now import data in Data Studio to visualize it.
To import data in Data Studio:
- Open Data Studio, click Create, then select Data source.
- Choose Google Cloud Storage.
- Enter the path to the folder containing the .csv files in your bucket, or the bucket name if the files are at the root level.
- Click CONNECT in the top right corner.
You will be presented with a list of options. Make sure to check Use all files in path.
You can then use and display your data the way you want.