After exercise 12.3, you should have this page open in Adobe Experience Platform:
If you have it open, continue with exercise 12.4.1.
If you don’t have it open, go to Adobe Experience Platform.
In the left menu, go to Sources. You’ll then see the Sources homepage. In the Sources menu, click on Databases.
Select the Google BigQuery Source Connector and click on + Configure.
You’ll then see the Google BigQuery Account selection screen.
Select your account and click Next.
You’ll then see the Add data view.
In the Add data view, select your BigQuery dataset.
You can now see a sample data preview of the Google Analytics data in BigQuery.
You’ll now see this:
You now have to either create a new dataset or select an existing dataset to load the Google Analytics data into. For this exercise, a dataset and schema have already been created. You do not need to create a new schema or dataset.
Select Existing dataset. Open the dropdown menu to select a dataset. Search for the dataset named
Demo System - Event Dataset for BigQuery (Global v1.1) and select it. Click Next.
Scroll down. You now need to map every Source Field from Google Analytics/BigQuery to an XDM Target Field, field by field.
Use the below mapping table for this exercise.
| Source Field | Target Field |
| --- | --- |
After copying and pasting the above mapping into the Adobe Experience Platform UI, verify that there are no errors caused by typos or leading/trailing spaces.
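A quick way to check copy-pasted field names before entering them in the UI is to scan them for stray whitespace. The helper below is a plain-Python sketch, not part of Adobe Experience Platform, and the field names in it are illustrative rather than the exercise's actual mapping:

```python
# Hypothetical helper (not part of Adobe Experience Platform): checks
# copy-pasted source -> target field pairs for leading/trailing spaces,
# one of the most common causes of mapping errors in the UI.
# Field names below are illustrative, not the exercise's actual mapping.

def find_whitespace_issues(pairs):
    """Return every field name that carries leading or trailing spaces."""
    return [
        name
        for src, tgt in pairs.items()
        for name in (src, tgt)
        if name != name.strip()
    ]

mapping = {
    "geoNetwork.country ": "placeContext.geo.countryCode",  # trailing space
    "device.deviceCategory": "device.type",
}

print(find_whitespace_issues(mapping))  # ['geoNetwork.country ']
```

Running the check on a pasted mapping surfaces exactly the entries that would otherwise fail silently in the UI.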
You now have a Mapping like this one:
The source fields GA_ID and customerID are mapped to an Identifier in this XDM schema. This allows you to enrich Google Analytics data (web/app behavioral data) with other datasets, such as Loyalty or Call Center data.
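The enrichment idea can be sketched in plain Python (this is not the Platform API): once GA_ID and customerID resolve to the same identifier, behavioral rows can be joined with profile data keyed on that identifier. All field names and values below are assumptions for the example:

```python
# Illustrative sketch: enrich web behavior rows with loyalty data
# keyed on the shared identifier (customerID). Names and values
# are assumptions, not data from the exercise.

web_events = [
    {"customerID": "C100", "page": "/home"},
    {"customerID": "C200", "page": "/checkout"},
]
loyalty = {
    "C100": {"loyaltyTier": "gold"},
    "C200": {"loyaltyTier": "silver"},
}

# Merge each event with the loyalty record for its identifier.
enriched = [
    {**event, **loyalty.get(event["customerID"], {})}
    for event in web_events
]

print(enriched[0])
# {'customerID': 'C100', 'page': '/home', 'loyaltyTier': 'gold'}
```

This is the same join that the shared Identifier enables across datasets inside the platform.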
You’ll now see the Scheduling tab:
In the Scheduling tab, you are able to define a frequency for the data ingestion process for this Mapping and data.
Because the demo data in Google BigQuery won't be refreshed, there's no real need to set a schedule for this exercise. However, you do have to select something, so to avoid unnecessary data ingestion runs, set the frequency like this:
Important: be sure to activate the Backfill switch, so that the historical data already present in BigQuery is ingested during the first run.
Last but not least, you must define a delta field.
The delta field is used by the scheduled connection to upload only the new rows that arrive in your BigQuery dataset. A delta field is typically a timestamp column: on future scheduled ingestions, only rows with a newer, more recent timestamp are ingested.
Select timeStamp as the delta field.
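The combined backfill and delta-field behavior can be sketched in plain Python. The `timeStamp` field name comes from this exercise; everything else (function name, row shape, dates) is illustrative, not the Platform implementation:

```python
from datetime import datetime, timezone

# Sketch of backfill + delta-field logic (illustrative, not the
# Platform API). The first run (backfill) loads all historical rows;
# later scheduled runs keep only rows newer than the last run.

def rows_to_ingest(rows, last_run_ts, backfill_done):
    if not backfill_done:
        return rows  # initial backfill: ingest everything
    return [r for r in rows if r["timeStamp"] > last_run_ts]

rows = [
    {"id": 1, "timeStamp": datetime(2023, 12, 31, tzinfo=timezone.utc)},
    {"id": 2, "timeStamp": datetime(2024, 1, 2, tzinfo=timezone.utc)},
]
last_run = datetime(2024, 1, 1, tzinfo=timezone.utc)

print([r["id"] for r in rows_to_ingest(rows, last_run, backfill_done=True)])
# [2]  -- only the row newer than the last run
print([r["id"] for r in rows_to_ingest(rows, last_run, backfill_done=False)])
# [1, 2]  -- backfill run picks up everything
```

This is why the delta field should be a monotonically increasing timestamp: rows that arrive with an older timestamp would never be picked up by a delta-based schedule.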
You'll now see this:
In the Dataset flow detail view, you need to name your connection, which will help you find it later.
Please use this naming convention:
| Field | Naming convention | Example |
| --- | --- | --- |
| Dataset flow name | DataFlow - ldap - BigQuery Website Interaction | DataFlow - vangeluw - BigQuery Website Interaction |
| Description | DataFlow - ldap - BigQuery Website Interaction | DataFlow - vangeluw - BigQuery Website Interaction |
You now see a detailed overview of your connection. Make sure everything is correct before you continue, as some settings, such as the XDM mapping, can't be changed afterwards.
Setting up the connection may take some time, so don’t worry if you see this:
Once the connection has been created, you’ll see this:
You’re now ready to continue with the next exercise, in which you’ll use Customer Journey Analytics to build powerful visualizations on top of Google Analytics data.