This tutorial provides steps to create a source connection to bring your Adobe Campaign Managed Cloud Services data to Adobe Experience Platform.
This guide requires a working understanding of the following components of Experience Platform:
In the Platform UI, select Sources from the left navigation to access the Sources workspace. The Catalog screen displays a variety of sources for which you can create an account.
You can select the appropriate category from the catalog on the left-hand side of your screen. You can also use the search bar to narrow down the displayed sources.
Under the Adobe applications category, select Adobe Campaign Managed Cloud Services and then select Add data.
The Select data step appears, providing you with an interface to configure your Adobe Campaign instance, Target mapping, and Schema name.
| Field | Description |
| --- | --- |
| Adobe Campaign instance | The name of the Adobe Campaign environment instance that you are using. |
| Target mapping | The technical objects that Campaign uses to deliver messages. These objects contain all of the technical settings required to send deliveries. |
| Schema name | The name of the schema entity that you are bringing to Platform. Options include Delivery Log and Tracking Log. |
Once you have provided values for your Campaign instance, target mapping, and schema name, the screen updates to display a preview of your schema as well as a sample dataset. When finished, select Next.
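The selections above correspond to the parameters of a source connection. As an illustration only, a source connection payload in the shape that the Flow Service API generally uses might look like the following sketch; the connection spec ID, parameter names, and values are hypothetical placeholders, not the documented values for this source:

```python
# Illustrative sketch of a source connection request body.
# All IDs and parameter names below are placeholders (assumptions),
# not documented values for the Campaign Managed Cloud Services source.
source_connection = {
    "name": "Campaign delivery logs source connection",
    "description": "Brings Adobe Campaign v8 delivery log data into Platform",
    "connectionSpec": {
        "id": "<CONNECTION_SPEC_ID>",  # placeholder for the source's spec ID
        "version": "1.0",
    },
    "params": {
        "campaignInstance": "<INSTANCE_NAME>",  # your Adobe Campaign instance
        "targetMapping": "<TARGET_MAPPING>",    # hypothetical parameter name
        "schemaName": "deliveryLog",            # Delivery Log or Tracking Log
    },
}
```

The three `params` entries mirror the three fields configured in the UI step above.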
The Dataflow detail page allows you to select whether you want to use an existing dataset or configure a new dataset for your dataflow.
To use an existing dataset, select Existing dataset. You can retrieve an existing dataset either by using the Advanced search option or by scrolling through the list of existing datasets in the dropdown menu.
With a dataset selected, provide a name for your dataflow and an optional description.
To use a new dataset, select New dataset and then provide an output dataset name and an optional description. Next, select a schema to map to using the Advanced search option or by scrolling through the list of existing schemas in the dropdown menu. When finished, select Next.
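For context, the "new dataset" path amounts to creating a dataset that references an XDM schema. A minimal sketch of such a dataset body is shown below, assuming the XDM versioned-schema convention for `contentType`; the tenant and schema IDs are placeholders:

```python
# Sketch of a dataset request body for the "new dataset" option.
# The schema ID is a placeholder; contentType follows the XDM
# versioned-schema convention used by Experience Platform.
dataset_body = {
    "name": "Campaign delivery logs dataset",  # output dataset name
    "description": "Delivery log data from Adobe Campaign v8",  # optional
    "schemaRef": {
        "id": "https://ns.adobe.com/<TENANT_ID>/schemas/<SCHEMA_ID>",  # placeholder
        "contentType": "application/vnd.adobe.xed-full+json;version=1",
    },
}
```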
You can enable alerts to receive notifications on the status of your dataflow. Select an alert from the list to subscribe. For more information on alerts, see the guide on subscribing to sources alerts using the UI.
When you are finished providing details for your dataflow, select Next.
The Mapping step appears, providing you with an interface to map the source fields from your source schema to their appropriate target XDM fields in the target schema.
Platform provides intelligent recommendations for auto-mapped fields based on the target schema or dataset that you selected. You can manually adjust mapping rules to suit your use cases. Based on your needs, you can choose to map fields directly, or use data prep functions to transform source data to derive computed or calculated values. For comprehensive steps on using the mapper interface and calculated fields, see the Data Prep UI guide.
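To make the two mapping styles concrete, the sketch below shows mapping entries in the general shape that Data Prep uses: a direct attribute-to-attribute mapping and a calculated field that applies a transformation expression. The field paths are hypothetical examples, not actual Campaign schema fields:

```python
# Illustrative Data Prep mapping entries. Field paths are hypothetical
# placeholders; only the overall structure (sourceType/source/destination)
# is the point of this sketch.
mappings = [
    {
        # Direct field-to-field mapping
        "sourceType": "ATTRIBUTE",
        "source": "recipient.email",               # placeholder source field
        "destination": "personalEmail.address",    # placeholder XDM field
    },
    {
        # Calculated field: derive a value with a transformation expression
        "sourceType": "EXPRESSION",
        "source": "lower(recipient.email)",        # placeholder expression
        "destination": "personalEmail.address",    # placeholder XDM field
    },
]
```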
When mapping your source fields to target XDM fields, you must ensure that you map your designated primary identity field to its appropriate target XDM field.
Once your source data is successfully mapped, select Next.
The Review step appears, allowing you to review your new dataflow before it is created. Details are grouped within the following categories:
Once you have reviewed your dataflow, select Finish and allow some time for the dataflow to be created.
Once your dataflow has been created, you can monitor the data that is being ingested through it to see information on ingestion rates and successful and failed batches.
To start viewing your dataset activity, select Dataflows in the sources catalog.
Next, select the target dataset from the list of dataflows that appear.
The dataset activity page appears. From here, you can see information on the performance of your dataflow, including rate of ingestion, successful batches, and failed batches.
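The same run information can also be retrieved programmatically. As a sketch, the request below builds (but does not send) a query for a dataflow's runs against the Flow Service API; the endpoint pattern is assumed from the Flow Service convention, and all credential values are placeholders:

```python
# Sketch: build a request URL and headers to list runs for a dataflow.
# FLOW_ID and all credential values are placeholders; the request is
# constructed but intentionally not sent here.
import urllib.parse

FLOW_ID = "<FLOW_ID>"  # placeholder for your dataflow ID
base = "https://platform.adobe.io/data/foundation/flowservice/runs"
query = urllib.parse.urlencode({"property": f"flowId=={FLOW_ID}"})
url = f"{base}?{query}"

headers = {
    "Authorization": "Bearer <ACCESS_TOKEN>",  # placeholder token
    "x-api-key": "<API_KEY>",                  # placeholder API key
    "x-gw-ims-org-id": "<ORG_ID>",             # placeholder org ID
    "x-sandbox-name": "<SANDBOX_NAME>",        # placeholder sandbox
}
```

Each returned run would carry status and metrics comparable to what the dataset activity page surfaces in the UI.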
This page also provides you with an interface to update the metadata description of your dataflow, enable partial ingestion and error diagnostics, as well as add new data to your dataset.
By following this tutorial, you have successfully created a dataflow to bring your Campaign v8 delivery logs and tracking logs data to Platform. Incoming data can now be used by downstream Platform services such as Real-Time Customer Profile and Data Science Workspace. See the following documents for more details: