Learn how to configure an external account in Adobe Campaign to import recipient data from Adobe Experience Platform to Campaign. Understand how to create a workflow to upload and target the recipients received from the Experience Platform.
In this video, we’re going to create a workflow to import our Adobe Experience Platform recipient segment data that was uploaded to a campaign destination via an S3 bucket. Additionally, we will send a targeted delivery to our new imported recipients. Let’s get started.
First, in order to pull in our S3 data, we will need to configure an external account in Adobe Campaign. Adding an external account is straightforward: under Administration, find External accounts. In this instance, External accounts is located under Platform.
Next, we need to add a new external account. Select the New icon, then provide a label and internal name. Next, select AWS S3 from the Type dropdown. This populates a new section asking for your AWS S3 account credentials: the server, access key, secret key, and region. After filling in the required credentials, remember to save.
Upon saving our AWS S3 account, we are ready to begin a new targeting workflow.
Navigate to profiles and targets, jobs, and select targeting workflows. Then, create a new workflow.
Let’s name this workflow, Adobe Experience Platform Data Import Delivery.
The first component we want to add to our new workflow is the file transfer component located in the events tab.
Drag and drop File transfer onto the workflow, then double-click it to open the file transfer pop-over. Optionally, we can give this component a new label. I'll call mine "Platform S3 data download."
Set the action to File download from the dropdown, then tick the Use an external account checkbox. Selecting the account drop-down allows us to select the account we added at the beginning of this video. If you saved your data within a folder in your S3 container, make sure to provide that folder name. Once complete, select OK to continue.
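Under the hood, the file transfer activity is performing an authenticated download of everything under a folder (prefix) in the S3 bucket. The activity handles this for you, but as a rough, hedged sketch of the equivalent logic, here is a Python illustration. The bucket name and prefix are placeholder assumptions, and the client is injected; with the real AWS SDK you would build it from the same credentials entered in the external account (for example, `boto3.client("s3", ...)`).

```python
import os


def download_folder(client, bucket, prefix, dest_dir):
    """Download every object under `prefix` in `bucket` into `dest_dir`.

    `client` is any object exposing the S3-style `list_objects_v2` and
    `download_file` calls (e.g. a boto3 S3 client). This is a sketch of
    what the file transfer activity does, not Campaign's actual code.
    """
    downloaded = []
    resp = client.list_objects_v2(Bucket=bucket, Prefix=prefix)
    for obj in resp.get("Contents", []):
        key = obj["Key"]
        if key.endswith("/"):  # skip folder placeholder objects
            continue
        local_path = os.path.join(dest_dir, os.path.basename(key))
        client.download_file(bucket, key, local_path)
        downloaded.append(local_path)
    return downloaded
```

In the workflow itself, no code is required: the file transfer activity performs this step using the saved external account.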
Now that we are downloading the files from our S3 account, we need to add a data loading component for our CSV file. Within the actions tab, select the data loading file component and add it to the workflow, then double click it to open the data loading pop-over.
The data loading component expects a sample CSV file. After providing a sample of how our data is structured, we can use the preview to confirm the imported data will be in the format we expect. If we recall the previous video on uploading data from Platform to Campaign via a destination, we can see that the fields we exported to our S3 bucket are present in our preview.
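For context, the exported file is a plain CSV whose header row drives the column definitions the activity infers. As a hedged sketch (the column names below are assumptions based on the fields mapped later in this video, not the exact headers your destination produces), previewing such a file in Python looks like this:

```python
import csv
import io

# Hypothetical sample matching the fields exported from Experience Platform;
# the real file from your destination may use different column names.
sample = """first_name,last_name,email
Sarah,Rose,sarah.rose@example.com
Daniel,Wright,daniel.wright@example.com
"""

reader = csv.DictReader(io.StringIO(sample))
rows = list(reader)
print(reader.fieldnames)  # the column structure inferred from the header row
print(rows[0])            # first data row, keyed by column name
```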
Once we’re happy with the preview, double-check that the "Specified in the transition" and "Default target database" radio buttons are selected, then select OK to continue. At this point, if we run our workflow, we should see results equal to the number of exported profiles. After adding a Start component, I expect the workflow to return 47 results because my latest CSV file uploaded to the S3 bucket has 47 different profiles. I can right-click the result and select Display the target to preview the output.
Now that we have the data loading in place, we need to map the CSV data fields to Campaign recipient fields. To do this, select the Targeting tab and drag and drop the Enrichment component. Upon double-clicking the component, the Add additional data pop-over appears. The primary set should have automatically populated with our data loading file. Since we are interested in mapping our data, select the Reconciliation tab. After ticking the checkbox, we are provided a UI that expects a targeting dimension and reconciliation conditions. Since we want to add recipients, select the Recipients (nms) targeting dimension. We can leave "Use a simple reconciliation key" selected and skip straight to adding our source and destination expressions. Select the Add button, and the Edit expression button becomes available. Upon selecting it, we can see our data loader fields. We want to pass the first name, last name, and email. Let's start by selecting email.
Next, we want to map this expression to one in the recipient schema. Following the same workflow, we can select email from the available fields in our recipient schema.
After adding all our fields, select okay to continue.
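Conceptually, the reconciliation step we just configured is a source-to-destination field mapping: each column from the data loading file is paired with a field on the recipient schema. A minimal sketch of that idea, assuming the hypothetical column and field names used for illustration (your schema's field names may differ):

```python
# Hedged sketch of the enrichment's reconciliation mapping. The keys are
# assumed CSV column names; the values are assumed recipient-schema fields.
field_map = {
    "first_name": "firstName",
    "last_name": "lastName",
    "email": "email",
}


def reconcile(row, mapping):
    """Translate one CSV row into recipient-schema field names."""
    return {dest: row[src] for src, dest in mapping.items()}


print(reconcile({"first_name": "Sarah", "last_name": "Rose",
                 "email": "sarah.rose@example.com"}, field_map))
```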
Now, because we mapped our data using the recipient schema as the target, we might see some duplication if these recipients were already uploaded. As a best practice, we should drop in a Deduplication component before uploading our data and sending an email delivery. After adding the component, double-click it and select Edit configuration.
We only want our enrichment data to be exported. Select the Temporary schema radio button, then select Enrichment from the schema dropdown. Select Next to continue.
We need to define the field we want to filter duplicates on. Add a field by selecting the Add button, then select the email expression. We can also rename the expression by giving it a label. Select Next to continue.
We can keep the default values of 1 and "choose for me," then select Finish, followed by OK to continue.
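In plain terms, the deduplication activity configured above keeps a single record per email address. A rough Python sketch of that logic (this version keeps the first record seen, which is one interpretation of letting Campaign "choose for me"):

```python
def dedupe_by_email(recipients):
    """Keep the first record seen for each email address (case-insensitive)."""
    seen = set()
    unique = []
    for r in recipients:
        key = r["email"].strip().lower()
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique
```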
Now, when I run my workflow, I still receive only 47 results. However, as you can see, we now have the additional name and email fields in the targeting dimension column. This means we can use this targeting dimension in an email directly within the workflow. However, we also want to upload and save all these recipients to a folder, so let's select Flow control and drag and drop the Fork component.
First, let’s upload our data to a folder. To upload our data, we need to drag and drop the update data component from the targeting tab.
Upon opening the Update data pop-over, select the Insert or update operation type. This means we can override previous recipients with new information as well as add new ones. Additionally, we want to select the Recipients dimension again as our target. Similar to the enrichment, we need to define our destination and source expression mapping. However, this time we also want to add the recipients to a folder, which means we need to create a new folder.
To create a new folder for recipients, select Profiles and targets, then right-click, hover over Add new folder > Database, and select Recipients. This creates a new recipients folder for us. I'm going to rename my folder "Platform profile recipients."
Navigating back to our workflow, we can return to the Update data component, and in the Fields to update section, set the destination to Folder with the source being our new recipients folder. This means that all the recipient data will be uploaded and saved to our new recipients folder.
Select okay to continue.
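The Insert or update operation is effectively an upsert keyed on the reconciliation field. A hedged sketch of that behavior, using email as the key and a plain dict as a stand-in for the recipients folder (the field names are the same illustrative assumptions as above):

```python
def upsert_recipients(store, incoming, key="email"):
    """Insert new recipients or overwrite existing ones, keyed by `key`.

    `store` maps the key value to a recipient record; it stands in for
    the recipients folder. Existing fields are overridden by incoming
    values, mirroring the Insert or update operation type.
    """
    for rec in incoming:
        store[rec[key]] = {**store.get(rec[key], {}), **rec}
    return store
```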
Now that we are updating our data, we can simultaneously send an email delivery to all the recipients who were predicted to order from Luma. Because we used a segment from a machine learning model in Adobe Experience Platform, we already know these recipients have a high propensity to purchase an item. Let's target them with an exclusive discount code email offer.
To do this, start by selecting the Actions tab, then drag and drop the Delivery component onto our other fork. Upon double-clicking, we can select an email template by choosing the "New, created from a template" option and then selecting our discount offer template.
If we want to customize the email further, we can select the magnifying glass and edit the content. We do not need to specify a target population because our recipients are already specified by the inbound events, and the content is specified in the delivery. If we want the email delivery to auto-send, we can select "Prepare and start." If you wish to sign off before starting the email delivery, you can leave the default "Prepare" selected. Select OK to continue. With our delivery and update data components complete, all that's left to do is add the End flow control. Upon running the workflow, two things should happen. First, we should see a list of 47 recipients in our recipients folder. Second, an email delivery should run and send our discount offer email to the recipients that were imported.
As we can see here, I received the email delivery offer. You should now know how to import data from an external source and map it to your Campaign recipient schema. Additionally, we were able to send an email discount offer via Campaign, which will generate delivery logs. This allows us to send those delivery logs back to Platform.
The workflow for exporting our delivery logs back to Adobe Experience Platform will be covered in subsequent videos. Thanks for watching.