Understand the Adobe Experience Platform Data Connector

This capability is in beta, and subject to frequent updates and modifications without notice.
Please reach out to Adobe Customer Support if you plan to implement this capability.


Adobe Experience Platform Data Connector helps existing customers make their data available on Adobe Experience Platform by mapping XTK data (data ingested into Adobe Campaign) to Experience Data Model (XDM) data on Adobe Experience Platform.

The connector is uni-directional: it sends data from Adobe Campaign Standard to Adobe Experience Platform. Data is never sent from Adobe Experience Platform back to Adobe Campaign Standard.

Adobe Experience Platform Data Connector is intended for data engineers who understand Adobe Campaign Standard custom resources and how the customer’s overall data schema should be structured inside Adobe Experience Platform.

In this tutorial we will look at the out-of-the-box mappings that are provided with Campaign Standard. We are going to talk about the ACS Data Services tool, which is accessible from Administration, then Development, under the icon called Platform Data Mappings. ACS Data Services is a feature that helps you ingest ACS data into AEP, so that you can run your use cases on AEP. You can ingest all your tables and custom resources, all the experience events that are generated, such as opens, clicks, et cetera, and any profile-related information into AEP. This is a slightly technical tool, and we expect someone well versed in your data to do this. For example, a partner, or somebody who has modeled your data and ingested it into ACS, would be the right person: someone who understands your data, the relationships, the ER diagram, how the data in the different tables fits together, and how it should be flattened and sent to AEP. What you see on the screen is the page we land on when clicking Platform data mappings. These are the two mappings that are created by default. I am at the Profile and Platform Data Model Mapping, so let me open this one; it is the default mapping for our Profile targeting dimension. You can change the label of the table, and you can change the ID, if you want to use this as a template to create your own mapping. In that case what we recommend is that you go back to Data Mappings, select the mapping, and replicate it. Since we are not replicating it here, and I am specifically talking about the out-of-the-box data mapping, let us go back and walk through the different parts and the different things we need to take care of when creating a mapping.
Whenever you create a mapping, make sure that you add a label by which you can identify the mapping. Choose the right targeting dimension, which is Profile in this case, but can be any targeting dimension, including a custom resource if needed. Choose the ID (here the build name; by default we generate an ID as well), and then choose the right dataset. In this case the dataset is Profile_Campaign. You can choose to send all your data from Campaign to AEP in this dataset; it is the default dataset created by Campaign in AEP. When you go to your Platform instance, you can search for Profile_Campaign as the dataset and see how your data is flowing in from ACS to AEP. You can continue to use this one, or you can use the explorer option to look at the different Platform datasets. This UI lists all the datasets created in AEP, and you can choose the one that suits you best. I will leave it as Profile_Campaign. Next you see the Publication section. The Publication section tells you, once the mapping is published, what stage it is at and whether the mapping is 100% published or not. Right now this mapping is in a draft state; it has not been published at all so far. At the end of this tutorial we will publish it and see how the Publication status changes. You also see three dots towards the bottom right corner; if you click on them you can look at the mapping job logs. These logs help you understand why a mapping has failed, or whether your mapping was successful.
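To make the pieces of a mapping concrete, here is a minimal sketch that models the attributes described above (label, ID, targeting dimension, dataset, publication status) as a plain Python dict. The attribute names and the `describe` helper are hypothetical illustrations, not an Adobe API.

```python
# Illustrative sketch: the attributes a Campaign-to-AEP data mapping carries.
# Field names here are hypothetical, chosen to mirror the UI sections.
mapping = {
    "label": "Profile and Platform Data Model Mapping",  # human-readable label
    "id": "profileDataMapping",           # internal ID (auto-generated by default)
    "targetingDimension": "Profile",      # can also be a custom resource
    "dataset": "Profile_Campaign",        # default dataset created by Campaign in AEP
    "publicationStatus": "Draft",         # becomes "Published" after publishing
}

def describe(m):
    """Summarize a mapping the way the Data Mappings list might."""
    return f'{m["label"]} -> {m["dataset"]} [{m["publicationStatus"]}]'

print(describe(mapping))
```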
In case it fails, these logs also provide error definitions that explain what might have happened and what might be incorrect in the mapping, so that you can go and correct it and then work on your mapping again. Similarly, Export Jobs tells you the status of each job: how many records were sent, what the record counts were, and the time of each batch. All the logged information for a particular batch can be seen here. Each mapping created here runs every 15 minutes after it is published for the first time. The next section is Audit Fields. Audit fields tell us whether a particular record in a custom resource has changed or not. In the case of the Profile targeting dimension, the last-modified column is automatically mapped, because we know it is updated whenever there is a change in the record. But if you have created your own custom resource and there is no such column, then you need to create a field, update it whenever a record changes, and add it as an audit field. This is the value used for incremental sync: if you synced at 11:00 a.m. and then again at 11:15 a.m., this value tells us how many records changed in that 15-minute period.
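The incremental-sync idea behind audit fields can be sketched as follows. This is an assumption-laden illustration, not the connector's actual code: a hypothetical `records_to_export` helper selects only the records whose audit field (`lastModified` here) is newer than the previous sync run.

```python
from datetime import datetime, timedelta

def records_to_export(records, last_sync):
    """Return only the records modified since the previous sync run."""
    return [r for r in records if r["lastModified"] > last_sync]

now = datetime(2020, 1, 1, 11, 15)        # current run at 11:15 a.m.
last_sync = now - timedelta(minutes=15)   # previous run at 11:00 a.m.

records = [
    # changed during the window -> exported
    {"email": "a@example.com", "lastModified": datetime(2020, 1, 1, 11, 5)},
    # unchanged since before the window -> skipped
    {"email": "b@example.com", "lastModified": datetime(2020, 1, 1, 10, 30)},
]

changed = records_to_export(records, last_sync)
print(len(changed))  # 1
```

This is why a custom resource with no reliably updated audit field cannot be synced incrementally: without the timestamp, every run would have to resend everything.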
The next section is Field mappings. The fields you see already mapped are fields in the profile table itself: birthday, blacklist, email, fax, first name, last name, et cetera. These are out-of-the-box fields in the ACS profile table, mapped automatically to the right XDM field path in the dataset. If you have extended the profile table with more fields, you can simply click “Create new field mapping” and add those fields; if not, you can publish the mapping as is. If you do need “Create new field mapping”, see the tutorial on adding custom resources to learn how to use this option to add more field mappings.
And the last section is Identity Namespaces. It identifies the identities in the table you are sending. In this case Profile ID is the identity we send out of the box, but if you have more namespaces you can create them and add them here. A namespace is essentially any identity that allows a record to be identified uniquely; all such namespaces can be added and sent in this section. Let me quickly publish this. When you click Publish, a dialog comes up with a warning, along with a question about whether you want to resend the records from the beginning. The warning is specifically about adding usage labels to the XDM fields in Adobe Experience Platform. Adobe Experience Platform provides a data governance feature that helps you control whether data may be used and for which purposes, so we highly recommend that you add these labels to all the XDM fields in Adobe Experience Platform, so that when you run your use cases on AEP, it is known what the data can be used for. There is no place to add these labels from Campaign; you should go to your Platform UI and add them there. The second question the dialog asks is whether you want to resend the records from the beginning. When you are sending your records for the first time, this option does not matter: whether you answer yes or no, all records changed since the last timestamp will be ingested in any case. But if you have already ingested your data, clicking “Yes” means that all the profiles, all the information in your ACS database, will be resent. So if you have 10,000,000 profiles, all 10,000,000 profiles will be sent again.
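The identity-namespace idea can be sketched like this: each record carries one or more identities, each qualified by a namespace, so that Platform can resolve the record uniquely. The shape below mirrors XDM's `identityMap` structure, but the namespace codes and the `build_identity_map` helper are illustrative assumptions, not the connector's actual payload.

```python
def build_identity_map(identities):
    """Group (namespace, value, primary) triples into an XDM-style identityMap."""
    identity_map = {}
    for namespace, value, primary in identities:
        identity_map.setdefault(namespace, []).append(
            {"id": value, "primary": primary}
        )
    return identity_map

# One out-of-the-box identity (Profile ID) plus a hypothetical extra namespace.
im = build_identity_map([
    ("CampaignProfileID", "12345", True),
    ("Email", "ada@example.com", False),
])
print(im)
```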
But if you choose “No”, only the profiles whose timestamp has changed will be sent. If you have changed a mapping completely, or added a field such as email, so that a field which had no data sent for it before now needs its data sent, then it can be a good option to choose “Yes” and resend all your records from the beginning. Otherwise choose “No”, because ingesting your entire database takes time and will impact performance unnecessarily. So be careful about what you choose here; by default we suggest you keep it at “No.” Since I am ingesting for the first time, let me choose “Yes” and click “Okay.” A publication is now in progress; what you see in the background is the Publication section. The progress bar has reached 100% and the mapping is published. If I go back to the three dots in the bottom right corner, you see the Mapping Job Logs. In the beginning this space was empty, but now we see that validations, mandatory-field checks, et cetera were run and the mapping was successful.
So these are the fields for the out-of-the-box data mapping. You can always check the status of the data export for all your mappings from this option here. If you want to look at your datasets in AEP, you can use this button to access them, to add or modify the schema on Platform, or simply to browse. You can publish or unpublish the mapping at any time, and you can stop the mapping. The function of these various buttons is covered in a different tutorial, and hopefully that information will be useful to you.

This video gives an overview of the Adobe Experience Platform Data Connector (09:35 min)

The out-of-the-box transfer of subscription events is not supported. To transfer subscription events, you can create a corresponding XDM schema and dataset on Adobe Experience Platform, then configure a custom data mapping for this data.
Existing experience events cannot be ingested into Adobe Experience Platform, but newly generated experience events are streamed to Adobe Experience Platform on an ongoing basis.

Key steps to perform a data mapping

The following tutorials describe the key steps to perform a data mapping between Campaign Standard and Adobe Experience Platform: