Template workflow (Salesforce)

Learn how to configure the source connector for Salesforce CRM using the template workflow. This workflow auto-generates assets needed for ingesting Salesforce data based on templates. It saves you upfront time, and the assets can be customized according to your needs. This workflow is not supported for all CRM source connectors.

Transcript
In this video, I’ll show you how to use templates to auto-generate the assets needed for ingesting Salesforce data into Experience Platform. These are the areas I’ll cover. Use the blue chapter markers below the video to advance to or replay these sections. Data ingestion is a fundamental step in getting data into Experience Platform so you can use it to build robust customer profiles and provide meaningful experiences. Adobe Experience Platform lets you ingest data from external sources, and that data can be structured, labeled, and enhanced using Platform services. The focus of this video is ingesting data from Salesforce, a third-party CRM system. There are a couple of ways to do this, but I’ll demonstrate using templates that auto-generate the assets listed on this slide. There are other videos that explain schemas and identities, so if you’re unfamiliar with these topics, review those videos first. I also suggest reviewing the data ingestion overview video if you haven’t done so yet. Using templates, you reap several benefits, as shown here. In earlier versions of this data connector, getting to this step of ingestion, and thus to value, was very time-consuming: schemas, identities, datasets, mapping rules, and data flows all had to be created manually. The template workflow does all of this for you, and you can even customize the results afterwards. Let’s get to the demo. I’m logged into Experience Platform. I’ll display sources by selecting the navigation link on the left, then select the CRM category to jump to those connectors. Selecting Add data under Salesforce kicks off the workflow. There are two available paths here. I can accelerate data ingestion by using templates provided by the system to auto-generate all the assets required prior to ingestion. If I had already set up the schemas and identities for the Salesforce data I want to ingest, I’d use the second workflow. I’ll proceed with templates.
Now, my organization has pre-existing authenticated Salesforce accounts. If this is a first-time setup, start with New account. Here you’d provide the authentication credentials for your Salesforce account; the fields marked with an asterisk are required, and you’d then choose Connect to source. I’ll go back to my existing account to show you the rest of the workflow. Once I select the account name, I’m presented with a list of templates I can use to generate my assets, based on the account database used with Salesforce. There are both B2B and B2C types. It looks like some templates for B2B have already been configured: notice that the checkboxes for these data tables are grayed out, which means the assets associated with them have already been created in the system. You can open a preview to explore sample data for a template. This is helpful when you want to verify that you’re selecting the correct template for the Salesforce data table you plan to ingest. Don’t worry if this doesn’t map one-to-one with the data coming over; you can modify the mapping later in the workflow if needed. I’ll show you this soon. I’m going to select additional templates for the other Salesforce tables I want to ingest data from. I’ll select opportunities and opportunity contact roles. The schedule for data ingestion is configurable: set it to once for a one-time ingestion, or change the frequency to minute, hour, day, or week to reflect your needs for data availability. I’ll select Finish, and this is when the magic happens. The system does all the work to generate the assets that support this data ingestion, including schemas, identities, identity namespaces, datasets, and data flows. This also includes the identity relationships across the multiple schemas. This usually takes a minute or so to finish, so feel free to go off and do other things and come back. Once the assets are created, this review page displays.
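The scheduling choice described above boils down to a small decision: a one-time ingestion ("once"), or a recurring frequency of minute, hour, day, or week. The sketch below is a hypothetical illustration of that rule, not the actual Platform API; the function name and descriptor shape are assumptions for illustration only.

```python
# Hypothetical sketch (not the real Platform API): representing the
# ingestion schedule options -- "once" for a one-time ingestion, or a
# recurring frequency of minute, hour, day, or week.
ALLOWED_FREQUENCIES = {"once", "minute", "hour", "day", "week"}

def build_schedule(frequency: str, interval: int = 1) -> dict:
    """Return a simple schedule descriptor, validating the frequency."""
    if frequency not in ALLOWED_FREQUENCIES:
        raise ValueError(f"Unsupported frequency: {frequency!r}")
    if frequency == "once":
        # A one-time ingestion has no repeat interval.
        return {"frequency": "once"}
    return {"frequency": frequency, "interval": interval}

print(build_schedule("once"))             # {'frequency': 'once'}
print(build_schedule("day", interval=2))  # {'frequency': 'day', 'interval': 2}
```

If your data only needs to land once (for example, a historical backfill), "once" avoids creating a recurring flow you’d later have to disable.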
This lists the data flows, datasets, schemas, and identity namespaces created or reused for the Salesforce data tables selected in the previous step. Some of the assets are reused because they were created by previous template configurations, and those same assets serve this new template configuration. One of the benefits I discussed earlier for using templates is the acceleration to ingestion and the amount of work and time that saves. I’ll open one of the generated schemas in a new browser tab. This B2B person schema was generated by the system, and you can see the breadth and depth of the hierarchies used for the different fields. This is only one of the many schemas involved in this data ingestion. Using the manual workflow for the data connector, these would have had to be created before setting up the data flow, so that’s a lot of time and effort saved. Back in the review template assets screen, I want to point out identity namespaces. These have all been generated by the system as well; this is another task that would have had to be done up front in the manual workflow. Not only that, but the relationships of those identities across the schemas are involved. If you recall, I selected the templates for opportunities and opportunity contact roles. Even though the schemas used by these datasets were already generated through a previous Salesforce template configuration, the datasets are net new. A quick high-level call-out: the datasets contain the data, whereas the schemas validate the format of the data to ensure you’re ingesting quality data into Experience Platform. Back to data flows, there are two key features you can access from here. First, we’ll review Preview mappings. I’ll select this for the first data flow, and it opens in a modal window. This shows me the system-generated mappings for the template.
On the left side are the Salesforce fields, and on the right are the Experience Platform fields relative to the dataset and schema it’s targeting. Use this to review the field mappings; changing any mappings, though, happens elsewhere, which I’ll show you next. As well as updating mappings, you can change the draft status for a workflow. If you recall, I mentioned earlier that you can make customizations to template-generated assets. With data flows, you’ll want to make sure the mappings are validated before you set them to active so that batch uploads coming from Salesforce don’t fail. I’ll select Update dataflow from the ellipsis menu for the first data flow. Once the data flow opens, I’ll go to the data flow detail screen to point out some updates you can make there. Enable partial ingestion by toggling the setting. This lets you configure an error threshold, expressed as the percentage of acceptable errors before the entire batch fails. Toward the bottom, you can configure alerts. Alerts allow you to receive notifications on the status of your sources data flow; you can get updates when your data flow has started, is successful, has failed, or didn’t ingest any data. If you want to save these changes and come back to finish the mapping validation, select Save as draft; otherwise, select Next. On the mapping step, you can update the mappings between the source fields from Salesforce and the target fields in Experience Platform. At the top, it provides a high-level status for the number of mapped fields, required fields, and identity fields. Below that is the detail for the mappings: on the right side are the Experience Platform fields, and on the left are the Salesforce fields. Calculated fields are also automatically generated by the system, which is really helpful. When I select Next to go to the scheduling step, the mapping validation happens. Now there’s an issue with the mapping.
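The partial-ingestion behavior described above can be summarized as a simple rule: a batch succeeds (with failed rows set aside) while the error rate stays at or below the configured threshold percentage, and fails entirely once the rate exceeds it. The sketch below is a hypothetical illustration of that rule; the function and its parameters are assumptions, not Platform internals.

```python
# Hypothetical sketch of the partial-ingestion error threshold: the
# batch passes while the error rate is at or below the configured
# percentage; above it, the entire batch fails.
def batch_passes(total_rows: int, error_rows: int, threshold_pct: float) -> bool:
    """Return True if the batch is within the acceptable error threshold."""
    if total_rows == 0:
        return True  # nothing to ingest, nothing to fail
    error_pct = 100.0 * error_rows / total_rows
    return error_pct <= threshold_pct

print(batch_passes(1000, 40, threshold_pct=5.0))  # True: 4% is within 5%
print(batch_passes(1000, 80, threshold_pct=5.0))  # False: 8% exceeds 5%
```

A low threshold catches systemic data-quality problems early; a higher one tolerates scattered bad records without blocking the whole batch.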
The Currency ISO Code field from Salesforce doesn’t exist in the Experience Platform schema. If I needed this field from Salesforce, I’d go to the schema and add the field there. However, I’m going to go back and remove this field from my data flow. I’ll click the remove icon to the right of the field; note that I could have done this on the mapping validation error page as well. Now when I select Next, I no longer have a validation error. On the scheduling step, you can modify the schedule for the batch ingestion using the same frequency and calendar options we reviewed when setting up the templates earlier. I’ll select Next up here at the top. Now that everything looks good, I’ll select Finish. This takes me back to all the Salesforce data flows I’ve configured for the account. I’ve confirmed that the B2B opportunity data flow has been updated from draft to active, so now I can start receiving data from Salesforce for this data flow. You also have access to data flow features using the ellipsis menu: you can update the data flow again, disable it if you no longer need the data, or view it in monitoring once you begin to receive datasets. You should now feel comfortable configuring data flows using templates from the Salesforce data connector workflow in Experience Platform. Thanks, and good luck!
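The validation step above checks each source-to-target mapping against the fields the target schema actually defines; a mapping that points at a missing field (like the Currency ISO Code example) raises an error, and removing that mapping clears it. The sketch below illustrates that idea under assumptions: the field names and function are hypothetical, not the real template mappings or validation code.

```python
# Hypothetical sketch of the mapping-validation step: each mapping's
# target field is checked against the set of fields the target schema
# defines; mappings pointing at missing fields are reported.
def validate_mappings(mappings: dict, schema_fields: set) -> list:
    """Return target fields referenced by mappings but absent from the schema."""
    return [tgt for tgt in mappings.values() if tgt not in schema_fields]

# Illustrative field names only -- not the actual generated mappings.
schema_fields = {"opportunity.name", "opportunity.amount"}
mappings = {
    "Name": "opportunity.name",
    "Amount": "opportunity.amount",
    "CurrencyIsoCode": "opportunity.currencyCode",  # not defined in the schema
}

print(validate_mappings(mappings, schema_fields))  # ['opportunity.currencyCode']
del mappings["CurrencyIsoCode"]                    # remove the offending mapping
print(validate_mappings(mappings, schema_fields))  # []
```

As in the demo, the alternative to removing the mapping is extending the target schema with the missing field, if you actually need that data.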


