Change the data source
Learn how to change the data source of a workflow's working table using the Change Data Source activity, so you can flexibly manage data across different data sources such as FDA, FFDA, and the local database.
Transcript
Hi! In this video, you will learn when you should change the data source in a workflow, and how to do it using the Change Data Source activity. Adobe Campaign can handle a unified data model that is split across multiple data sources, such as the local database and external databases like Snowflake. In most cases, the end user does not need to know where the data is stored, as this is handled under the hood. In some use cases, however, you might need to improve query performance by running a specific query on a database engine different from the one hosting the targeting dimension. This is where the Change Data Source activity comes into play.

For example, let's assume you are running a competition for your most important customers, where the top 10 winners receive an offer code they can redeem in your online store. The workflow would be the following. First, we query the recipients for the customers with a VIP status. We then query that population for the top 10 performers in the competition. Once we have identified them, we assign each a specific offer code. These offer codes are obtained through an API: we take the 10 profiles, query the API, receive the offer codes, and loop over the working table to assign each of the 10 recipients a unique code. The offer codes are then added to the profiles in the database. This allows us to create a campaign that targets the winners and sends them an email, push message, or any other communication with the offer code.

Let's take a look at what happens in the background. The profiles are on Snowflake, the remote database, and when querying, a working table is created. The population is then limited to 10. Since no additional data source is involved in this step, we stay on Snowflake. Then a JavaScript activity runs and gets the information from an external table via API calls, and the offer codes are assigned one by one to each of the 10 recipients. Because we are not joining data from another database, this activity also runs on Snowflake. Lastly, we update the profiles. There is no need to switch to the local database; everything runs on the remote database.

The issue, however, is that unitary calls are really slow on the remote database. Snowflake is designed for bulk, big-data operations, and that is where it is really powerful, but when assigning the offer codes one by one it performs very slowly, because it is not designed for unitary calls. What we can do is move this specific activity from the external database to the local database, as the local database works very well for real-time operations and unitary calls. Before we move on, let's first take a look at the logs. You can see that assigning the codes one by one takes 2 seconds per row, so 20 seconds in total to process all recipients. And this is only for 10 recipients. Now imagine you have thousands of recipients: you would have to wait half an hour, an hour, or longer for the codes to be assigned.

To move the assign offer code activity to the local database, we simply add the Change Data Source activity before this operation. The working table is copied to the local database so that the operation can run there. Once the codes are assigned, we want to update the profiles, and the out-of-the-box behavior kicks in again: the system detects that the profiles are in another location and copies the data back to the remote database, where the profile data is located.
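To make the unitary-call pattern concrete, here is a minimal sketch of what such an offer code assignment could look like inside a JavaScript activity. It is not the exact implementation from the video: `fetchOfferCode()`, the `sOfferCode` column, and the `iRecipientId` key are hypothetical stand-ins for your own API integration and working-table schema, while `vars.targetSchema`, `vars.tableName`, `xtk.queryDef`, and `sqlExec` are standard Campaign workflow JavaScript.

```javascript
// Minimal sketch, assuming a hypothetical fetchOfferCode() API wrapper.
// Read the rows of the workflow's working table.
var query = xtk.queryDef.create(
  <queryDef schema={vars.targetSchema} operation="select">
    <select>
      <node expr="@id"/>
    </select>
  </queryDef>);
var res = query.ExecuteQuery();

// Assign the codes one by one: each iteration is a unitary call.
// This pattern is fast on the local database but slow on a remote
// engine such as Snowflake, which is optimized for bulk operations.
for each (var row in res.*) {
  var code = fetchOfferCode(row.@id);                 // hypothetical API wrapper
  var safeCode = String(code).replace(/'/g, "''");    // naive escaping, for the sketch only
  sqlExec("UPDATE " + vars.tableName +
          " SET sOfferCode='" + safeCode + "'" +      // hypothetical column
          " WHERE iRecipientId=" + parseInt(row.@id, 10)); // hypothetical key column
}
```

Each pass through the loop issues a single-row UPDATE against the working table, which is exactly the unitary-call workload described above.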
This process works seamlessly, so let's take a closer look at the workflow. In the second example, we added the Change Data Source activity. It is very easy to configure: just drag and drop the activity onto the canvas and then switch, in this case, to the local database. You can use the default, which is the local database, or select any external data source.

Now let's take a look at the logs, where you can see the difference. The offer code assignment to all 10 recipients happened in under a second, and the copy from one database to the other was really fast as well. On the local database, unitary writes are really fast; they happen in real time, almost instantly, in less than a second. So now you understand when you should move an activity to another database, and how to do this using the Change Data Source activity. Thank you for watching.
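If you want to compare the timings yourself, one option is to wrap the assignment loop with simple timestamps and write the elapsed time to the workflow log, then run the workflow with and without the Change Data Source activity in front of it. This is a hypothetical measurement sketch: `assignOfferCode()` stands in for the per-row API call and UPDATE from the previous example, while `logInfo` is the standard Campaign logging function.

```javascript
// Hypothetical timing sketch: run once with and once without the
// Change Data Source activity placed before this JavaScript activity.
var res = xtk.queryDef.create(
  <queryDef schema={vars.targetSchema} operation="select">
    <select><node expr="@id"/></select>
  </queryDef>).ExecuteQuery();

var start = new Date().getTime();
var count = 0;
for each (var row in res.*) {
  assignOfferCode(row);   // hypothetical per-row API call + UPDATE
  count++;
}
logInfo("Assigned codes to " + count + " rows in " +
        (new Date().getTime() - start) + " ms");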
See the product documentation for more information on this feature.