[Integration]{class="badge positive"}

Send audience data to Snowflake data share (batch)

Learn how to configure and use the Snowflake Batch destination in Adobe Real-Time CDP to deliver daily full-refresh audience snapshots into your Snowflake account. See the end-to-end workflow in Adobe Experience Platform, how the data appears as a dynamic table in Snowflake, and the key setup steps and validations to ensure a successful integration.

For more information, review the documentation.

Transcript

Hi, I’m Michelle, and in this video I’ll walk you through how to use the Snowflake batch destination in Adobe Real-Time CDP. We’ll start with a quick use case, then I’ll show you the end-to-end workflow in Adobe Experience Platform: configuring the Snowflake batch destination, selecting audiences and mappings, and what the activated data looks like inside Snowflake.

Let’s begin with the use case. Many customers already use Snowflake as their central data platform. They want to run analytics, modeling, and activation directly in Snowflake without duplicating large datasets into a vendor’s environment. The Snowflake batch destination lets you do exactly that. You can activate Real-Time CDP audiences into Snowflake on a daily schedule and land those audience snapshots as a dynamic table in your Snowflake account, where your teams can query them with existing tools and pipelines. You can then use that data for downstream analytics, reverse ETL, or activation workflows in the tools you already rely on.

Architecturally, data flows from Experience Platform into an Adobe-managed Snowflake account, and from there it is staged into your Snowflake account via Snowflake data sharing and dynamic tables. The result is a zero-data-copy experience for your account.

Now let’s look at the workflow in Adobe Experience Platform. On the Destinations page, under the Warehouses category, you’ll find the Snowflake batch card. I’ll select Snowflake batch. These are the familiar workflow steps. From here, I’ll configure a new destination. In the first step, I provide a name for this destination so I can easily recognize it later.

Next come the key configuration fields. The first is the Snowflake account ID, which is your Snowflake account identifier. You also select the exact cloud and region where your Snowflake instance is provisioned, such as AWS US East (N. Virginia) or AWS US West (Oregon). If you’re unsure which region to choose, you can look it up directly in Snowflake: in the Snowflake UI, under your account selector, you’ll see both the cloud provider and region for each account you have access to. Back in Experience Platform, once the account ID and region are set, you are also asked to explicitly acknowledge that this is the correct account. If the account ID or region is wrong, the data share will fail. From here, the workflow is consistent with other Real-Time CDP destinations. In the Select audiences step, I choose the audiences I want to export to Snowflake. Audiences from all Real-Time CDP audience sources are supported, as long as they’re profile-based.
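The authoritative way to confirm your account and region is in Snowflake itself, for example by running `SELECT CURRENT_ACCOUNT(), CURRENT_REGION();` in a worksheet. As a minimal sketch, a team could also keep a lookup from the cloud/region labels shown in the destination UI to Snowflake region identifiers; the two entries below are examples only, not an exhaustive or authoritative list:

```python
# Illustrative mapping from cloud/region labels like those in the
# destination UI to Snowflake region identifiers. Example entries only;
# confirm the real values with CURRENT_REGION() in your account.
AEP_TO_SNOWFLAKE_REGION = {
    ("AWS", "US East (N. Virginia)"): "AWS_US_EAST_1",
    ("AWS", "US West (Oregon)"): "AWS_US_WEST_2",
}

def snowflake_region(cloud: str, region_label: str) -> str:
    """Return the Snowflake region identifier for a UI cloud/region pair."""
    try:
        return AEP_TO_SNOWFLAKE_REGION[(cloud, region_label)]
    except KeyError as exc:
        raise ValueError(f"Unknown cloud/region: {cloud} / {region_label}") from exc

print(snowflake_region("AWS", "US West (Oregon)"))  # AWS_US_WEST_2
```

Checking the pair before saving the destination configuration is one way to catch the account ID/region mismatch that would otherwise make the data share fail.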

Schedule: I pick how often to refresh the audience in Snowflake. The Snowflake batch destination uses a batch model with a daily refresh. Once per day, a full snapshot of the audience is exported. On each run, the destination overwrites the previous snapshot in the dynamic table, so what you see in Snowflake is always the latest full membership for that audience. You can schedule the export at a specific time of day, or run it after the daily segment evaluation completes in Real-Time CDP.

Mapping: I map the profile attributes I care about. Each mapped attribute becomes a column in the Snowflake dynamic table. After mapping, I move to the review step. Once I select Finish, Experience Platform creates the dataflow and sets up the Snowflake integration behind the scenes.

Let’s switch over to Snowflake to see what this looks like from your data team’s point of view. After the first activation run, you’ll see a private listing from Adobe in your Snowflake account for the Snowflake batch destination. When you accept that listing, Adobe can create the dynamic table in your account. Once accepted, Adobe’s internal Snowflake environment uses Snowflake data sharing and dynamic tables to populate a table in your database that represents your Real-Time CDP audience export.

In a worksheet, the data typically includes: a timestamp column indicating when each row was last updated; a merge policy ID column so you can see which merge policy the audience was generated under; one column for each attribute you mapped in the Experience Platform UI; and a column representing audience membership status, for example, whether the profile is currently active in that audience. Because this is a daily full refresh, the table always reflects the current full membership of the audiences you’ve activated. Yesterday’s snapshot is replaced by today’s, which simplifies downstream analytics and avoids accumulating stale rows.
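A downstream query over that table shape can be sketched as follows. This is an illustration only: the table name, the `TIMESTAMP`, `MERGE_POLICY_ID`, and `AUDIENCE_STATUS` column names, and the `'active'` status value are assumptions for the example, and your actual attribute columns come from whatever you mapped in the Experience Platform UI.

```python
def build_audience_query(table: str, mapped_columns: list[str]) -> str:
    """Compose a simple query over the exported dynamic table.

    TIMESTAMP, MERGE_POLICY_ID, and AUDIENCE_STATUS mirror the column
    structure described above; mapped_columns are the attributes mapped
    in the Experience Platform UI. All names here are illustrative.
    """
    cols = ", ".join(["TIMESTAMP", "MERGE_POLICY_ID", *mapped_columns])
    return (
        f"SELECT {cols} FROM {table} "
        "WHERE AUDIENCE_STATUS = 'active' LIMIT 100"
    )

# Hypothetical table and mapped attributes, for illustration.
print(build_audience_query("CDP_DB.PUBLIC.LOYALTY_AUDIENCE", ["EMAIL", "CITY"]))
```

Because each daily run overwrites the previous snapshot, a query like this always returns the current full membership with no date filtering needed.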
From here, your teams are free to join this audience table with other fact and dimension tables in Snowflake, feed it to dashboards, models, or reverse ETL pipelines, and use it alongside other Snowflake-based projects you already have in place.

Before we close, let’s recap the critical setup steps.

First, get the Snowflake account ID and region right. Confirm the account ID and region with your Snowflake admin or directly in the Snowflake UI. In the destination configuration, enter that account ID, choose the matching region, and check the acknowledgement box. If this is wrong, the share can’t complete and you’ll see errors when activation is attempted.

Second, complete the Experience Platform destination workflow end-to-end. Create the Snowflake batch destination, add at least one audience, configure the daily schedule, and define your mappings. Then save and run the first activation.

Third, accept the private listing in Snowflake. In your Snowflake account, a private listing from Adobe appears for the Snowflake batch destination. Accepting that listing is what allows Adobe to create the dynamic table in your target database.

Fourth, validate the data in Snowflake. After the first run completes, query the dynamic table to check that your mapped fields appear as columns, and confirm that the timestamp and audience membership values look correct.

Once these four steps are done, the destination refreshes itself automatically every day, and your Snowflake users can treat the dynamic table like any other source for analytics or activation. This gives you a simple, governed path from Real-Time CDP audiences to Snowflake, aligned with Adobe’s broader Zero Data Copy strategy.
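The validation step above can be sketched as a small check over rows fetched from the dynamic table. This is a minimal sketch under assumptions: the rows are represented as dicts (as you might get from the Snowflake Python connector), and the `TIMESTAMP` and `AUDIENCE_STATUS` column names are illustrative, with your real attribute columns coming from your own mapping.

```python
from datetime import datetime, timezone

def validate_export(rows, expected_columns):
    """Sanity-check the first activation run, per the recap above.

    rows: records fetched from the dynamic table, represented here as
    dicts. expected_columns: the attributes you mapped in the destination
    workflow. Column names are illustrative, not product-defined.
    """
    assert rows, "dynamic table is empty - did the first run complete?"
    missing = set(expected_columns) - set(rows[0])
    assert not missing, f"mapped fields missing as columns: {missing}"
    # Each row should carry a last-updated timestamp and a membership status.
    for row in rows:
        assert "TIMESTAMP" in row and "AUDIENCE_STATUS" in row
    return True

# A tiny illustrative result set.
rows = [
    {"TIMESTAMP": datetime(2024, 5, 1, tzinfo=timezone.utc),
     "MERGE_POLICY_ID": "mp-1", "EMAIL": "a@example.com",
     "AUDIENCE_STATUS": "active"},
]
print(validate_export(rows, ["EMAIL"]))  # True
```

A check like this can run after each daily refresh so a failed or partial export is caught before dashboards or reverse ETL pipelines consume the table.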

recommendation-more-help
9051d869-e959-46c8-8c52-f0759cee3763