[Beta]{class="badge informative"} [Ultimate]{class="badge positive"}
Snowflake Batch connection snowflake-destination
Overview overview
Use this destination to send audience data into dynamic tables in your Snowflake account. Dynamic tables provide access to your data without requiring physical data copies.
Read the following sections to understand how the Snowflake destination works and how data is transferred between Adobe and Snowflake.
How Snowflake data sharing works data-sharing
This destination uses a Snowflake data share, which means that no data is physically exported or transferred to your own Snowflake instance. Instead, Adobe grants you read-only access to a live table hosted within Adobe’s Snowflake environment. You can query this shared table directly from your Snowflake account, but you do not own the table and cannot modify or retain it beyond the specified retention period. Adobe fully manages the lifecycle and structure of the shared table.
The first time you set up a dataflow from Adobe to your Snowflake account, you are prompted to accept the private listing from Adobe.
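Once you accept the listing, the shared database becomes visible in your account and you can query it like any other read-only table. The following minimal sketch uses the snowflake-connector-python package; the connection parameters and the `AEP_SHARE_DB.PUBLIC.AUDIENCE_TABLE` name are placeholders, so substitute the database and table names that appear in your own account:

```python
# Minimal sketch: query the Adobe-hosted shared table from your account.
# All names below are placeholders - replace them with the database and
# table created when you accepted Adobe's private listing.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<your_account_identifier>",
    user="<your_user>",
    password="<your_password>",
    warehouse="<your_warehouse>",
)
cur = conn.cursor()

# The share is read-only: SELECT statements work, but writes against
# the shared table are rejected by Snowflake.
cur.execute("SELECT * FROM AEP_SHARE_DB.PUBLIC.AUDIENCE_TABLE LIMIT 10")
for row in cur.fetchall():
    print(row)
```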
Data retention and Time-to-Live (TTL) ttl
All data shared through this integration has a fixed Time-to-Live (TTL) of seven days. Seven days after the last export, the dynamic table automatically expires and becomes inaccessible, regardless of whether the dataflow is still active. If you need to retain the data for longer than seven days, you must copy the contents into a table that you own in your own Snowflake instance before the TTL expires.
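If you need to keep a snapshot past the TTL, one straightforward approach is a `CREATE TABLE ... AS SELECT` statement that copies the shared data into a database you own. The sketch below reuses the cursor from the previous example; `MY_DB.MY_SCHEMA.AUDIENCE_SNAPSHOT` is a placeholder for a table in your own account:

```python
# Hedged sketch: persist the shared data before the seven-day TTL expires.
# MY_DB.MY_SCHEMA.AUDIENCE_SNAPSHOT is a placeholder target table that you
# own; AEP_SHARE_DB.PUBLIC.AUDIENCE_TABLE is the placeholder shared table.
cur.execute("""
    CREATE OR REPLACE TABLE MY_DB.MY_SCHEMA.AUDIENCE_SNAPSHOT AS
    SELECT * FROM AEP_SHARE_DB.PUBLIC.AUDIENCE_TABLE
""")
```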
Audience update behavior audience-update-behavior
If your audience is evaluated in batch mode, the data in the shared table is refreshed every 24 hours. This means there may be a delay of up to 24 hours between changes in audience membership and when those changes are reflected in the shared table.
Batch data sharing logic batch-data-sharing
When a dataflow runs for an audience for the first time, it performs a backfill and shares all currently qualified profiles. After this initial backfill, the destination provides periodic snapshots of the complete audience membership. Each snapshot replaces the previous data in the shared table, ensuring that you always see the latest complete view of the audience without historical data.
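Because each snapshot fully replaces the previous one, a simple way to confirm that a fresh snapshot has landed is to check the maximum value of the TS timestamp column (described in the Data structure section below). Table names are placeholders as in the earlier sketches:

```python
# Check when the shared table was last refreshed; TS is the snapshot
# timestamp column documented under Data structure.
cur.execute("SELECT MAX(TS) FROM AEP_SHARE_DB.PUBLIC.AUDIENCE_TABLE")
print("Last snapshot refresh:", cur.fetchone()[0])
```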
Streaming versus batch data sharing batch-vs-streaming
Experience Platform provides two types of Snowflake destinations: Snowflake Streaming and Snowflake Batch.
While both destinations give you access to your data in Snowflake in a zero-copy manner, each connector is recommended for different use cases.
The table below helps you decide which connector to use by outlining the scenarios where each data sharing method is most appropriate.
For more information about streaming data sharing, see the Snowflake Streaming connection documentation.
Use cases use-cases
Batch data sharing is ideal for scenarios where you need a complete snapshot of your audience and real-time updates are not required, such as:
- Analytical workloads: When performing data analysis, reporting, or business intelligence tasks that require a complete view of audience membership
- Machine learning workflows: For training ML models or running predictive analytics that benefit from complete audience snapshots
- Data warehousing: When you need to maintain a current copy of audience data in your own Snowflake instance
- Periodic reporting: For regular business reporting where you need the latest audience state without historical change tracking
- ETL processes: When you need to transform or process audience data in batches
Batch data sharing simplifies data management by providing complete snapshots, eliminating the need to manage incremental updates or merge changes manually.
Supported audiences supported-audiences
This section describes which types of audiences you can export to this destination. The two tables below indicate which audiences this connector supports, by audience origin and profile types included in the audience:
This category includes all audience origins outside of audiences generated through the Segmentation Service. Read about the various audience origins. Some examples include:
- custom upload audiences imported into Experience Platform from CSV files,
- look-alike audiences,
- federated audiences,
- audiences generated in other Experience Platform apps such as Adobe Journey Optimizer,
- and more.
Supported audiences by audience data type:
Export type and frequency export-type-frequency
Refer to the table below for information about the destination export type and frequency.
Connect to the destination connect
To connect to this destination, follow the steps described in the destination configuration tutorial. In the configure destination workflow, fill in the fields listed in the two sections below.
Authenticate to destination authenticate
To authenticate to the destination, select Connect to destination and provide an account name and, optionally, an account description.
Fill in destination details destination-details
To configure details for the destination, fill in the required and optional fields below. An asterisk next to a field in the UI indicates that the field is required.
- Name: A name by which you will recognize this destination in the future.
- Description: A description that will help you identify this destination in the future.
- Snowflake Account ID: Your Snowflake account ID. Use the following Account ID format, depending on whether your account is linked to an organization:
  - If your account is linked to an organization: `OrganizationName.AccountName`.
  - If your account is not linked to an organization: `AccountName`.
- Account acknowledgment: Toggle on the Snowflake Account ID acknowledgment to confirm that your Account ID is correct and that it belongs to you.
Note: Special characters in your destination and sandbox names are converted to underscores (`_`) in Snowflake. To avoid confusion, do not use any special characters in your destination and sandbox name.
Enable alerts enable-alerts
You can enable alerts to receive notifications on the status of the dataflow to your destination. Select an alert from the list to subscribe to notifications on the status of your dataflow. For more information on alerts, read the guide on subscribing to destinations alerts using the UI.
When you are finished providing details for your destination connection, select Next.
Activate audiences to this destination activate
- To activate data, you need the View Destinations, Activate Destinations, View Profiles, and View Segments access control permissions. Read the access control overview or contact your product administrator to obtain the required permissions.
- To export identities, you need the View Identity Graph access control permission.
Read Activate audience data to batch profile export destinations for instructions on activating audiences to this destination.
Map attributes map
You can export identities and profile attributes to this destination.
You can use the calculated fields control to export and perform operations on arrays.
The target attributes are automatically created in Snowflake using the attribute name that you provide in the Attribute name field.
Exported data / Validate data export exported-data
The data is staged into your Snowflake account via a dynamic table. Check your Snowflake account to verify that the data was exported correctly.
Data structure data-structure
The dynamic table contains the following columns:
- TS: A timestamp column that represents when each row was last updated.
- Mapping attributes: Every mapping attribute that you select during the activation workflow is represented as a column header in Snowflake.
- Audience membership: Membership in any audience mapped to the dataflow is indicated by an `active` entry in the corresponding cell.
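As an illustration of this structure, the query below counts the profiles currently in one mapped audience. `MY_AUDIENCE_NAME` stands in for the column created for your audience, and the table name remains the placeholder used in the earlier sketches:

```python
# Count profiles whose membership cell for one audience column is 'active'.
# MY_AUDIENCE_NAME and the table name are placeholders.
cur.execute("""
    SELECT COUNT(*)
    FROM AEP_SHARE_DB.PUBLIC.AUDIENCE_TABLE
    WHERE MY_AUDIENCE_NAME = 'active'
""")
print("Active members:", cur.fetchone()[0])
```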
Known limitations known-limitations
Multiple merge policies
Audiences with multiple merge policies are not supported in a single dataflow. Different merge policies produce different snapshots, and in practice, data related to one audience would be overwritten by the data from the other audience, instead of data from both being exported as expected.
Data usage and governance data-usage-governance
All Adobe Experience Platform destinations are compliant with data usage policies when handling your data. For detailed information on how Adobe Experience Platform enforces data governance, read the Data Governance overview.