[Beta]{class="badge informative"} [Ultimate]{class="badge positive"}

Snowflake Batch connection snowflake-destination

IMPORTANT
This destination connector is in beta and only available to Real-Time CDP Ultimate customers. The functionality and documentation are subject to change.

Overview overview

Use this destination to send audience data into dynamic tables in your Snowflake account. Dynamic tables provide access to your data without requiring physical data copies.

Read the following sections to understand how the Snowflake destination works and how data is transferred between Adobe and Snowflake.

How Snowflake data sharing works data-sharing

This destination uses a Snowflake data share, which means that no data is physically exported or transferred to your own Snowflake instance. Instead, Adobe grants you read-only access to a live table hosted within Adobe’s Snowflake environment. You can query this shared table directly from your Snowflake account, but you do not own the table and cannot modify or retain it beyond the specified retention period. Adobe fully manages the lifecycle and structure of the shared table.
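
For example, once you accept the listing and mount the share as a database in your Snowflake account, you can query the shared table like any other read-only table. The database, schema, and table names below are illustrative placeholders; the actual names depend on how you mount the share:

```sql
-- AEP_SHARE_DB, PUBLIC, and AUDIENCE_EXPORT are illustrative placeholders;
-- substitute the names from your mounted share.
SELECT *
FROM AEP_SHARE_DB.PUBLIC.AUDIENCE_EXPORT
LIMIT 10;
```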

The first time you set up a dataflow from Adobe to your Snowflake account, you are prompted to accept the private listing from Adobe.

Screenshot showing the Snowflake private listing acceptance screen

Data retention and Time-to-Live (TTL) ttl

All data shared through this integration has a fixed Time-to-Live (TTL) of seven days. Seven days after the last export, the dynamic table automatically expires and becomes inaccessible, regardless of whether the dataflow is still active. If you need to retain the data for longer than seven days, you must copy the contents into a table that you own in your own Snowflake instance before the TTL expires.
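
For example, here is a minimal sketch of copying the shared data into a table that you own before the TTL expires. All object names are illustrative placeholders:

```sql
-- Copy the shared data into a table you own so it survives the seven-day TTL.
-- MY_DB.MY_SCHEMA.AUDIENCE_COPY and AEP_SHARE_DB.PUBLIC.AUDIENCE_EXPORT are
-- illustrative placeholders.
CREATE OR REPLACE TABLE MY_DB.MY_SCHEMA.AUDIENCE_COPY AS
SELECT *
FROM AEP_SHARE_DB.PUBLIC.AUDIENCE_EXPORT;
```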

IMPORTANT
Deleting a dataflow in Experience Platform removes the corresponding dynamic table from your Snowflake account.

Audience update behavior audience-update-behavior

If your audience is evaluated in batch mode, the data in the shared table is refreshed every 24 hours. This means there may be a delay of up to 24 hours between changes in audience membership and when those changes are reflected in the shared table.
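
To see when the shared table was last refreshed, you can inspect the TS timestamp column described in the Data structure section below. The table name is an illustrative placeholder:

```sql
-- Return the timestamp of the most recent snapshot refresh.
SELECT MAX(TS) AS last_refreshed
FROM AEP_SHARE_DB.PUBLIC.AUDIENCE_EXPORT;
```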

Batch data sharing logic batch-data-sharing

When a dataflow runs for an audience for the first time, it performs a backfill and shares all currently qualified profiles. After this initial backfill, the destination provides periodic snapshots of the complete audience membership. Each snapshot replaces the previous data in the shared table, ensuring that you always see the latest complete view of the audience without historical data.

Streaming versus batch data sharing batch-vs-streaming

Experience Platform provides two types of Snowflake destinations: Snowflake Streaming and Snowflake Batch.

While both destinations give you access to your data in Snowflake in a zero-copy manner, each connector is recommended for different use cases.

The table below outlines the scenarios where each data sharing method is most appropriate, to help you decide which connector to use.

|  | Choose Snowflake Batch when you need | Choose Snowflake Streaming when you need |
|---|---|---|
| Update frequency | Periodic snapshots | Continuous updates in real-time |
| Data presentation | Complete audience snapshot that replaces previous data | Incremental updates based on profile changes |
| Use case focus | Analytical/ML workloads where latency is not critical | Immediate action scenarios requiring real-time updates |
| Data management | Always see the latest complete snapshot | Incremental updates based on audience membership changes |
| Example scenarios | Business reporting, data analysis, ML model training | Marketing campaign suppression, real-time personalization |

For more information about streaming data sharing, see the Snowflake Streaming connection documentation.

Use cases use-cases

Batch data sharing is ideal for scenarios where you need a complete snapshot of your audience and real-time updates are not required, such as:

  • Analytical workloads: When performing data analysis, reporting, or business intelligence tasks that require a complete view of audience membership
  • Machine learning workflows: For training ML models or running predictive analytics that benefit from complete audience snapshots
  • Data warehousing: When you need to maintain a current copy of audience data in your own Snowflake instance
  • Periodic reporting: For regular business reporting where you need the latest audience state without historical change tracking
  • ETL processes: When you need to transform or process audience data in batches

Batch data sharing simplifies data management by providing complete snapshots, eliminating the need to manage incremental updates or merge changes manually.
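
For the data warehousing and ETL scenarios above, one option is a scheduled Snowflake task that refreshes a copy of the snapshot in a table you own once per day. This is a sketch only; the warehouse, schedule, and all object names are assumptions to adapt to your environment:

```sql
-- Refresh a locally owned copy of the audience snapshot once per day.
-- MY_WH and all object names are illustrative placeholders.
CREATE OR REPLACE TASK MY_DB.MY_SCHEMA.REFRESH_AUDIENCE_COPY
  WAREHOUSE = MY_WH
  SCHEDULE = 'USING CRON 0 6 * * * UTC'  -- daily at 06:00 UTC
AS
  CREATE OR REPLACE TABLE MY_DB.MY_SCHEMA.AUDIENCE_COPY AS
  SELECT * FROM AEP_SHARE_DB.PUBLIC.AUDIENCE_EXPORT;

-- Tasks are created in a suspended state; resume the task to start the schedule.
ALTER TASK MY_DB.MY_SCHEMA.REFRESH_AUDIENCE_COPY RESUME;
```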

Supported audiences supported-audiences

This section describes which types of audiences you can export to this destination. The two tables below indicate which audiences this connector supports, by audience origin and profile types included in the audience:

| Audience origin | Supported | Description |
|---|---|---|
| Segmentation Service | Yes | Audiences generated through the Experience Platform Segmentation Service. |
| All other audience origins | No | This category includes all audience origins outside of audiences generated through the Segmentation Service. Read about the various audience origins. Some examples include: <ul><li>custom upload audiences imported into Experience Platform from CSV files</li><li>look-alike audiences</li><li>federated audiences</li><li>audiences generated in other Experience Platform apps such as Adobe Journey Optimizer</li><li>and more</li></ul> |

Supported audiences by audience data type:

| Audience data type | Supported | Description | Use cases |
|---|---|---|---|
| People audiences | Yes | Based on customer profiles, allowing you to target specific groups of people for marketing campaigns. | Frequent buyers, cart abandoners |
| Account audiences | No | Target individuals within specific organizations for account-based marketing strategies. | B2B marketing |
| Prospect audiences | No | Target individuals who are not yet customers but share characteristics with your target audience. | Prospecting with third-party data |
| Dataset exports | No | Collections of structured data stored in the Adobe Experience Platform Data Lake. | Reporting, data science workflows |

Export type and frequency export-type-frequency

Refer to the table below for information about the destination export type and frequency.

| Item | Type | Notes |
|---|---|---|
| Export type | Audience export | You are exporting all members of an audience with the identifiers (name, phone number, or others) used in the Snowflake destination. |
| Export frequency | Batch | This destination provides periodic snapshots of complete audience membership through Snowflake data sharing. Each snapshot replaces the previous data, ensuring that you always have the latest complete view of your audience. |

Connect to the destination connect

IMPORTANT
To connect to the destination, you need the View Destinations and Manage Destinations access control permissions. Read the access control overview or contact your product administrator to obtain the required permissions.

To connect to this destination, follow the steps described in the destination configuration tutorial. In the configure destination workflow, fill in the fields listed in the two sections below.

Authenticate to destination authenticate

To authenticate to the destination, select Connect to destination and provide an account name and, optionally, an account description.

Sample screenshot showing how to authenticate to the destination

Fill in destination details destination-details

To configure details for the destination, fill in the required and optional fields below. An asterisk next to a field in the UI indicates that the field is required.

Sample screenshot showing how to fill in details for your destination

  • Name: A name by which you will recognize this destination in the future.

  • Description: A description that will help you identify this destination in the future.

  • Snowflake Account ID: Your Snowflake account ID. Use the following Account ID format, depending on whether your account is linked to an organization (if you are unsure of these values, see the lookup query after this list):

    • If your account is linked to an organization: OrganizationName.AccountName.
    • If your account is not linked to an organization: AccountName.
  • Account acknowledgment: Toggle on the Snowflake Account ID acknowledgment to confirm that your Account ID is correct and that it belongs to you.
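
If you are unsure of your organization and account names, you can look them up with standard Snowflake context functions:

```sql
-- Returns the two parts of the OrganizationName.AccountName format.
SELECT CURRENT_ORGANIZATION_NAME() AS organization_name,
       CURRENT_ACCOUNT_NAME() AS account_name;
```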

IMPORTANT
Special characters used in the destination name and Experience Platform sandbox name are automatically converted to underscores (_) in Snowflake. To avoid confusion, do not use special characters in your destination name or sandbox name.

Enable alerts enable-alerts

You can enable alerts to receive notifications on the status of the dataflow to your destination. Select an alert from the list to subscribe to receive notifications on the status of your dataflow. For more information on alerts, read the guide on subscribing to destinations alerts using the UI.

When you are finished providing details for your destination connection, select Next.

Activate audiences to this destination activate

IMPORTANT

Read Activate audience data to batch profile export destinations for instructions on activating audiences to this destination.

Map attributes map

You can export identities and profile attributes to this destination.

Experience Platform user interface image showing the mapping screen for the Snowflake destination.

You can use the calculated fields control to export and perform operations on arrays.

The target attributes are automatically created in Snowflake using the attribute name that you provide in the Attribute name field.

Exported data / Validate data export exported-data

The data is staged into your Snowflake account via a dynamic table. Check your Snowflake account to verify that the data was exported correctly.
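
For example, here is a quick validation sketch, using the same illustrative placeholder names as in the earlier examples:

```sql
-- Confirm that the export landed by counting the exported rows.
SELECT COUNT(*) AS exported_rows
FROM AEP_SHARE_DB.PUBLIC.AUDIENCE_EXPORT;
```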

Data structure data-structure

The dynamic table contains the following columns:

  • TS: A timestamp column that represents when each row was last updated
  • Mapping attributes: Every mapping attribute that you select during the activation workflow is represented as a column header in Snowflake
  • Audience membership: Membership in any audience mapped to the dataflow is indicated by an active entry in the corresponding cell

Screenshot showing the Snowflake interface with dynamic table data
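
For example, assuming you mapped an attribute named EMAIL and activated an audience whose membership column appears as CART_ABANDONERS (both hypothetical names), you could select the members of that audience as follows:

```sql
-- EMAIL and CART_ABANDONERS are hypothetical column names; the actual columns
-- match the attributes and audiences you mapped in the activation workflow.
SELECT EMAIL, TS
FROM AEP_SHARE_DB.PUBLIC.AUDIENCE_EXPORT
WHERE CART_ABANDONERS IS NOT NULL;
```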

Known limitations known-limitations

Multiple merge policies

Audiences with multiple merge policies are not supported in a single dataflow. Different merge policies produce different snapshots, so in practice the data for one audience would overwrite the data for the other, instead of both audiences being exported as expected.

Data usage and governance data-usage-governance

All Adobe Experience Platform destinations are compliant with data usage policies when handling your data. For detailed information on how Adobe Experience Platform enforces data governance, read the Data Governance overview.
