Configure the Azure Blob Destination
Last update: April 30, 2025
Topics:
- Destinations
CREATED FOR:
- Intermediate
- User
- Admin
Learn how to set up a connection and send data to the Azure Blob Storage destination in Real-Time Customer Data Platform. This destination supports exporting datasets and audiences, and allows you to customize the file headers and data attributes.
For more information, please visit the documentation.
Transcript
In this video, I’ll show you how to configure a connection to the Microsoft Azure Blob destination and then send data using the Experience Platform user interface. Let’s get started by establishing the connection first. I’m logged into Experience Platform and have selected Destinations under Connections. This opens the Catalog view. Next, I’ll scroll to the Categories section and select Cloud Storage. Here is the destination card for Azure Blob Storage. Since I don’t have any previous connections saved, I’ll select Set up. This opens the workflow, which begins with configuring the connection and ends with sending data to the storage account.

The first thing I need to do is enter the connection string. I’ll quickly switch to a slide that shows you where to get this from the Azure portal once you have a storage account configured. When you open the storage account, select Access Keys under Security & Networking. Reveal the connection string and copy it. It starts with DefaultEndpointsProtocol, and make sure you copy the entire string. Now I’ll paste this string into the text field in the workflow. Adding an encryption key to attach to your exported files is optional. I won’t be doing this for my demonstration, but I highly recommend you do. You can install open-source tools that take care of creating the public and private keys, but it’s the public key value that gets pasted here. To finish the connection, I’ll click Connect to Destination. The connection is now confirmed. You’d receive an error message if it were unsuccessful.

Additional fields now appear on this configuration view. These are the storage account destination details. This destination supports sending datasets, prospects, and audiences from Experience Platform. There’s a separate video about sending datasets if you’re interested in learning about that. I’ll be sending an audience for my demo, so I’ll keep that selection. Then I’ll fill in the name, description, folder path, and container fields. The last two fields specify the destination folder and container in the storage account; you can get these values from the Azure portal. For the file type, I have two options: JSON and Parquet. I’ll choose JSON. Toward the bottom, I can choose a compression format, which I’ll do now. The available choices depend on the file type selected above. If you’re working with the production sandbox, you’ll also have the option to set up alerts. I’m finished with these inputs, so I’ll select Next in the top right to move forward.

This step of the Configure New Destination workflow prompts me to select the marketing action appropriate for this connection. I’ll choose Email Targeting from the list and then select Next at the top. Because I chose audiences in the destination details step, I’m presented with the list of audiences in my Experience Platform sandbox. If you have a lot of audiences in your sandbox, you can use the search bar to filter the list. Since I’m using a development sandbox, I don’t have many. I’ll choose the Luma customers with level gold or above, and then select Next to move forward.

On the Scheduling step, I can specify whether I’m exporting new audiences or ones that have already been activated. Let’s review the scheduling options. Here you can decide whether you want to export the full file as a one-time operation, or export incremental files that contain new data for people who become part of the audience over time. Next, I can choose a frequency setting and a start time.
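The optional encryption step mentioned above relies on a public/private key pair created with open-source tooling. As a minimal, hedged sketch of one way to do that, the Python snippet below uses the third-party python-gnupg package (which wraps a locally installed GnuPG binary) to generate an RSA key pair and export the ASCII-armored public key value that would be pasted into the workflow. The name, email, and passphrase are placeholders; check the destination documentation for the exact key format the UI field expects.

```python
# pip install python-gnupg  (also requires the GnuPG binary on the machine)
import gnupg

# Placeholder home directory for the demo keyring; adjust for your environment.
gpg = gnupg.GPG(gnupghome="./gnupg-demo")

# Generate an RSA key pair; all identity values here are hypothetical.
key_input = gpg.gen_key_input(
    key_type="RSA",
    key_length=4096,
    name_real="Export Encryption",
    name_email="data-exports@example.com",
    passphrase="replace-with-a-strong-passphrase",
)
key = gpg.gen_key(key_input)

# Export only the public key; this armored block is what gets copied into
# the optional encryption key field. The private key stays with you for
# decrypting the exported files later.
public_key = gpg.export_keys(key.fingerprint)
print(public_key)
```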
You can specify a precise time, or send the data after Experience Platform completes segment evaluation. This option ensures that the most up-to-date audience profiles are exported after the daily batch segmentation job finishes. I could customize the start date as well. You can also modify the file name sent to the storage account. The first item is a file name preview, and below that there’s a lot of flexibility to append settings to the file name, choose a different date and time format to append, or add custom text. I’m going to keep the standard file name for my demo, so I’ll cancel out of this step. Once I’m done with the scheduling settings, I’ll go to the next step.

The Mapping step lets you customize the data going out. It presents you with recommendations to map the source field, which is the XDM source field from the schema, to the target attribute fields. You can add a calculated field as well; I encourage you to explore all of the options available to you there. You can also add a new mapping field. If you want to send additional attributes beyond the recommended field mappings, you do that by opening this modal and choosing the fields. I’m going to keep most of the recommended fields, except the last one, which I’ll remove by clicking the Delete Mapping icon next to it. Mandatory attributes ensure that exported profiles contain specific attributes, like the email address, and that only profiles containing those attributes are exported. Deduplication keys, on the other hand, help identify and handle duplicate records, allowing you to specify the fields used to identify duplicates so that only unique records are kept. I won’t make any further changes, so I’ll select Next above.

This is the review step, where I can verify my settings. Once everything looks good, I’ll select Finish in the upper right corner. That’s it. I receive a confirmation that my destination has been successfully saved. After this, you would connect to the Azure Storage account to confirm that you see the files there. This concludes the demo for configuring and sending Experience Platform data to the Azure Blob destination. Thanks for watching!
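The transcript closes by suggesting you connect to the Azure Storage account and confirm the exported files arrived. If you prefer to check programmatically rather than through the Azure portal, here is a minimal sketch using the azure-storage-blob Python package. The connection string literal illustrates the DefaultEndpointsProtocol format mentioned earlier; the account name, account key, container, and folder path are placeholders you would replace with the values used in the destination workflow.

```python
# pip install azure-storage-blob
from azure.storage.blob import BlobServiceClient

# Placeholder connection string in the format copied from the Azure portal
# (Access Keys). Replace the bracketed values with your own.
CONNECTION_STRING = (
    "DefaultEndpointsProtocol=https;"
    "AccountName=<storage-account-name>;"
    "AccountKey=<storage-account-key>;"
    "EndpointSuffix=core.windows.net"
)
CONTAINER = "<container-name>"      # container entered in the destination details
FOLDER_PATH = "<folder-path>/"      # folder path entered in the destination details

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container_client = service.get_container_client(CONTAINER)

if not container_client.exists():
    raise SystemExit(f"Container '{CONTAINER}' not found - check the destination details.")

# List the exported files (JSON, in this demo) under the configured folder path.
for blob in container_client.list_blobs(name_starts_with=FOLDER_PATH):
    print(f"{blob.name}\t{blob.size} bytes\tlast modified {blob.last_modified}")
```

Running this after the scheduled export completes should show the generated audience files; an empty listing usually means the export has not run yet or the folder path does not match the one configured in the workflow.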