# Activate audiences to file-based destinations by using the Flow Service API
Use the enhanced file export capabilities to access additional customization functionality when exporting files out of Experience Platform:
- Additional file naming options.
- Ability to set custom file headers in your exported files via the improved mapping step.
- Ability to select the file type of the exported file.
- Ability to customize the formatting of exported CSV data files.
This functionality is supported by the six cloud storage cards listed below:

- Amazon S3
- Azure Blob Storage
- Azure Data Lake Storage Gen2 (ADLS Gen2)
- Data Landing Zone (DLZ)
- Google Cloud Storage
- SFTP

This article explains the workflow required to use the Flow Service API to export qualified profiles from Adobe Experience Platform to one of the cloud storage locations listed above.
## Getting started {#get-started}
This guide requires a working understanding of the following components of Adobe Experience Platform:
- Experience Data Model (XDM) System: The standardized framework by which Experience Platform organizes customer experience data.
- Segmentation Service: Adobe Experience Platform Segmentation Service allows you to build audiences in Adobe Experience Platform from your Real-Time Customer Profile data.
- Sandboxes: Experience Platform provides virtual sandboxes which partition a single Platform instance into separate virtual environments to help develop and evolve digital experience applications.
The following sections provide additional information that you need to know in order to activate data to file-based destinations in Platform.
### Required permissions {#permissions}
To export profiles, you need the View Destinations, Activate Destinations, View Profiles, and View Segments access control permissions. Read the access control overview or contact your product administrator to obtain the required permissions.
To export identities, you need the View Identity Graph access control permission.
{width="100" modal="regular"}
### Reading sample API calls {#reading-sample-api-calls}
This tutorial provides example API calls to demonstrate how to format your requests. These include paths, required headers, and properly formatted request payloads. Sample JSON returned in API responses is also provided. For information on the conventions used in documentation for sample API calls, see the section on how to read example API calls in the Experience Platform troubleshooting guide.
### Gather values for required and optional headers {#gather-values-headers}
In order to make calls to Platform APIs, you must first complete the Experience Platform authentication tutorial. Completing the authentication tutorial provides the values for each of the required headers in all Experience Platform API calls, as shown below:
- Authorization: `Bearer {ACCESS_TOKEN}`
- x-api-key: `{API_KEY}`
- x-gw-ims-org-id: `{ORG_ID}`
Resources in Experience Platform can be isolated to specific virtual sandboxes. In requests to Platform APIs, you can specify the name and ID of the sandbox that the operation will take place in. These are optional parameters.
- x-sandbox-name: `{SANDBOX_NAME}`
All requests that contain a payload (`POST`, `PUT`, `PATCH`) require an additional media type header:

- Content-Type: `application/json`
### API reference documentation {#api-reference-documentation}
You can find accompanying reference documentation for all the API operations in this tutorial. Refer to the Flow Service - Destinations API documentation on the Adobe Developer website. We recommend that you use this tutorial and the API reference documentation in parallel.
### Glossary {#glossary}
For descriptions of the terms that you will encounter in this API tutorial, read the glossary section of the API reference documentation.
## Select which destination to export audiences to {#select-destination}

Before starting the workflow to export profiles, identify the connection spec and flow spec IDs of the destination to which you intend to export audiences. Use the table below for reference.
| Destination | Connection spec ID | Flow spec ID |
|---|---|---|
| Amazon S3 | `4fce964d-3f37-408f-9778-e597338a21ee` | `1a0514a6-33d4-4c7f-aff8-594799c47549` |
| Azure Blob Storage | `6d6b59bf-fb58-4107-9064-4d246c0e5bb2` | `752d422f-b16f-4f0d-b1c6-26e448e3b388` |
| Azure Data Lake Storage Gen2 (ADLS Gen2) | `be2c3209-53bc-47e7-ab25-145db8b873e1` | `17be2013-2549-41ce-96e7-a70363bec293` |
| Data Landing Zone (DLZ) | `10440537-2a7b-4583-ac39-ed38d4b848e8` | `cd2fc47e-e838-4f38-a581-8fff2f99b63a` |
| Google Cloud Storage | `c5d93acb-ea8b-4b14-8f53-02138444ae99` | `585c15c4-6cbf-4126-8f87-e26bff78b657` |
| SFTP | `36965a81-b1c6-401b-99f8-22508f1e6a26` | `fd36aaa4-bf2b-43fb-9387-43785eeeb799` |
You need these IDs to construct the various flow service entities in the next steps of this tutorial. You also need to refer to parts of the connection spec itself to set up certain entities, so you must first retrieve the connection spec from the Flow Service API. See below for how to retrieve the connection specs for the destinations in the table:
The retrieval request is identical for all six destinations (Amazon S3, Azure Blob Storage, Azure Data Lake Storage Gen2 (ADLS Gen2), Data Landing Zone (DLZ), Google Cloud Storage, and SFTP); only the connection spec ID in the request path changes, and the response returns the full connection spec for the respective destination.
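As a minimal sketch (shown for Amazon S3, using the headers described in the getting started section):

```shell
curl --request GET \
  'https://platform.adobe.io/data/foundation/flowservice/connectionSpecs/4fce964d-3f37-408f-9778-e597338a21ee' \
  --header 'accept: application/json' \
  --header 'x-api-key: {API_KEY}' \
  --header 'x-gw-ims-org-id: {ORG_ID}' \
  --header 'x-sandbox-name: {SANDBOX_NAME}' \
  --header 'Authorization: Bearer {ACCESS_TOKEN}'
```

To retrieve the spec for another destination, replace the ID in the path with the corresponding connection spec ID from the table above. The response contains the full connection spec, including the `authSpec`, `targetSpec`, and `encryptionSpecs` sections referenced later in this tutorial.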
Follow the steps below to set up an audience export dataflow to a cloud storage destination. For some steps, the requests and responses differ between the various cloud storage destinations. In those cases, use the tabs on the page to retrieve the requests and responses specific to the destination that you want to connect and export audiences to. Be sure to use the correct `connection spec` and `flow spec` for the destination you are configuring.
## Create a source connection {#create-source-connection}
After deciding which destination you are exporting audiences to, you need to create a source connection. The source connection represents the connection to the internal Experience Platform Profile store.
Request
Note the highlighted lines with inline comments in the request example, which provide additional information. Remove the inline comments when copy-pasting the request into your terminal of choice.
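The following is a minimal sketch of a Profile store source connection request. The Profile store connection spec ID shown (`8a9c3494-9708-43d7-ae3f-cda01e5030e1`) is an assumption based on other Experience Platform destination tutorials; verify it against your environment before use.

```shell
curl --request POST 'https://platform.adobe.io/data/foundation/flowservice/sourceConnections' \
  --header 'Authorization: Bearer {ACCESS_TOKEN}' \
  --header 'x-api-key: {API_KEY}' \
  --header 'x-gw-ims-org-id: {ORG_ID}' \
  --header 'x-sandbox-name: {SANDBOX_NAME}' \
  --header 'Content-Type: application/json' \
  --data-raw '{
    "name": "Profile source connection to export audiences",
    "description": "Connection to the internal Experience Platform Profile store",
    "connectionSpec": {
        "id": "8a9c3494-9708-43d7-ae3f-cda01e5030e1", // Profile store connection spec ID (assumed; see note above)
        "version": "1.0"
    },
    "data": {
        "format": "CSV"
    },
    "params": {}
}'
```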
Response
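An illustrative response, with placeholder values:

```json
{
    "id": "900df191-b983-45cd-90d5-4c7a0326d650",
    "etag": "\"0500ebe1-0000-0200-0000-63e28d060000\""
}
```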
A successful response returns the ID (`id`) of the newly created source connection and an `etag`. Note down the source connection ID, as you will need it later when creating the dataflow.
## Create a base connection {#create-base-connection}
A base connection securely stores the credentials to your destination. Depending on the destination type, the credentials needed to authenticate against that destination can vary. To find these authentication parameters, first retrieve the `connection spec` for your desired destination as described in the section Select which destination to export audiences to, and then look at the `authSpec` of the response.
The connection spec of each supported destination contains an `authSpec` array that lists the supported authentication variants and their required parameters. Inline comments in the spec examples indicate where to find the authentication parameters; remove any such comments before reusing the JSON.
Using the properties specified in the authentication spec (that is, the `authSpec` from the response), you can create a base connection with the required credentials, specific to each destination type, as shown in the examples below:
Request

The base connection request differs per destination and authentication method:

- Amazon S3: base connection request with access key and secret key authentication, or with assumed role authentication
- Azure Blob Storage: base connection request
- Azure Data Lake Storage Gen2 (ADLS Gen2): base connection request
- Data Landing Zone (DLZ): base connection request
- Google Cloud Storage: base connection request
- SFTP: base connection request with password authentication, or with SSH key authentication

Note the highlighted lines with inline comments in the request examples, which provide additional information. Remove the inline comments when copy-pasting a request into your terminal of choice.
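As a minimal sketch of one of these requests, a base connection for Amazon S3 with access key and secret key authentication could look as follows. The auth spec name and the `s3AccessKey`/`s3SecretKey` parameter names are assumptions taken from the destination's `authSpec`; verify them against the connection spec you retrieved earlier.

```shell
curl --request POST 'https://platform.adobe.io/data/foundation/flowservice/connections' \
  --header 'Authorization: Bearer {ACCESS_TOKEN}' \
  --header 'x-api-key: {API_KEY}' \
  --header 'x-gw-ims-org-id: {ORG_ID}' \
  --header 'x-sandbox-name: {SANDBOX_NAME}' \
  --header 'Content-Type: application/json' \
  --data-raw '{
    "name": "Amazon S3 base connection",
    "auth": {
        "specName": "Access Key", // auth spec name from the authSpec section (assumed)
        "params": {
            "s3AccessKey": "{ACCESS_KEY}",
            "s3SecretKey": "{SECRET_KEY}"
        }
    },
    "connectionSpec": {
        "id": "4fce964d-3f37-408f-9778-e597338a21ee", // Amazon S3 connection spec ID from the table above
        "version": "1.0"
    }
}'
```

A successful response returns the ID (`id`) of the newly created base connection and an `etag`.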
### Add encryption to exported files

Optionally, you can add encryption to your exported files. To do this, you need to add items from the `encryptionSpecs` section of your destination's connection spec to the base connection request. See the request example below:
Request
Note the highlighted lines with inline comments in the request example, which provide additional information. Remove the inline comments when copy-pasting the request into your terminal of choice.
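The sketch below shows the general shape only: an `encryptionSpecs` entry added to a base connection request. The field names inside `encryptionSpecs` are hypothetical placeholders; copy the exact parameter names from the `encryptionSpecs` section of your destination's connection spec.

```shell
curl --request POST 'https://platform.adobe.io/data/foundation/flowservice/connections' \
  --header 'Authorization: Bearer {ACCESS_TOKEN}' \
  --header 'x-api-key: {API_KEY}' \
  --header 'x-gw-ims-org-id: {ORG_ID}' \
  --header 'x-sandbox-name: {SANDBOX_NAME}' \
  --header 'Content-Type: application/json' \
  --data-raw '{
    "name": "Amazon S3 base connection with encryption",
    "auth": {
        "specName": "Access Key",
        "params": {
            "s3AccessKey": "{ACCESS_KEY}",
            "s3SecretKey": "{SECRET_KEY}"
        }
    },
    "encryptionSpecs": [
        {
            "name": "{ENCRYPTION_SPEC_NAME}", // hypothetical; take the exact name from the connection spec
            "params": {
                "publicKey": "{PGP_PUBLIC_KEY}" // hypothetical parameter name
            }
        }
    ],
    "connectionSpec": {
        "id": "4fce964d-3f37-408f-9778-e597338a21ee",
        "version": "1.0"
    }
}'
```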
Response
Note the connection ID from the response. This ID will be required in the next step when creating the target connection.
## Create a target connection {#create-target-connection}
Next, you need to create a target connection. Target connections store the export parameters for the exported audiences. Export parameters include export location, file format, compression, and other details. For example, for CSV files, you can select several export options. Get extensive information about all supported CSV export options in the file formatting configurations page.
Refer to the `targetSpec` properties provided in the destination's `connection spec` to understand the supported properties for each destination type.
The connection spec of each supported destination contains a `targetSpec` object that describes the supported target connection parameters. Inline comments in the spec examples indicate where to find the target spec parameters, and which target parameters are not applicable to audience export destinations.
Using the above spec, you can construct a target connection request specific to your desired cloud storage destination, as shown below.
Request

For each destination (Amazon S3, Azure Blob Storage, Azure Data Lake Storage Gen2 (ADLS Gen2), Data Landing Zone, Google Cloud Storage, and SFTP), you can create a basic target connection or one that also includes CSV export options; the `params` section differs per destination, according to its `targetSpec`. Note the highlighted lines with inline comments in the request examples, which provide additional information. Remove the inline comments when copy-pasting a request into your terminal of choice.
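As a minimal sketch for Amazon S3 (the `params` field names, such as `bucketName`, `path`, `fileType`, and `compression`, and the `data.format` value are assumptions based on the destination's `targetSpec`; verify them against the connection spec before use):

```shell
curl --request POST 'https://platform.adobe.io/data/foundation/flowservice/targetConnections' \
  --header 'Authorization: Bearer {ACCESS_TOKEN}' \
  --header 'x-api-key: {API_KEY}' \
  --header 'x-gw-ims-org-id: {ORG_ID}' \
  --header 'x-sandbox-name: {SANDBOX_NAME}' \
  --header 'Content-Type: application/json' \
  --data-raw '{
    "name": "Amazon S3 target connection",
    "baseConnectionId": "{BASE_CONNECTION_ID}", // ID returned when you created the base connection
    "connectionSpec": {
        "id": "4fce964d-3f37-408f-9778-e597338a21ee", // Amazon S3 connection spec ID
        "version": "1.0"
    },
    "data": {
        "format": "CSV" // assumed; see the file formatting configurations page
    },
    "params": {
        "mode": "Server-to-server",
        "bucketName": "{BUCKET_NAME}",
        "path": "{FOLDER_PATH}",
        "fileType": "CSV",
        "compression": "GZIP"
    }
}'
```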
A successful response returns the ID (`id`) of the newly created target connection and an `etag`. Note down the target connection ID, as you will need it in the next step when creating the dataflow to export audiences.
## Create a dataflow {#create-dataflow}
The next step in the destination configuration is to create a dataflow. A dataflow ties together the previously created entities and provides options for configuring the audience export schedule. To create the dataflow, use the payloads below, depending on your desired cloud storage destination, and replace the flow entity IDs with the IDs you obtained in the previous steps. Note that in this step, you are not adding any information related to attribute or identity mapping to the dataflow; that follows in the next step.
Request

The dataflow creation request has the same structure for all six destinations. Use the flow spec ID that corresponds to your destination from the table in the destination selection step, together with the source and target connection IDs obtained in the previous steps. Note the highlighted lines with inline comments in the request examples, which provide additional information. Remove the inline comments when copy-pasting a request into your terminal of choice.
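As a minimal sketch for Amazon S3 (the `transformations` block shape, including the `GeneralTransform` name and the empty selector arrays, is an assumption; verify it against the flow spec for your destination):

```shell
curl --request POST 'https://platform.adobe.io/data/foundation/flowservice/flows' \
  --header 'Authorization: Bearer {ACCESS_TOKEN}' \
  --header 'x-api-key: {API_KEY}' \
  --header 'x-gw-ims-org-id: {ORG_ID}' \
  --header 'x-sandbox-name: {SANDBOX_NAME}' \
  --header 'Content-Type: application/json' \
  --data-raw '{
    "name": "Audience export to Amazon S3",
    "description": "Exports qualified profiles to Amazon S3",
    "flowSpec": {
        "id": "1a0514a6-33d4-4c7f-aff8-594799c47549", // Amazon S3 flow spec ID from the table above
        "version": "1.0"
    },
    "sourceConnectionIds": [
        "{SOURCE_CONNECTION_ID}"
    ],
    "targetConnectionIds": [
        "{TARGET_CONNECTION_ID}"
    ],
    "transformations": [
        {
            "name": "GeneralTransform", // assumed transformation name
            "params": {
                "profileSelectors": { "selectors": [] },
                "segmentSelectors": { "selectors": [] }
            }
        }
    ]
}'
```

A successful response returns the ID (`id`) of the newly created dataflow and an `etag`.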
Note the Dataflow ID from the response. This ID will be required in later steps.
## Add audiences to the export

In this step, you select which audiences you want to export to the destination. For extensive information about this step and the request format to add an audience to the dataflow, view the examples in the Update a destination dataflow section of the API reference documentation, and see the sketch below.
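As a hedged illustration of the mechanics only, adding an audience is a `PATCH` against the flows endpoint, using the `If-Match` header described later in this tutorial. The JSON Patch `path` and the selector shape below are assumptions; take the authoritative payload from the API reference.

```shell
curl --request PATCH 'https://platform.adobe.io/data/foundation/flowservice/flows/{DATAFLOW_ID}' \
  --header 'Authorization: Bearer {ACCESS_TOKEN}' \
  --header 'x-api-key: {API_KEY}' \
  --header 'x-gw-ims-org-id: {ORG_ID}' \
  --header 'x-sandbox-name: {SANDBOX_NAME}' \
  --header 'Content-Type: application/json' \
  --header 'If-Match: "{ETAG}"' \
  --data-raw '[
    {
        "op": "add",
        "path": "/transformations/0/params/segmentSelectors/selectors/-", // assumed path
        "value": {
            "type": "PLATFORM_SEGMENT", // assumed selector shape
            "value": {
                "id": "{AUDIENCE_ID}"
            }
        }
    }
]'
```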
## Set up attribute and identity mapping {#attribute-and-identity-mapping}
After creating your dataflow, you need to set up mapping for the attributes and identities that you would like to export. This consists of three steps, listed below:
- Create an input schema
- Create an output schema
- Set up a mapping set to connect the created schemas
For example, to obtain a mapping like the one shown in the Experience Platform UI, you need to go through the three steps listed above, detailed in the next sections.
### Create an input schema
To create an input schema, you first need to retrieve your union schema and the identities that can be exported to the destination. This is the schema of attributes and identities which you can select as your source mapping.
See below for examples of requests and responses to retrieve attributes and identities.
Request to get attributes

Retrieve your union schema to list the attributes that you can select as source fields.

Response

The response contains your union schema, listing all profile attributes available for mapping.
Request to get identities
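As a sketch, assuming the Identity Service namespace endpoint is the call used in this step:

```shell
curl --request GET 'https://platform.adobe.io/data/core/idnamespace/identities' \
  --header 'Authorization: Bearer {ACCESS_TOKEN}' \
  --header 'x-api-key: {API_KEY}' \
  --header 'x-gw-ims-org-id: {ORG_ID}' \
  --header 'x-sandbox-name: {SANDBOX_NAME}'
```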
Response
The response returns the identities that you can use when creating the input schema. Note that this response returns both standard and custom identity namespaces that you set up in Experience Platform.
Next, you need to copy the response from above and use it to create your input schema. You can copy the entire JSON response and place it into the `jsonSchema` object of the request below.
Request to create input schema
Response
The ID in the response represents the unique identifier of the input schema that you have created. Copy the ID from the response as you will reuse this in a later step.
### Create an output schema
Next, you must set up the output schema for your export. First, you need to find and inspect your existing partner schema.
Request
Note that the example uses the `connection spec ID` for Amazon S3. Replace this value with the connection spec ID specific to your destination. This is the same connection spec retrieval call shown in the destination selection step earlier in this tutorial.
Response with an example schema

Inspect the response you obtain when performing the call above. You need to drill down into the response to find the object `targetSpec.attributes.partnerSchema.jsonSchema`.
Next, you need to create an output schema. Copy the JSON response you got above and paste it into the `jsonSchema` object of the request below.
Request
Response
The ID in the response represents the unique identifier of the output schema that you have created. Copy the ID from the response as you will reuse this in a later step.
### Create a mapping set {#create-mapping-set}
Next, use the data prep API to create the mapping set by using the input schema ID, the output schema ID, and the desired field mappings.
Request
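As a heavily hedged sketch: the endpoint below is the Data Prep (conversion) service, and the exact body fields, in particular how the input and output schema IDs are referenced, are assumptions; verify them against the Data Prep API reference.

```shell
curl --request POST 'https://platform.adobe.io/data/foundation/conversion/mappingSets' \
  --header 'Authorization: Bearer {ACCESS_TOKEN}' \
  --header 'x-api-key: {API_KEY}' \
  --header 'x-gw-ims-org-id: {ORG_ID}' \
  --header 'x-sandbox-name: {SANDBOX_NAME}' \
  --header 'Content-Type: application/json' \
  --data-raw '{
    "inputSchema": {
        "id": "{INPUT_SCHEMA_ID}"  // ID returned when creating the input schema (assumed field name)
    },
    "outputSchema": {
        "id": "{OUTPUT_SCHEMA_ID}" // ID returned when creating the output schema (assumed field name)
    },
    "mappings": [
        {
            "sourceType": "ATTRIBUTE",
            "source": "personalEmail.address",
            "destination": "email"
        }
    ]
}'
```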
Response
Note the ID of the mapping set as you will need it in the next step to update the existing dataflow with the mapping set ID.
Next, get the ID of the dataflow that you want to update.
See retrieve a destination dataflow’s details for information about retrieving the ID of a dataflow.
Finally, you need to `PATCH` the dataflow with the mapping set information that you just created.
Request
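As with other dataflow updates, this is a `PATCH` against the flows endpoint with an `If-Match` header. The JSON Patch `path` and value used below to attach the mapping set are hypothetical placeholders; take the exact payload from the API reference.

```shell
curl --request PATCH 'https://platform.adobe.io/data/foundation/flowservice/flows/{DATAFLOW_ID}' \
  --header 'Authorization: Bearer {ACCESS_TOKEN}' \
  --header 'x-api-key: {API_KEY}' \
  --header 'x-gw-ims-org-id: {ORG_ID}' \
  --header 'x-sandbox-name: {SANDBOX_NAME}' \
  --header 'Content-Type: application/json' \
  --header 'If-Match: "{ETAG}"' \
  --data-raw '[
    {
        "op": "replace",
        "path": "/transformations/0/params/mappingId", // hypothetical path; see the API reference
        "value": "{MAPPING_SET_ID}"
    }
]'
```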
Response
The response from the Flow Service API returns the ID of the updated dataflow.
## Make other dataflow updates {#other-dataflow-updates}
To make any updates to your dataflow, use the `PATCH` operation. For example, you can add a marketing action to your dataflows, or you can update your dataflows to select fields as mandatory keys or deduplication keys.
### Add a marketing action {#add-marketing-action}
To add a marketing action, see the request and response examples below.
>[!IMPORTANT]
>
>- The `If-Match` header is required when making a `PATCH` request. The value for this header is the unique version (etag) of the dataflow you want to update. The etag value updates with every successful update of a flow entity, such as a dataflow, target connection, and others.
>- To get the latest version of the etag, perform a GET request to the `https://platform.adobe.io/data/foundation/flowservice/flows/{ID}` endpoint, where `{ID}` is the dataflow ID that you are looking to update.
>- Wrap the value of the `If-Match` header in double quotes, like in the examples below, when making `PATCH` requests.

Request
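As a hedged sketch of the `PATCH` mechanics: the `path` value and the marketing action URI below are assumptions, so copy the exact payload from the API reference.

```shell
curl --request PATCH 'https://platform.adobe.io/data/foundation/flowservice/flows/{DATAFLOW_ID}' \
  --header 'Authorization: Bearer {ACCESS_TOKEN}' \
  --header 'x-api-key: {API_KEY}' \
  --header 'x-gw-ims-org-id: {ORG_ID}' \
  --header 'x-sandbox-name: {SANDBOX_NAME}' \
  --header 'Content-Type: application/json' \
  --header 'If-Match: "{ETAG}"' \
  --data-raw '[
    {
        "op": "add",
        "path": "/policy", // assumed path
        "value": {
            "marketingActions": [
                "https://ns.adobe.com/experience/marketing-actions/emailTargeting" // illustrative marketing action
            ]
        }
    }
]'
```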
Response
A successful response returns response code `200` along with the ID of the updated dataflow and the updated etag.
### Add a mandatory key {#add-mandatory-key}
To add a mandatory key, see the request and response examples below.
>[!IMPORTANT]
>
>- The `If-Match` header is required when making a `PATCH` request. The value for this header is the unique version (etag) of the dataflow you want to update. The etag value updates with every successful update of a flow entity, such as a dataflow, target connection, and others.
>- To get the latest version of the etag, perform a GET request to the `https://platform.adobe.io/data/foundation/flowservice/flows/{ID}` endpoint, where `{ID}` is the dataflow ID that you are looking to update.
>- Wrap the value of the `If-Match` header in double quotes, like in the examples below, when making `PATCH` requests.

Request
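As a hedged sketch (the `path` and the `mandatoryFields` name are assumptions; take the exact payload from the API reference):

```shell
curl --request PATCH 'https://platform.adobe.io/data/foundation/flowservice/flows/{DATAFLOW_ID}' \
  --header 'Authorization: Bearer {ACCESS_TOKEN}' \
  --header 'x-api-key: {API_KEY}' \
  --header 'x-gw-ims-org-id: {ORG_ID}' \
  --header 'x-sandbox-name: {SANDBOX_NAME}' \
  --header 'Content-Type: application/json' \
  --header 'If-Match: "{ETAG}"' \
  --data-raw '[
    {
        "op": "replace",
        "path": "/transformations/0/params/profileSelectors/mandatoryFields", // assumed path
        "value": [
            "personalEmail.address"
        ]
    }
]'
```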
Response
### Add a deduplication key {#add-deduplication-key}
To add a deduplication key, see the request and response examples below.
>[!IMPORTANT]
>
>- The `If-Match` header is required when making a `PATCH` request. The value for this header is the unique version (etag) of the dataflow you want to update. The etag value updates with every successful update of a flow entity, such as a dataflow, target connection, and others.
>- To get the latest version of the etag, perform a GET request to the `https://platform.adobe.io/data/foundation/flowservice/flows/{ID}` endpoint, where `{ID}` is the dataflow ID that you are looking to update.
>- Wrap the value of the `If-Match` header in double quotes, like in the examples below, when making `PATCH` requests.

Request
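As a hedged sketch (the `path`, the `primaryFields` name, and the value shape are assumptions; take the exact payload from the API reference):

```shell
curl --request PATCH 'https://platform.adobe.io/data/foundation/flowservice/flows/{DATAFLOW_ID}' \
  --header 'Authorization: Bearer {ACCESS_TOKEN}' \
  --header 'x-api-key: {API_KEY}' \
  --header 'x-gw-ims-org-id: {ORG_ID}' \
  --header 'x-sandbox-name: {SANDBOX_NAME}' \
  --header 'Content-Type: application/json' \
  --header 'If-Match: "{ETAG}"' \
  --data-raw '[
    {
        "op": "replace",
        "path": "/transformations/0/params/profileSelectors/primaryFields", // assumed path
        "value": [
            {
                "identityNamespace": "Email", // assumed field names
                "fieldType": "IDENTITY"
            }
        ]
    }
]'
```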
Response
## Validate dataflow (Get the dataflow runs) {#get-dataflow-runs}
To check the executions of a dataflow, use the Dataflow Runs API:
Request
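As a sketch, assuming the standard Flow Service runs endpoint filtered by flow ID:

```shell
curl --request GET 'https://platform.adobe.io/data/foundation/flowservice/runs?property=flowId=={DATAFLOW_ID}' \
  --header 'Authorization: Bearer {ACCESS_TOKEN}' \
  --header 'x-api-key: {API_KEY}' \
  --header 'x-gw-ims-org-id: {ORG_ID}' \
  --header 'x-sandbox-name: {SANDBOX_NAME}'
```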
Response
You can find information about the various parameters returned by the Dataflow runs API in the API reference documentation.
## API error handling {#api-error-handling}
The API endpoints in this tutorial follow the general Experience Platform API error message principles. Refer to API status codes and request header errors in the Platform troubleshooting guide for more information on interpreting error responses.
## Next steps {#next-steps}
By following this tutorial, you have successfully connected Platform to one of your preferred cloud storage destinations and set up a dataflow to the respective destination to export audiences. See the following pages for more details, such as how to edit existing dataflows using the Flow Service API: