[Beta]{class="badge informative"}

Connect Capillary Streaming Events to Experience Platform using the UI

AVAILABILITY
The Capillary Streaming Events source is in beta. Read the terms and conditions in the sources overview for more information on using beta-labeled sources.

Read this guide to learn how to connect your Capillary database to Adobe Experience Platform using the sources workspace in the Experience Platform user interface.

Getting started

This tutorial requires a working understanding of the following components of Experience Platform:

  • Sources: Experience Platform allows data to be ingested from various sources while providing you with the ability to structure, label, and enhance incoming data using Experience Platform services.
  • Sandboxes: Experience Platform provides virtual sandboxes that partition a single Experience Platform instance into separate virtual environments to help develop and evolve digital experience applications.

In the Experience Platform UI, select Sources from the left navigation to access the Sources workspace. Select the appropriate category in the Categories panel. Alternatively, use the search bar to navigate to the specific source that you want to use.

To use Capillary, select the Capillary Streaming Events source card under Loyalty and then select Add data.

TIP
Sources in the sources catalog display the Set up option when a given source does not yet have an authenticated account. Once an authenticated account is created, this option changes to Add data.

The sources catalog in the UI with the Capillary Streaming Events card selected.

Select data

Next, use the Select data interface to upload a sample JSON file to define your source schema. During this step, you can use the preview interface to view the file structure of the payload. When finished, select Next.
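The sample file that you upload only needs to reflect the shape of the events that you plan to stream. Below is a minimal sketch of what such a file could contain, written as a small Python script that saves the payload to disk; the field names (customer_id, event_type, loyalty, and so on) are illustrative assumptions, not the actual Capillary event contract.

```python
# A minimal sketch of a sample payload for the Select data step.
# Field names below are illustrative assumptions, not the Capillary contract.
import json

sample_event = {
    "customer_id": "CUST-1001",
    "event_type": "points_earned",
    "event_timestamp": "2024-05-01T12:34:56Z",
    "loyalty": {"points": 150, "tier": "gold"},
    "email": "jane.doe@example.com",
}

# Save the payload so it can be uploaded through the Select data interface.
with open("capillary-sample.json", "w") as f:
    json.dump(sample_event, f, indent=2)
```

Uploading a representative sample matters because the preview and the later mapping step can only see fields that appear in this file.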

The select data step of the sources workflow

Dataflow details

Next, you must provide information regarding your dataset and your dataflow.

Dataset details

A dataset is a storage and management construct for a collection of data, typically a table, that contains a schema (columns) and fields (rows). Data that is successfully ingested into Experience Platform is persisted within the data lake as datasets.

During this step, you can either use an existing dataset or create a new dataset.
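If you would rather create the dataset programmatically, the sketch below shows one way to do so with the Catalog Service API. The dataset name, schema ID, and credential values are placeholders; confirm the request details against the current Catalog API documentation before relying on them.

```python
# A hedged sketch of creating a dataset through the Catalog Service API.
# All credential values and the schema ID are placeholders.
import requests

headers = {
    "Authorization": "Bearer {ACCESS_TOKEN}",
    "x-api-key": "{API_KEY}",
    "x-gw-ims-org-id": "{ORG_ID}",
    "x-sandbox-name": "{SANDBOX_NAME}",
    "Content-Type": "application/json",
}

payload = {
    "name": "Capillary Loyalty Events",  # illustrative dataset name
    "schemaRef": {
        "id": "https://ns.adobe.com/{TENANT_ID}/schemas/{SCHEMA_ID}",  # placeholder
        "contentType": "application/vnd.adobe.xed-full+json;version=1",
    },
}

response = requests.post(
    "https://platform.adobe.io/data/foundation/catalog/dataSets",
    headers=headers,
    json=payload,
)
print(response.status_code, response.json())
```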

NOTE
Regardless of whether you use an existing dataset or create a new dataset, you must ensure that your dataset is enabled for Profile ingestion.
See the following steps to enable Profile ingestion, error diagnostics, and partial ingestion.

If your dataset is enabled for Real-Time Customer Profile, then during this step you can toggle Profile dataset to enable your data for Profile ingestion; a sketch of making the equivalent API call follows the list below. You can also use this step to enable Error diagnostics and Partial ingestion.

  • Error diagnostics: Select Error diagnostics to instruct the source to produce error diagnostics that you can later reference when monitoring your dataset activity and dataflow status.
  • Partial ingestion: Partial batch ingestion is the ability to ingest data containing errors, up to a certain configurable threshold. This feature allows you to successfully ingest all of your accurate data into Experience Platform, while all of your incorrect data is batched separately with information on why it is invalid.
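The Profile dataset toggle corresponds to tagging the dataset for Real-Time Customer Profile. If you prefer to enable this outside the UI, the following sketch shows the equivalent call against the Catalog Service API; the dataset ID and credentials are placeholders, and you should verify the tag names against current documentation.

```python
# A hedged sketch of enabling a dataset for Profile ingestion via the
# Catalog Service API. Dataset ID and credentials are placeholders.
import requests

DATASET_ID = "{DATASET_ID}"  # placeholder

headers = {
    "Authorization": "Bearer {ACCESS_TOKEN}",
    "x-api-key": "{API_KEY}",
    "x-gw-ims-org-id": "{ORG_ID}",
    "x-sandbox-name": "{SANDBOX_NAME}",
    "Content-Type": "application/json",
}

# Tagging a dataset with unifiedProfile marks it for Profile ingestion.
payload = {"tags": {"unifiedProfile": ["enabled:true"]}}

response = requests.patch(
    f"https://platform.adobe.io/data/foundation/catalog/dataSets/{DATASET_ID}",
    headers=headers,
    json=payload,
)
print(response.status_code)
```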

Dataflow details

Once your dataset is configured, you must then provide details on your dataflow, including a name, an optional description, and alert configurations.

The dataflow details interface

  • Dataflow name: The name of the dataflow. By default, this will use the name of the file that is being imported.
  • Description: (Optional) A brief description of your dataflow.

Alerts

Experience Platform can produce event-based alerts that users can subscribe to. These options allow a running dataflow to trigger them. For more information, read the alerts overview.

  • Sources Dataflow Run Start: Select this alert to receive a notification when your dataflow run begins.
  • Sources Dataflow Run Success: Select this alert to receive a notification if your dataflow run ends without any errors.
  • Sources Dataflow Run Failure: Select this alert to receive a notification if your dataflow run ends with any errors.

Mapping

Use the mapping interface to map your source data to the appropriate schema fields before ingesting data to Experience Platform. For more information, read the mapping guide in the UI.
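As a rough illustration, each Data Prep mapping pairs a source field path with a destination field path in your target schema. The entries below assume the hypothetical sample payload shown earlier; the destination paths depend entirely on your own schema and are not the official Capillary mappings.

```python
# Illustrative Data Prep-style mapping entries. Source paths assume the
# hypothetical sample payload from earlier; destinations depend on your schema.
import json

mappings = [
    {"sourceType": "ATTRIBUTE", "source": "customer_id", "destination": "_id"},
    {"sourceType": "ATTRIBUTE", "source": "email", "destination": "personalEmail.address"},
    {"sourceType": "ATTRIBUTE", "source": "event_timestamp", "destination": "timestamp"},
    {"sourceType": "ATTRIBUTE", "source": "loyalty.points", "destination": "loyalty.points"},
]

print(json.dumps(mappings, indent=2))
```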

TIP
You can download the Events and Profile mappings for Capillary and import the files to Data Prep when you are ready to map your data.

The mapping interface for Capillary.

Review

The Review step appears, allowing you to review the details of your dataflow before it is created. Details are grouped within the following categories:

  • Connection: Shows the account name, source platform, and the source name.
  • Assign dataset and map fields: Shows the target dataset and the schema that the dataset adheres to.

After confirming the details are correct, select Finish.

The review step in the sources workflow.

Retrieve the streaming endpoint URL

With the connection created, the sources detail page appears. This page shows details of your newly created connection, including previously run dataflows, the connection ID, and the streaming endpoint URL.

The streaming endpoint URL.
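With the endpoint in hand, your systems can begin streaming events to it. The following is a minimal sketch of sending a single event with Python; the endpoint value is a placeholder, and the exact URL shape and request envelope that your connection expects may differ from what is shown here.

```python
# A minimal sketch of streaming one event to the endpoint URL retrieved above.
# The endpoint value is a placeholder; the real URL comes from the sources
# detail page, and the expected request envelope may differ.
import requests

STREAMING_ENDPOINT = "https://dcs.adobedc.net/collection/{CONNECTION_ID}"  # placeholder

event = {
    "customer_id": "CUST-1001",
    "event_type": "points_earned",
    "event_timestamp": "2024-05-01T12:34:56Z",
    "loyalty": {"points": 150, "tier": "gold"},
}

response = requests.post(
    STREAMING_ENDPOINT,
    headers={"Content-Type": "application/json"},
    json=event,
)
print(response.status_code)
```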
