[Beta]{class="badge informative"}
Test and submit your source
The final steps in integrating your new source with Adobe Experience Platform using Self-Serve Sources (Streaming SDK) are to test and submit your new source. Once you have completed your connection specification and updated the streaming flow specification, you can start testing your source’s functionality through either the API or the UI. When successful, you can then submit your new source by contacting your Adobe representative.
The following document provides steps on how to test and debug your source using the Flow Service API.
Getting started
- For information on how to successfully make calls to Platform APIs, see the guide on getting started with Platform APIs.
- For information on how to generate your credentials for Platform APIs, see the tutorial on authenticating and accessing Experience Platform APIs.
- For information on how to set up Postman for Platform APIs, see the tutorial on setting up developer console and Postman.
- To help your testing and debugging process, download the Self-Serve Sources verification collection and environment here and follow the steps outlined below.
Test your source using the API
To test your source using the API, you must run the Self-Serve Sources verification collection and environment in Postman, providing the appropriate environment variables for your source.
To start testing, you must first set up the collection and environment on Postman. Next, specify the connection specification ID that you want to test, as well as the `flowSpecificationId` and `targetConnectionSpecId`, which are fixed values.

The following table outlines the environment variables required to run the verification collection, along with example values:

| Parameter | Description | Example |
| --- | --- | --- |
| `x-api-key` | The API key used to authenticate calls to Experience Platform APIs. See the tutorial on setting up developer console and Postman for information on how to acquire your `x-api-key`. | `c8d9a2f5c1e03789bd22e8efdd1bdc1b` |
| `x-gw-ims-org-id` | Your organization ID. See the tutorial on setting up developer console and Postman for information on how to retrieve your `x-gw-ims-org-id` information. | `ABCEH0D9KX6A7WA7ATQE0TE@adobeOrg` |
| `authorizationToken` | The authorization token required to complete calls to Experience Platform APIs. See the tutorial on setting up developer console and Postman for information on how to acquire your `authorizationToken`. | `Bearer authorizationToken` |
| `schemaId` | The ID of the target XDM schema that structures your source data. | `https://ns.adobe.com/{TENANT_ID}.schemas.0ef4ce0d390f0809fad490802f53d30b` |
| `schemaVersion` | The version of the schema. | `application/vnd.adobe.xed-full-notext+json; version=1` |
| `schemaAltId` | The `meta:altId` that is returned alongside the `schemaId` when creating a new schema. | `_{TENANT_ID}.schemas.0ef4ce0d390f0809fad490802f53d30b` |
| `dataSetId` | The ID of the target dataset. | `5f3c3cedb2805c194ff0b69a` |
| `mappings` | The mapping set that defines how fields in your source data map to the target XDM schema. | `[{"destinationXdmPath":"person.name.firstName","sourceAttribute":"email.email_id","identity":false,"version":0},{"destinationXdmPath":"person.name.lastName","sourceAttribute":"email.activity.action","identity":false,"version":0}]` |
| `mappingId` | The ID of the mapping. | `bf5286a9c1ad4266baca76ba3adc9366` |
| `connectionSpecId` | The connection specification ID of your source. | `2e8580db-6489-4726-96de-e33f5f60295f` |
| `flowSpecificationId` | The flow specification ID of `GenericStreamingAEP`. This is a fixed value. | `e77fde5a-22a8-11ed-861d-0242ac120002` |
| `targetConnectionSpecId` | The target connection specification ID. This is a fixed value. | `c604ff05-7f1a-43c0-8e18-33bf874cb11c` |
| `verifyWatTimeInSecond` | The amount of time (in seconds) to wait before verifying the dataflow. | `40` |
| `startTime` | The designated start time for your dataflow, in Unix time. | `1597784298` |
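If you prefer to prepare these values outside of the Postman UI, you can import them as a Postman environment file. The following is a minimal sketch using the example values from the table above; the environment name is illustrative, and every value must be replaced with your own credentials and IDs:

```json
{
  "name": "SSS Verification Environment",
  "values": [
    { "key": "x-api-key", "value": "c8d9a2f5c1e03789bd22e8efdd1bdc1b", "enabled": true },
    { "key": "x-gw-ims-org-id", "value": "ABCEH0D9KX6A7WA7ATQE0TE@adobeOrg", "enabled": true },
    { "key": "authorizationToken", "value": "Bearer authorizationToken", "enabled": true },
    { "key": "connectionSpecId", "value": "2e8580db-6489-4726-96de-e33f5f60295f", "enabled": true },
    { "key": "flowSpecificationId", "value": "e77fde5a-22a8-11ed-861d-0242ac120002", "enabled": true },
    { "key": "targetConnectionSpecId", "value": "c604ff05-7f1a-43c0-8e18-33bf874cb11c", "enabled": true },
    { "key": "verifyWatTimeInSecond", "value": "40", "enabled": true },
    { "key": "startTime", "value": "1597784298", "enabled": true }
  ]
}
```

The remaining variables (`schemaId`, `schemaVersion`, `schemaAltId`, `dataSetId`, `mappings`, and `mappingId`) follow the same key/value pattern.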
Once you have provided all of your environment variables, you can run the collection from the Postman interface. Select the ellipses (…) beside Sources SSSs Verification Collection and then select Run collection.
The Runner interface appears, allowing you to configure the run order of the requests in the collection. Select Run SSS Verification Collection to run the collection.
Test your source using the UI
To test your source in the UI, go to the sources catalog of your organization’s sandbox in the Platform UI. From here, you should see your new source appear under the Streaming category.
With your new source now available in your sandbox, you must follow the sources workflow to test its functionality. To begin, select Set up.
The Add data step appears. To test that your source can stream data, use the left side of the interface to upload a sample JSON file. Once your data is uploaded, the right side of the interface updates to display a preview of the file hierarchy of your data. Select Next to proceed.
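For example, a small file like the following is enough to generate the hierarchy preview. The field names are purely illustrative; use a payload that matches what your source actually streams:

```json
{
  "email": {
    "email_id": "jsmith@example.com",
    "activity": {
      "action": "click"
    }
  }
}
```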
The Dataflow detail page allows you to select whether you want to use an existing dataset or a new dataset. During this process, you can also configure your data to be ingested to Profile, and enable settings like Error diagnostics and Partial ingestion.
For testing, select New dataset and provide an output dataset name. During this step, you can also provide an optional description to add further information to your dataset. Next, select a schema to map to using the Advanced search option or by scrolling through the list of existing schemas in the dropdown menu. Once you have selected a schema, provide a name and a description for your dataflow.
When finished, select Next.
The Mapping step appears, providing you with an interface to map the source fields from your source schema to their appropriate target XDM fields in the target schema.
Platform provides intelligent recommendations for auto-mapped fields based on the target schema or dataset that you selected. You can manually adjust mapping rules to suit your use cases. Based on your needs, you can choose to map fields directly, or use data prep functions to transform source data to derive computed or calculated values. For comprehensive steps on using the mapper interface and calculated fields, see the Data Prep UI guide.
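For reference, the mapping rules created in this step have the same shape as the `mappings` environment variable used in the API test. Using the illustrative sample payload from the previous step, a mapping set might look like the following:

```json
[
  {
    "sourceAttribute": "email.email_id",
    "destinationXdmPath": "person.name.firstName",
    "identity": false,
    "version": 0
  },
  {
    "sourceAttribute": "email.activity.action",
    "destinationXdmPath": "person.name.lastName",
    "identity": false,
    "version": 0
  }
]
```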
Once your source data is successfully mapped, select Next.
The Review step appears, allowing you to review your new dataflow before it is created. Details are grouped within the following categories:
- Connection: Displays your account name, type of source, and other miscellaneous information specific to the streaming cloud storage source you are using.
- Assign dataset and map fields: Displays the target dataset and schema you are using for your dataflow.
Once you have reviewed your dataflow, select Finish and allow some time for the dataflow to be created.
Finally, you must retrieve your dataflow’s streaming endpoint. This endpoint will be used to subscribe to your webhook, allowing your streaming source to communicate with Experience Platform. To retrieve your streaming endpoint, go to the Dataflow activity page of the dataflow that you just created and copy the endpoint from the bottom of the Properties panel.
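How you subscribe the webhook depends entirely on the source system you are integrating. As a purely hypothetical sketch, a source that supports API-based webhook registration might accept a request body like the following, where `url` is the streaming endpoint copied from the Properties panel and the subscription name and event types are invented placeholders:

```json
{
  "name": "Adobe Experience Platform ingestion",
  "url": "{YOUR_STREAMING_ENDPOINT}",
  "events": ["contact.created", "contact.updated"]
}
```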
Submit your source
Once your source is able to complete the entire workflow, you can contact your Adobe representative and submit your source for integration across other Experience Platform organizations.