The RainFocus source is in beta. See the sources overview for more information on using beta-labeled sources.
This tutorial provides steps for connecting your RainFocus account and streaming event management and analytics data to Adobe Experience Platform.
This source connector and documentation page are created and maintained by the RainFocus team. For any inquiries or update requests, please contact them directly at firstname.lastname@example.org or visit the RainFocus Help Center.
This tutorial requires a working understanding of the following components of Experience Platform:
Before you can connect your RainFocus account to Experience Platform, you must first complete the following prerequisite tasks:
Once you have completed the prerequisite setup, you can proceed to the steps outlined below.
In the Platform UI, select Sources from the left navigation bar to access the sources workspace. The Catalog screen displays a variety of sources with which you can create an account.
You can select the appropriate category from the catalog on the left-hand side of your screen. Alternatively, you can find the specific source you wish to work with using the search option.
Under the Analytics category, select RainFocus Experience, and then select Add data.
The Select data step appears, providing an interface for you to select the data that you want to bring to Experience Platform.
Select Upload files to upload a JSON file from your local system. Alternatively, you can drag and drop the JSON file you want to upload into the Drag and drop files panel.
Upload the Sample JSON Payload downloaded from RainFocus.
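The sample payload itself is provided by RainFocus, so its exact shape may differ from what is shown here. As a quick sanity check before uploading, you can confirm that the file parses as valid JSON. The payload below is a hypothetical stand-in used purely for illustration:

```python
# Quick sanity check that a downloaded sample payload is valid JSON before
# uploading it. The payload string below is a hypothetical stand-in for the
# file provided by RainFocus; all field names are illustrative.
import json

sample_payload = """
{
  "event": {
    "eventId": "evt-12345",
    "action": "session_registration",
    "timestamp": "2024-05-01T12:00:00Z"
  },
  "attendee": {
    "attendeeId": "att-901"
  }
}
"""

# json.loads raises a ValueError if the file is malformed.
parsed = json.loads(sample_payload)
print(sorted(parsed.keys()))  # ['attendee', 'event']
```

If parsing fails here, the upload preview in the UI will not render correctly either, so it is worth fixing the file first.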
Once your file is uploaded, the preview interface updates to display the schema of the file you uploaded. Use this interface to inspect the contents and structure of your file; you can also use the Search field utility to find specific items within your schema.
When finished, select Next.
The Dataflow detail step appears, providing you with options to use an existing dataset or establish a new dataset for your dataflow, as well as an opportunity to provide a name and description for your dataflow. During this step, you can also configure settings for Profile ingestion, error diagnostics, partial ingestion, and alerts.
When finished, select Next.
The Mapping step appears, providing you with an interface to map the source fields from your source schema to their appropriate target XDM fields in the target schema.
Experience Platform provides intelligent recommendations for auto-mapped fields based on the target schema or dataset that you selected. You can manually adjust mapping rules to suit your use cases. Based on your needs, you can choose to map fields directly, or use data prep functions to transform source data to derive computed or calculated values. For comprehensive steps on using the mapper interface and calculated fields, see the Data Prep UI guide.
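Conceptually, each mapping rule pairs a source-field path with a target XDM path, and a calculated field applies a transform along the way. The following is a rough, simplified sketch of that idea, not Experience Platform's actual mapper; all field names and the rule format are hypothetical:

```python
# Simplified sketch of source-to-XDM field mapping with one computed field.
# Rule format and all field names are hypothetical illustrations; the real
# mapping is configured in the Experience Platform UI.
from functools import reduce

def get_path(record: dict, path: str):
    """Walk a dot-separated path into a nested dict."""
    return reduce(lambda d, key: d[key], path.split("."), record)

def apply_mapping(record: dict, rules: list) -> dict:
    """Build a target record from (source_path, target_field, transform) rules."""
    out = {}
    for source_path, target_field, transform in rules:
        value = get_path(record, source_path)
        out[target_field] = transform(value) if transform else value
    return out

source = {
    "attendee": {"firstName": "Jane", "lastName": "Doe"},
    "event": {"timestamp": "2024-05-01T12:00:00Z"},
}

rules = [
    ("attendee.firstName", "person.name.firstName", None),
    ("attendee.lastName", "person.name.lastName", None),
    # Calculated field: derive a full name from two source fields.
    ("attendee", "person.name.fullName",
     lambda a: f"{a['firstName']} {a['lastName']}"),
    ("event.timestamp", "timestamp", None),
]

mapped = apply_mapping(source, rules)
print(mapped["person.name.fullName"])  # Jane Doe
```

Direct mappings copy a value as-is, while calculated fields (the `fullName` rule above) combine or transform source values before they land in the target schema.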
Once your source data is successfully mapped, select Next.
The Review step appears, allowing you to review your new dataflow before it is created. Details are grouped within the following categories:
Once you have reviewed your dataflow, select Finish and allow some time for the dataflow to be created.
With your streaming dataflow created, you can now retrieve your streaming endpoint URL. This endpoint will be used to subscribe to your webhook, allowing your streaming source to communicate with Experience Platform.
To retrieve your streaming endpoint, go to the Dataflow activity page of the dataflow that you just created and copy the endpoint from the bottom of the Properties panel.
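In normal operation the RainFocus webhook posts records to this endpoint for you, but it can be useful to see what such a request looks like. The sketch below builds (without sending) an HTTP POST to a placeholder endpoint URL; substitute the URL you copied from the Properties panel, and note that the record fields are hypothetical:

```python
# Minimal sketch of posting a test record to the streaming endpoint.
# The endpoint URL is a placeholder -- use the value copied from the
# Properties panel of your dataflow. Record fields are hypothetical.
import json
import urllib.request

STREAMING_ENDPOINT = "https://dcs.adobedc.net/collection/your-connection-id"  # placeholder

def build_request(record: dict) -> urllib.request.Request:
    """Prepare a JSON POST request for the streaming endpoint."""
    body = json.dumps(record).encode("utf-8")
    return urllib.request.Request(
        STREAMING_ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request({"event": {"eventId": "evt-12345", "action": "session_registration"}})
# To actually send the record: urllib.request.urlopen(req)  (requires network access)
print(req.get_method(), req.full_url)
```

This only illustrates the shape of the traffic; once the Integration Profile is activated, RainFocus delivers the real event stream to the same endpoint.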
Once your dataflow is complete and you have retrieved your streaming endpoint URL, you can now activate the Integration Profile in RainFocus.
By following this tutorial, you have established a connection for your RainFocus source, allowing you to stream your event management and analytics data to Experience Platform.
The following documents provide additional guidance on nuances surrounding the RainFocus source.