Authentication parameters such as the customer ID, the private key, and the authentication endpoint are configured in the instance configuration files.
The list of triggers to be processed is configured in an option in JSON format.
The triggers are used for targeting by a campaign workflow that sends emails. The campaign is set up so that a customer that has both trigger events receives an email.
Before starting this configuration, check that you are using:
You also need:
Authentication is required since the pipeline is hosted in the Adobe Experience Cloud.
It uses a pair of public and private keys. This process serves the same function as a user name and password, but is more secure.
Authentication for the Marketing Cloud is supported via an Adobe I/O project.
For Hosted customers, you can create a customer care ticket to enable your organization with Adobe I/O Technical Account Tokens for the Triggers integration.
For On Premise customers, refer to the Configuring Adobe I/O for Adobe Experience Cloud Triggers page. Note that you need to select Adobe Analytics while adding API to the Adobe I/O credential.
Once authentication is set up, the pipelined process retrieves the events. The trigger must have been generated in Adobe Analytics and pushed to the pipeline, which only processes the triggers that are configured in Adobe Campaign.
The option can also be configured with a wildcard in order to catch all triggers regardless of the name.
In Adobe Campaign, access the options menu under Administration > Platform > Options in the Explorer.
Select the NmsPipeline_Config option.
In the Value (long text) field, paste the following JSON code, which specifies two triggers. Make sure to remove the comments, since they are not valid JSON.
{
    "topics": [ // List of "topics" that the pipelined process listens to.
        {
            "name": "triggers", // Name of the first topic: triggers.
            "consumer": "customer_dev", // Name of the instance that listens. This value can be found on the monitoring page of Adobe Campaign.
            "triggers": [ // Array of triggers.
                {
                    "name": "3e8a2ba7-fccc-49bb-bdac-33ee33cf02bf", // TriggerType ID from Analytics.
                    "jsConnector": "cus:triggers.js" // JavaScript library holding the processing function.
                },
                {
                    "name": "2da3fdff-13af-4c51-8ed0-05802a572e94", // Second TriggerType ID.
                    "jsConnector": "cus:triggers.js" // The same JS library can be used for all triggers.
                }
            ]
        }
    ]
}
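Since comments are not valid JSON, the value you actually paste would look like this (same configuration, comments stripped):
{
    "topics": [
        {
            "name": "triggers",
            "consumer": "customer_dev",
            "triggers": [
                {
                    "name": "3e8a2ba7-fccc-49bb-bdac-33ee33cf02bf",
                    "jsConnector": "cus:triggers.js"
                },
                {
                    "name": "2da3fdff-13af-4c51-8ed0-05802a572e94",
                    "jsConnector": "cus:triggers.js"
                }
            ]
        }
    ]
}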
Alternatively, you can paste the following JSON code, which catches all triggers regardless of their name.
{
    "topics": [
        {
            "name": "triggers",
            "consumer": "customer_dev",
            "triggers": [
                {
                    "name": "*",
                    "jsConnector": "cus:pipeline.js"
                }
            ]
        }
    ]
}
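Each trigger entry points to a JavaScript library (here cus:triggers.js or cus:pipeline.js) that holds the processing function. As a rough illustration only, such a function could take the following shape; the function name, the event layout, and the return value are assumptions made for this sketch, not the documented Campaign API:

```javascript
// Hypothetical sketch of a processing function that a trigger library such as
// cus:triggers.js could expose. All names and fields here are assumptions.
function processPipelineEvent(event) {
  // The event is assumed to carry the trigger name and an Analytics visitor id.
  if (!event || !event.trigger) {
    return { status: "skipped", reason: "missing trigger name" };
  }
  // A real connector would store the event in a Campaign table at this point
  // (for example through the xtk APIs); this sketch only normalizes it.
  return {
    status: "processed",
    trigger: event.trigger,
    visitorId: event.visitorId || null
  };
}
```

The same library can be shared by several triggers, as in the configuration above, by branching on `event.trigger` inside the function.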
The pipeline works on a producer and consumer model. Messages are consumed per consumer: each consumer gets its own copy of the messages.
The Consumer parameter identifies the instance as one of these consumers; it is the identity under which the instance calls the pipeline. Fill it with the instance name, which can be found on the Monitoring page of the Client Console.
The pipeline service keeps track of the messages retrieved by each consumer. Using a different consumer for each instance ensures that every message is delivered to every instance.
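The per-consumer behavior can be illustrated with a minimal model (for illustration only, this is not Campaign code; the consumer name `customer_prod` is a made-up second instance):

```javascript
// Toy model of a pipeline topic that tracks a separate read offset per
// consumer, so each registered consumer receives every message.
class PipelineTopic {
  constructor() {
    this.messages = [];        // append-only message log
    this.offsets = new Map();  // consumer name -> next offset to read
  }
  publish(message) {
    this.messages.push(message);
  }
  // Each consumer reads from its own offset: one instance consuming a
  // message does not hide it from the other instances.
  poll(consumer) {
    const offset = this.offsets.get(consumer) || 0;
    const batch = this.messages.slice(offset);
    this.offsets.set(consumer, this.messages.length);
    return batch;
  }
}

const topic = new PipelineTopic();
topic.publish({ trigger: "3e8a2ba7", visitorId: "A" });
topic.publish({ trigger: "2da3fdff", visitorId: "B" });

// Two instances with distinct consumer names both receive both messages.
console.log(topic.poll("customer_dev").length);   // 2
console.log(topic.poll("customer_prod").length);  // 2
```

This is also why two instances must not share the same consumer name: they would then split one offset between them and each see only part of the messages.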
To configure the pipeline option, follow these recommendations:
You can change some internal parameters as per your load requirements but make sure to test them before putting them into production.
The list of optional parameters can be found below:
Option | Description |
---|---|
appName(Legacy) | AppID of the OAuth application registered in the Legacy OAuth application where the public key was uploaded. For more on this, refer to this page |
authGatewayEndpoint(Legacy) | URL used to get gateway tokens. Default: https://api.omniture.com |
authPrivateKey(Legacy) | The private key whose public part was uploaded in the Legacy OAuth application, AES-encrypted with the XtkKey option: cryptString("PRIVATE_KEY") |
disableAuth(Legacy) | Disables authentication. Connecting without gateway tokens is only accepted by some development pipeline endpoints. |
discoverPipelineEndpoint | URL to find the Pipeline Services endpoint to be used for this tenant. Default: https://producer-pipeline-pnw.adobe.net |
dumpStatePeriodSec | Period between two dumps of the internal state process in var/INSTANCE/pipelined.json. Internal state is also accessible on-demand here: http://INSTANCE:7781/pipelined/status |
forcedPipelineEndpoint | Disables the detection of the pipeline services endpoint and forces the specified URL instead |
monitorServerPort | The pipelined process will listen on this port to provide the internal state process here: http://INSTANCE:PORT/pipelined/status . Default is 7781 |
pointerFlushMessageCount | When this number of messages is processed, the offsets will be saved in the database. Default is 1000 |
pointerFlushPeriodSec | After this period, the offsets will be saved in the database. Default is 5 (secs) |
processingJSThreads | Number of dedicated threads processing messages with custom JS connectors. Default is 4 |
processingThreads | Number of dedicated threads processing messages with built-in code. Default is 4 |
retryPeriodSec | Delay between retries in case of processing errors. Default is 30 (secs) |
retryValiditySec | Discard the message if it is not successfully processed after this period (too many retries). Default is 300 (secs) |
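As an illustration, and assuming these parameters are set as attributes of the pipelined element in the instance configuration file (verify this against your deployment before changing anything in production), raising the thread counts for a higher load could look like:
<pipelined autoStart="true"
           processingThreads="8"
           processingJSThreads="8"
           retryPeriodSec="30"
           retryValiditySec="300"/>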
The pipelined process needs to start automatically.
To do this, set the autoStart attribute of the <pipelined> element to "true" in the configuration file:
<pipelined autoStart="true" ... />
A restart is required for the changes to take effect:
nlserver restart pipelined@instance
To validate the pipeline setup for provisioning, follow the steps below: