Offer decisioning
This guide provides a comprehensive implementation reference for offer decisioning using Adobe Journey Optimizer (AJO) Decisioning and Adobe Real-Time Customer Data Platform (RT-CDP). It is written for solution architects, marketing technologists, and implementation engineers who need centralized offer selection logic that determines the next-best offer for each customer profile across channels.
Use this guide to understand what needs to be configured, where choices exist, and what trade-offs apply to each decision.
The pattern decouples the “what to show” decision from the “where to show it” channel logic, enabling consistent, optimized offer selection across email, web, mobile app, and any other touchpoint. AJO Decisioning manages the full offer lifecycle: offer creation and catalog management, eligibility rules (who can see each offer), ranking strategies (how to select among eligible offers), placements (where offers appear), and decision policies (which bind everything together).
Use case overview
Organizations frequently need to present the most relevant offer, promotion, or incentive to each customer at the moment of interaction. Whether the interaction occurs in an email campaign, on a website homepage, within a mobile app, or at a decision point within a multi-step journey, the challenge is the same: select the optimal offer from a catalog of available options based on who the customer is, what they qualify for, and which offer is most likely to drive the desired outcome.
Offer decisioning addresses this by centralizing all offer selection logic in AJO’s Decision Management engine. Rather than hardcoding offer assignments into individual campaigns or channels, the decision engine evaluates each profile’s attributes, audience membership, and contextual signals to determine the best offer in real time. This centralization ensures that the same customer receives consistent, optimized offers regardless of which channel they engage through.
This pattern differs from known-visitor web/app personalization in scope – offer decisioning is channel-agnostic and centralized, while known-visitor personalization focuses on digital surface personalization. It differs from behavioral recommendation in catalog model – use offer decisioning when the eligible item set is governed by business rules, eligibility constraints, or regulatory requirements (promotions, financial products, incentives). Use behavioral recommendation when the item set is large, continuously changing, and selection is driven by behavioral similarity or affinity signals (product catalogs, content libraries).
Key business objectives
The following business objectives are supported by this use case pattern.
Deliver personalized customer experiences
Tailor content, offers, and messaging to individual preferences, behaviors, and lifecycle stage.
KPIs: Engagement, Conversion Rates, Customer Satisfaction (CSAT)
Drive cross-sell & upsell revenue
Promote complementary and premium products or services to existing customers based on behavior and purchase history.
KPIs: Upsell/Cross Sell %, Incremental Revenue, Customer Lifetime Value
Increase customer loyalty & lifetime value
Deepen customer relationships and maximize long-term value through loyalty programs, rewards, and personalized engagement.
KPIs: Customer Lifetime Value, Retention, Upsell/Cross Sell %
Example tactical use cases
The following scenarios illustrate how offer decisioning can be applied in practice.
- Next-best-offer in email campaigns – select the most relevant promotion per recipient at send time
- Real-time promotional banner on website – decisioning selects the offer at page load based on the visitor’s profile
- Personalized in-app card with the best incentive for the user’s lifecycle stage
- Cross-channel offer consistency – same decisioning logic serves email, web, and push so the customer sees a unified offer experience
- Dynamic coupon or discount selection based on customer value tier (e.g., high-value customers receive a premium offer)
- Product upgrade or upsell offer selection based on current subscription level
- Loyalty reward offer personalization based on tier and activity history
Key performance indicators
The following KPIs help measure the effectiveness of an offer decisioning implementation.
Use case pattern
This section describes the function chain and pattern definition for offer decisioning.
Offer decisioning
Use centralized decision logic to select the next-best offer or content for a profile across channels.
Function chain: Audience Evaluation > Offer Eligibility > Ranking Strategy > Decision Execution > Delivery > Reporting
See the Implementation options section for how each composition manifests.
Applications
The following Adobe applications are used in this use case pattern.
- Adobe Journey Optimizer (AJO) – Decision Management engine for offer creation, eligibility rules, ranking strategies, placements, and decision policies; channel configuration and message authoring for offer delivery; campaign and journey execution
- Adobe Real-Time Customer Data Platform (RT-CDP) – Audience evaluation for offer eligibility segments; profile data and computed attributes used in eligibility and ranking
- Adobe Experience Platform (AEP) – Unified profile store, identity resolution, and data foundation supporting both AJO and RT-CDP
Foundational functions
The following foundational capabilities must be in place for this use case pattern. For each function, the status indicates whether it is typically required, assumed to be pre-configured, or not applicable.
Supporting functions
The following capabilities augment this use case pattern but are not required for core execution.
Application functions
This plan exercises the following functions from the Application Function Catalog. Functions are mapped to implementation phases rather than numbered steps.
Journey Optimizer (AJO)
The following table lists AJO functions and the implementation phases where they are configured.
Real-Time CDP (RT-CDP)
The following table lists RT-CDP functions and the implementation phases where they are configured.
Prerequisites
Complete the following prerequisites before beginning implementation.
- [ ] AJO sandbox with Decision Management capabilities enabled
- [ ] User roles with Decision Management permissions (create/edit offers, placements, decisions)
- [ ] Profile schema includes attributes required for offer eligibility (e.g., loyalty tier, customer segment, subscription level)
- [ ] Profile data is current and actively ingested for eligibility attribute freshness
- [ ] For Option A (Email): Email channel surface configured with verified subdomain and warmed IP pool
- [ ] For Option B (Web/App): Web SDK implemented with AJO service enabled on the datastream; edge-active merge policy configured
- [ ] For Option C (Journey): Journey canvas permissions and at least one journey entry event or audience configured
- [ ] Offer creative assets (images, copy, CTAs) prepared for each offer and placement combination
- [ ] Fallback offer content prepared for each placement
- [ ] Audiences for offer eligibility rules defined and evaluated in RT-CDP
Implementation options
This section describes the available implementation options for offer decisioning. Each option serves a different delivery channel and use case context.
Option A: Email offer decisioning
This option is best for selecting the most relevant offer to include in outbound email campaigns – promotional emails, newsletter personalization, lifecycle emails with dynamic offer content. The decision is made at message rendering time for each recipient.
How it works
Decision policies are invoked during email message rendering to select the best offer for each recipient. The email template includes an offer placement zone where the decisioning engine inserts the selected offer’s content representation (image, HTML, or text). At send time, the engine evaluates each recipient’s profile against offer eligibility rules, applies the ranking strategy, and embeds the winning offer’s content into the email.
This approach works with both scheduled campaigns (evaluated at campaign execution time) and journey-embedded emails (evaluated when the profile reaches the message action node). The offer content – image, headline, body copy, and CTA – is personalized per recipient based on the decision outcome.
Key considerations
- Offer eligibility is evaluated at send time using the profile’s current state
- Batch audience evaluation is sufficient since decisions happen during message rendering
- Each offer needs an HTML or image content representation for the email placement
- Fallback offer must have content for every email placement used
Advantages
- Simplest implementation path – uses standard campaign or journey email delivery
- No client-side SDK requirements
- Works with existing email infrastructure and channel surfaces
- Supports large audience volumes through batch campaign execution
Limitations
- Decision is made at send time; cannot adapt to post-send behavior
- Offer content is static once the email is delivered (no real-time updates)
- Limited to profile attributes available in the hub profile store (not edge)
Experience League resources
Option B: Web/app real-time offer decisioning
This option is best for real-time offer selection on web pages or mobile apps – homepage promotional banners, account dashboard offer widgets, in-app offer cards, or any digital surface where the offer should be selected at the moment the page loads or the screen renders.
How it works
Decision policies are invoked at page load or app screen render through the Edge Network, via either the web channel or code-based experiences. When a visitor loads a page, the Web SDK sends a request to the Edge Network, which evaluates the visitor’s edge profile against offer eligibility rules and ranking strategies. The selected offer is returned in the response and rendered in the configured placement on the digital surface.
For code-based experiences, the application retrieves the decision response and renders the offer content using custom front-end logic. For web channel experiences, the AJO web channel can inject the offer content directly into the page using visual or code-based authoring.
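In a code-based experience, the front end requests the decision scope through the Web SDK and parses the returned propositions. The sketch below models that parsing step against a mocked response; the scope string, payload shape, and offer fields are assumptions to verify in your sandbox.

```javascript
// Sketch: parsing the selected offer out of a Web SDK "sendEvent" result.
// In production the result comes from a call like:
//   alloy("sendEvent", { personalization: { decisionScopes: ["<scope>"] } })
// The propositions shape below mirrors the SDK's documented response, but
// treat the exact field names as assumptions to verify in your sandbox.
function extractOfferContent(result) {
  const contents = [];
  for (const proposition of result.propositions || []) {
    for (const item of proposition.items || []) {
      // JSON content representations arrive under item.data.content
      if (item.data && item.data.content !== undefined) {
        contents.push(item.data.content);
      }
    }
  }
  return contents;
}

// Mocked Edge Network response for a hypothetical homepage-banner scope
const mockResult = {
  propositions: [{
    scope: "homepage-banner", // hypothetical decision scope
    items: [{ data: { content: { headline: "20% off", cta: "/offers/spring" } } }]
  }]
};

console.log(extractOfferContent(mockResult)[0].headline); // logs "20% off"
```

The same parsing applies whether the response carries one placement or several; an empty propositions array is the signal to render nothing (or local default content) rather than an error.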
Key considerations
- Requires Web SDK or Mobile SDK implementation with AJO service enabled on the datastream
- Edge-active merge policy is required for real-time profile lookups
- Audiences used for eligibility must support edge evaluation (simple attribute checks and segment membership)
- Offer content representations should use JSON or image URL formats for client-side rendering
- Impression tracking must be implemented to capture offer views and clicks
Advantages
- Real-time, in-session offer selection based on the visitor’s current profile state
- Sub-second decision latency via Edge Network
- Offers adapt to the most current profile data available at the edge
- Supports A/B testing of offer strategies via content experimentation
Limitations
- Requires client-side SDK implementation (Web SDK or Mobile SDK)
- Edge profile has a subset of full hub profile attributes – complex eligibility rules may not evaluate correctly
- Edge segments have segment rule complexity restrictions (no time-series queries)
- Requires front-end development for custom rendering in code-based experiences
Experience League resources
How this differs from Known-visitor web/app personalization Option B:
The infrastructure is identical — both use AJO Decisioning at the edge with Web SDK and an edge-active merge policy. The difference is the catalog governance model. This option governs a bounded offer catalog with eligibility rules, capping counters, and validity dates — use it when business or regulatory constraints determine which offers can be shown and how often. Known-visitor web/app personalization Option B selects from content items using segment membership or ranking strategies without offer lifecycle management. If your item set is large, continuously changing, and does not require capping or eligibility governance, use Known-visitor Option B instead.
Option C: Journey decision node
This option is best for offer selection within a multi-step journey – selecting the best offer at a decision point in a customer journey, then delivering it through the next action node. Use this when the offer decision is part of a broader orchestration flow with waits, conditions, and multiple message actions.
How it works
Decision policies are invoked from a decision node within an AJO journey canvas. When a profile reaches the decision node, the engine evaluates offer eligibility and ranking to select the optimal offer. The selected offer informs the next message action – which offer content to include, which channel to use, or which journey branch to take based on the offer outcome.
This approach enables adaptive journeys where the offer decision influences subsequent journey steps. For example, a journey might select the best offer, deliver it via email, wait for engagement, and then follow up with a push notification if the offer was not opened.
Key considerations
- The journey must be designed with a decision node followed by one or more message action nodes
- Offer eligibility is evaluated using the profile’s state at the moment the profile reaches the decision node
- Journey wait steps between the decision and delivery can cause the profile’s state to change
- Can combine with journey branching to take different paths based on which offer was selected
Advantages
- Integrates offer selection into multi-step orchestration flows
- Enables adaptive journeys where the offer choice influences subsequent steps
- Supports cross-channel delivery within the same journey (email, push, SMS)
- Can combine with journey conditions for post-offer engagement tracking
Limitations
- More complex to set up than standalone campaign decisioning
- Journey throughput limits apply (5,000 profiles per second entry rate)
- Decision is tied to the journey context – changes require journey versioning
- Journey must be republished for offer catalog or decision policy updates to take effect
Experience League resources
Option comparison
The following table compares the three implementation options across key criteria.
Choose the right option
Use the following guidance to select the best implementation option for your use case.
- Choose Option A if the primary use case is selecting the best offer per recipient in outbound email campaigns and no client-side SDK is available. This is the simplest implementation path and works well for promotional emails, newsletters, and lifecycle campaigns.
- Choose Option B if offers must be selected in real time at the moment a visitor loads a web page or opens a mobile app. This requires Web SDK or Mobile SDK and an edge-active merge policy but delivers the fastest, most contextual offer selection.
- Choose Option C if the offer decision is part of a broader customer journey with multiple steps, waits, and conditional branching. This is the right choice when the selected offer should influence downstream journey actions or when multi-channel follow-up based on offer engagement is needed.
- Combine options when offers must be delivered consistently across channels. Use the same decision policy across all three options to ensure a customer sees the same offer in email (Option A), on the website (Option B), and within a journey follow-up (Option C).
Implementation phases
The following phases outline the end-to-end implementation sequence for offer decisioning.
Phase 1: Validate foundational prerequisites
Application function: AEP: Data Modeling & Preparation, AEP: Identity & Profile Configuration
This phase validates that the foundational data layer supports offer decisioning. Profile schemas must include the attributes used in offer eligibility rules, and identity configuration must enable cross-channel profile resolution.
Decision: Profile attributes for eligibility
Determine which profile attributes will be used in offer eligibility rules.
Key configuration details
- Verify the profile schema includes fields referenced in eligibility rules (e.g., `_tenantId.loyaltyTier`, `_tenantId.subscriptionType`)
- Confirm offer interaction tracking schema exists for impression, click, and conversion events
- For Option B: Verify edge-active merge policy is configured and Web SDK datastream has AJO service enabled
Experience League documentation
Phase 2: Configure audience evaluation
Application function: RT-CDP: Audience Evaluation
This phase defines and evaluates the audiences used as offer eligibility criteria. These audiences determine which customer segments qualify for specific offers (e.g., “high-value customers” qualify for premium offers, “trial users” qualify for conversion offers).
Decision: Audience evaluation method
Determine how quickly audience membership must update for offer eligibility.
UI navigation: Customer > Audiences > Create audience > Build rule
Key configuration details
- Define targeting audiences for offer eligibility (e.g., “Loyalty Gold Tier,” “High-Value Customers,” “Trial Users”)
- Define suppression audiences if needed (e.g., “Recently Received Offer X”)
- For Option B: Verify that eligibility audiences qualify for edge evaluation – avoid time-series queries and complex aggregations in segment rule expressions
Where options diverge
For Option A (Email Decisioning):
Batch or streaming evaluation is sufficient. Audiences are evaluated before or during campaign execution. Complex segment rule expressions including time-based conditions and event aggregations are fully supported.
For Option B (Web/App Real-Time):
Edge evaluation is required. Audiences must use simple attribute checks or segment membership conditions. Test edge eligibility by verifying the segment rule expression qualifies for edge segmentation.
For Option C (Journey Decision Node):
Any evaluation method works depending on the journey entry criteria. If the journey uses audience-based entry, the audience evaluation method matches the journey’s requirements.
Experience League documentation
Phase 3: Set up decisioning
Application function: AJO: Decisioning
This is the core phase where the offer catalog, eligibility rules, ranking strategies, and decision policies are built. This phase creates the decision engine configuration that all delivery options (A, B, C) share.
Decision: Placement channel and content format
Determine where offers will appear and in what format.
Decision: Ranking strategy
Determine how the best offer should be selected from eligible offers.
Decision: Offer capping
Determine whether there should be limits on how many times an offer is shown.
UI navigation: Components > Decision Management > Placements / Rules / Offers / Decisions
Key configuration details
- Create placements – Define where offers appear by specifying the channel type and content format for each placement.
  - UI: Components > Decision Management > Placements
  - Create one placement per channel/format combination (e.g., “Email Hero Banner - HTML,” “Web Homepage - JSON,” “Mobile App Card - JSON”)
- Define eligibility rules – Create rules using segment rule expressions that reference profile attributes or audience membership.
  - UI: Components > Decision Management > Rules
  - Rules can reference audience membership, profile attributes (loyalty tier, subscription type), date constraints, or contextual data
- Create personalized offers – Build each offer with content representations for every placement, assign eligibility rules, set priority, and configure optional capping.
  - UI: Components > Decision Management > Offers > Create offer
  - Each offer needs a content representation per placement (e.g., HTML for email, JSON for web)
  - Assign eligibility rules to control which profiles can see each offer
  - Set offer validity dates (start/end) and optional frequency capping
  - Approve each offer to make it eligible for decisioning
- Create fallback offers – Build a default offer for each placement that is shown when no personalized offer qualifies.
  - UI: Components > Decision Management > Offers > Create fallback offer
  - Fallback must have representations for every placement used in the decision
- Create collection qualifiers and collections – Organize offers into collections using qualifier tags.
  - UI: Components > Decision Management > Collection qualifiers
  - Group related offers (e.g., “Summer Promotions,” “Loyalty Rewards”) for use in decision scopes
- Create decision policies – Bind placements, collections, ranking strategies, and fallback offers into executable decisions.
  - UI: Components > Decision Management > Decisions > Create decision
  - Each decision scope links a placement to a collection and specifies the ranking method
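To make the moving parts of this phase concrete, the sketch below imitates, in highly simplified form, how a decision scope resolves: approved offers are filtered by validity dates, eligibility, and frequency capping, ranked by priority, and the fallback is returned when none qualify. This is illustrative pseudologic, not the AJO engine; all offer names, attributes, and caps are hypothetical.

```javascript
// Illustrative sketch only (not the AJO engine): resolving a decision scope.
// Filter by approval, validity window, eligibility, and capping; rank the
// survivors by priority; fall back when nothing qualifies.
function decide(profile, offers, fallback, now = new Date()) {
  const eligible = offers.filter(o =>
    o.approved &&
    o.startDate <= now && now <= o.endDate &&      // validity window
    o.eligibility(profile) &&                      // segment rule expression
    (o.cap === undefined || o.impressions < o.cap) // frequency capping counter
  );
  if (eligible.length === 0) return fallback;      // fallback offer
  // Priority-based ranking: highest priority wins
  return eligible.reduce((best, o) => (o.priority > best.priority ? o : best));
}

const offers = [
  { name: "Premium upgrade", approved: true, priority: 10, cap: 3, impressions: 0,
    startDate: new Date("2020-01-01"), endDate: new Date("2099-01-01"),
    eligibility: p => p.loyaltyTier === "gold" },
  { name: "Standard discount", approved: true, priority: 5, impressions: 0,
    startDate: new Date("2020-01-01"), endDate: new Date("2099-01-01"),
    eligibility: () => true }
];
const fallback = { name: "Generic banner" };

decide({ loyaltyTier: "gold" }, offers, fallback).name;   // "Premium upgrade"
decide({ loyaltyTier: "silver" }, offers, fallback).name; // "Standard discount"
```

Note how the fallback path is reached only when every personalized offer fails a gate, which is why the common pitfalls section treats a constant fallback result as a symptom of unapproved offers, expired validity windows, or mismatched eligibility rules.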
Experience League documentation
Phase 4: Configure channel and surface
Application function: AJO: Channel Configuration
This phase configures the channel surfaces through which offers will be delivered. The configuration depends on which implementation option(s) are being used.
Decision: Channel type
Determine which messaging channel the use case requires.
Where options diverge
For Option A (Email Decisioning):
- UI: Administration > Channels > Channel surfaces > Create surface (Email)
- Configure subdomain, IP pool, sender name/email, reply-to, unsubscribe settings
- Verify SPF, DKIM, and DMARC records for the sending subdomain
For Option B (Web/App Real-Time):
- UI: Administration > Channels > Channel surfaces > Create surface (Web or In-app)
- For web: Configure the web surface URL pattern
- For code-based experiences: Define the surface URI for the application
- Verify the datastream has AJO service enabled
For Option C (Journey Decision Node):
- Configure channel surfaces for each channel used in the journey (email, push, SMS, or web)
- Each journey message action requires a corresponding active channel surface
Experience League documentation
Phase 5: Configure content and delivery
Application function: AJO: Message Authoring, AJO: Campaign Execution
This phase designs the message templates or experience surfaces that display the selected offer, then configures the delivery mechanism (campaign, journey, or code-based experience).
Decision: Content approach for offer rendering
Determine how the offer content should be integrated into the message or experience.
Decision: Campaign type (Option A only)
Determine whether this is a scheduled marketing campaign or an API-triggered campaign.
Where options diverge
For Option A (Email Decisioning):
- Author the email message using the Email Designer
  - UI: Campaigns > Create Campaign > Select Email > Edit content
  - Insert an offer decision component into the email layout to define the placement zone
  - Add personalization tokens for profile-level content (name, loyalty tier)
  - Configure subject line and preheader with optional personalization
- Create and configure the campaign
  - UI: Campaigns > Create Campaign > Scheduled or API-triggered
  - Bind the target audience and select the channel surface
  - Set the execution schedule or API trigger configuration
  - Review and activate the campaign
For Option B (Web/App Real-Time):
- Configure the code-based experience or web channel
  - UI: Campaigns > Create Campaign > Code-based experience (or Web)
  - Link the decision policy to the experience surface
  - Define the rendering format (JSON response for code-based; visual editor for web channel)
- Implement client-side rendering
  - Use the Web SDK `sendEvent` response to retrieve the selected offer
  - Render the offer content in the designated placement on the page
  - Implement impression and click tracking
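Impression and click tracking in a code-based experience typically means sending a proposition event back through the SDK. The sketch below builds such a payload; the `eventType` values and `_experience.decisioning` structure follow the Web SDK's proposition-tracking convention, but verify the exact schema against your sandbox before relying on it.

```javascript
// Sketch: building the XDM payload that reports an offer impression (or click)
// back through the Web SDK. The eventType values follow the proposition-
// tracking convention; verify field names against your sandbox.
function buildTrackingEvent(proposition, interaction) {
  return {
    eventType: interaction
      ? "decisioning.propositionInteract"   // click / interaction
      : "decisioning.propositionDisplay",   // impression
    _experience: {
      decisioning: {
        propositions: [{
          id: proposition.id,
          scope: proposition.scope,
          scopeDetails: proposition.scopeDetails
        }]
      }
    }
  };
}

// Hypothetical proposition returned by an earlier sendEvent call
const displayed = buildTrackingEvent(
  { id: "abc-123", scope: "homepage-banner", scopeDetails: {} },
  false
);
// Then send it, e.g.: alloy("sendEvent", { xdm: displayed });
```

Sending the display event as soon as the offer actually renders (not merely when the response arrives) keeps impression counts, and therefore capping counters, honest.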
For Option C (Journey Decision Node):
- Design the journey with a decision node
  - UI: Journeys > Create Journey > Add decision node
  - Configure the decision node to invoke the decision policy from Phase 3
- Add message action nodes after the decision
  - Configure email, push, or SMS actions that reference the selected offer
  - Add wait steps, conditions, or branching based on offer engagement
- Publish the journey
Experience League documentation
Phase 6: Test and validate
Application function: AJO: Decisioning, AJO: Message Authoring
This phase validates that the decisioning engine returns the correct offers for test profiles and that the offer content renders properly in each delivery channel.
Test decisioning logic
Use test profiles with known attributes to verify that the correct offers are selected based on eligibility and ranking.
- Create test profiles that match different eligibility criteria (e.g., Gold tier, Silver tier, Trial user)
- Verify each test profile receives the expected offer
- Verify that profiles matching no eligibility rules receive the fallback offer
Test content rendering
Preview the offer content in each delivery channel.
- For Option A: Use email preview with test profiles to verify offer content renders correctly
- For Option B: Test the Edge Decisioning response in a staging environment
- For Option C: Use journey test mode to verify the decision node selects correctly
Validate impression tracking
Confirm that offer impressions, clicks, and conversions are being tracked.
- Verify offer interaction events appear in the tracking datasets
- Confirm attribution between offer impressions and downstream conversions
Experience League documentation
Phase 7: Configure reporting and performance monitoring
Application function: AJO: Reporting & Performance Analysis
This phase sets up reporting to track offer selection distribution, acceptance rates, conversion impact, and fallback rates. It covers both AJO native reports and cross-channel analysis in Customer Journey Analytics (CJA).
Decision: Reporting method
Determine which reporting tools are needed for offer performance analysis.
Key configuration details
- AJO native reporting – Monitor campaign or journey performance using built-in reports.
  - UI: Campaigns > Select campaign > All time report (or Live report)
  - Review offer-specific metrics: impressions per offer, click-through rate per offer, fallback rate
  - Monitor delivery funnel: Targeted > Sent > Delivered > Opens > Clicks
- CJA analysis (recommended) – Build cross-channel offer performance dashboards.
  - Configure a CJA connection including AJO offer interaction datasets
  - Create a data view with offer-specific dimensions (offer name, placement, decision) and metrics (impressions, clicks, conversions)
  - Build workspace analysis for: offer selection distribution, acceptance rate by segment, revenue impact, cross-channel offer consistency
Experience League documentation
Implementation considerations
This section covers guardrails, common pitfalls, best practices, and trade-off decisions for offer decisioning implementations.
Guardrails and limits
Be aware of the following platform guardrails and limits when planning your implementation.
- Maximum of 10,000 approved personalized offers per sandbox – Decision Management guardrails
- Maximum of 30 placements per decision
- Maximum of 30 collection scopes per decision request
- AI ranking models require a minimum of 1,000 conversion events for training
- Offer capping counters may have a lag of up to a few seconds in high-throughput scenarios
- Edge decisions are limited to profile attributes available in the edge profile store
- Maximum of 4,000 segment definitions per sandbox – Platform guardrails
- Only one merge policy can be active on Edge per sandbox
- Maximum of 500 active live campaigns per sandbox
- Journey entry rate limit: 5,000 profiles per second
- Maximum of 10 channel surfaces per channel type per sandbox
Common pitfalls
Avoid these frequently encountered issues during implementation.
- Decision always returns fallback offer: This typically means personalized offers are not approved, are outside their validity date range, or eligibility rules do not match the test profile’s attributes. Verify each condition: approval status, date range, and segment rule expression accuracy. Also check that capping limits have not been reached.
- Offer not appearing in collection: Ensure the offer has been tagged with the correct collection qualifier and that the collection filter matches. Offers must be both tagged and approved to appear in collection-based decision scopes.
- Ranking formula not applied: Verify that the formula is syntactically valid and references accessible profile attributes. Formula errors silently fall back to priority-based ranking with no visible error.
- Edge delivery returns empty personalization: Ensure the datastream is configured with the Adobe Journey Optimizer service enabled and that the decision scope is correctly formatted. Verify the edge-active merge policy exists.
- Inconsistent offers across channels: If separate decision policies are used per channel, the same profile may receive different offers. Use a single decision policy across channels for consistency, or accept intentional divergence based on channel-specific placements.
- Offer content not rendering in email: Verify the offer has a content representation that matches the email placement format (HTML or image URL). Missing representations result in blank placement zones.
Best practices
Follow these recommendations for a successful offer decisioning implementation.
- Start with a small offer catalog and iterate – Begin with 5-10 offers and expand as the decisioning framework is validated. This simplifies troubleshooting and ensures eligibility rules work correctly before scaling.
- Use collection qualifiers strategically – Tag offers by category (e.g., “Acquisition,” “Retention,” “Upsell”) to enable flexible collection-based decision scopes that can be reused across campaigns and journeys.
- Always create meaningful fallback offers – Fallback offers are not just a safety net; they are the default experience for profiles that do not match any eligibility rule. Invest in fallback content that provides value even without personalization.
- Design eligibility rules to be mutually exclusive where possible – When multiple offers have overlapping eligibility, the ranking strategy becomes critical. If business requirements dictate a specific offer for a specific segment, make eligibility rules mutually exclusive rather than relying solely on ranking.
- Test with edge-representative profiles for Option B – Edge profiles contain a subset of hub profile attributes. Test with profiles that have edge-available attributes to ensure eligibility evaluates correctly in production.
- Monitor fallback rates as a health metric – A high fallback rate (above 20-30%) indicates that the offer catalog does not cover enough customer segments. Expand the offer catalog or broaden eligibility rules.
- Version decision policies rather than editing live ones – Create a new decision policy version rather than modifying an active one. This prevents disruption to live campaigns and enables A/B comparison of decision strategies.
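The fallback-rate health check described above reduces to a simple ratio that can be wired into any monitoring job; the impression counts and 30% threshold below are illustrative.

```javascript
// Sketch: fallback rate as a catalog-health signal. Impression counts are
// hypothetical inputs you would pull from offer reporting.
function fallbackRate(fallbackImpressions, totalImpressions) {
  return totalImpressions === 0 ? 0 : fallbackImpressions / totalImpressions;
}

const rate = fallbackRate(320, 1000); // 0.32
if (rate > 0.3) {
  console.log("Fallback rate above 30%: expand the catalog or broaden eligibility rules");
}
```

Tracking this ratio per placement (rather than in aggregate) makes it easier to spot which surface's offer coverage is thin.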
Trade-off decisions
Consider the following trade-offs when making architectural and configuration decisions.
Eligibility precision vs. offer coverage
Tight eligibility rules ensure each offer reaches only the most relevant profiles, but may result in high fallback rates when profiles do not match any offer. Broad eligibility rules maximize offer coverage but reduce personalization precision.
- Tight eligibility favors: Higher acceptance rates, better personalization, lower offer fatigue
- Broad eligibility favors: Lower fallback rates, more profiles receive personalized offers, simpler rule management
- Recommendation: Start with broader eligibility rules and tighten them based on performance data. Monitor fallback rates and acceptance rates to find the right balance. Use ranking strategies to differentiate among broadly eligible offers.
Priority-based vs. AI-ranked ranking
Priority-based ranking gives the business full control over which offers are shown, while AI-ranked ranking optimizes for conversion but reduces human control over offer selection.
- Priority-based favors: Business control, predictability, no training data requirement, immediate deployment
- AI-ranked favors: Conversion optimization, discovery of unexpected patterns, automatic adaptation to changing customer behavior
- Recommendation: Use priority-based ranking for initial launches and regulatory-sensitive offers where business control is paramount. Transition to AI-ranked for high-volume, performance-optimized use cases once sufficient conversion data (1,000+ events) is available.
Single decision policy vs. per-channel decision policies
A single decision policy ensures offer consistency across all channels but constrains per-channel optimization. Per-channel policies allow channel-specific ranking and eligibility but risk inconsistent customer experiences.
- Single policy favors: Cross-channel consistency, simpler management, unified reporting
- Per-channel policies favor: Channel-optimized ranking, channel-specific eligibility (e.g., web-only offers), independent iteration
- Recommendation: Start with a single decision policy for cross-channel consistency. Create per-channel policies only when business requirements demand channel-specific offer strategies (e.g., web-exclusive flash sales).
Hub decisioning (Option A/C) vs. edge decisioning (Option B)
Hub decisioning has access to the full profile but operates at send time. Edge decisioning operates in real time with sub-second latency but is limited to edge-available profile attributes.
- Hub decisioning favors: Access to full profile data, complex eligibility rules, batch campaign volumes
- Edge decisioning favors: Real-time context, in-session personalization, sub-second response
- Recommendation: Use hub decisioning for outbound channels (email, push) where full profile data improves offer relevance. Use edge decisioning for inbound channels (web, app) where real-time response is critical. Ensure eligibility rules for edge use only edge-available attributes.
Related documentation
The following resources provide additional detail on the components used in this use case pattern.