Basic architecture of Adobe Experience Platform
Last update: February 14, 2025
CREATED FOR:
- Beginner
- Leader
- Developer
- Admin
- User
Learn the high-level architecture of Adobe Experience Platform from a guided walkthrough of an overview diagram.

Transcript
Hi, this is Nick Hecht, Solution Architect on the Adobe Experience Platform team. I'm going to walk through an overview diagram we've created that illustrates the primary components of Adobe Experience Platform.

In this diagram, we can see that there are many different options for ingesting data into Experience Platform, and depending on whether we choose to stream or batch data in, the path of data ingestion changes. One way to stream data is with the Adobe Experience Platform Web SDK, a client-side JavaScript library that interacts with the various services in the Experience Cloud through the Adobe Experience Platform Edge Network. The Edge Network is used to facilitate real-time experiences. We can also batch data in through our Batch Ingestion API or by landing batch files in a file storage location, such as FTP or a cloud storage location. We also provide an Adobe-provisioned data landing zone where customers can upload files for ingestion into Experience Platform. For certain enterprise sources and applications, we've also created connectors that connect directly to those sources and ingest from them; this is an ever-growing catalog of connectors available in the Experience Platform user interface. We also have native connectors for Adobe applications, such as Adobe Analytics, Audience Manager, and others, that can feed their data directly into Platform as well.

Once this data is ingested into Platform, a few things happen. First, the data is placed into the data lake in its raw form. Any data that is streaming is also placed on the Experience Platform pipeline for consumption by other services as fast as possible, for example to run real-time segmentation on the Edge Network or to forward streaming events to partners using event forwarding. Any data we onboard from the various sources is stored in the data lake as datasets, comprised of batches and files, that can be accessed by the various Platform components. Second, any data we have configured to process into the real-time customer profile is flagged for immediate processing into the identity graph and the profile store. This happens in a streaming fashion for data streamed into Platform, and in a batch fashion for data that lands in the data lake, as soon as the batch is fully uploaded.

We have other services that are native to Platform as well, such as access controls, which allow us to set specific permissions and limit capabilities for specific users. Additionally, we have data governance controls to ensure the proper use and governance of the data for various use cases. We also have the Experience Platform data model (XDM), which provides a common taxonomy, or in other words a common data model, for any data that is stored in Experience Platform. This data model can be extended to account for custom data attributes that are unique to your business or your data model. The benefit of aligning to a common data model is that it provides a consistent definition of the data, which makes it easier to stitch the data into unified profiles and enables us to quickly understand and leverage the data in Experience Platform.

Moving on to some other components, we have our data insights capabilities: Query Service and Intelligent Services. Query Service provides SQL access to the data lake, so SQL queries can be built and the data can be analyzed and queried in its raw form.
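To make this concrete, here is a minimal sketch of submitting such a SQL query through the Query Service API, assuming standard Platform API authentication (bearer token, API key, organization and sandbox headers). The dataset name, credentials, and query are hypothetical placeholders for illustration, not values from this video.

```typescript
// Minimal sketch: submit a SQL statement to Query Service over the Platform API.
// All credentials and the dataset name "web_events" are hypothetical placeholders.
const QUERY_SERVICE_URL = "https://platform.adobe.io/data/foundation/query/queries";

interface QueryRequest {
  dbName: string;      // logical database to query, e.g. "prod:all"
  sql: string;         // SQL statement to run against data lake datasets
  name?: string;
  description?: string;
}

async function runDataLakeQuery(accessToken: string): Promise<unknown> {
  const request: QueryRequest = {
    dbName: "prod:all",
    name: "daily-page-view-counts",
    description: "Count page views per day from a raw event dataset",
    sql: `
      SELECT DATE(timestamp) AS day, COUNT(*) AS page_views
      FROM web_events            -- hypothetical dataset name
      WHERE eventType = 'web.webpagedetails.pageViews'
      GROUP BY DATE(timestamp)
      ORDER BY day DESC
      LIMIT 30
    `,
  };

  const response = await fetch(QUERY_SERVICE_URL, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "x-api-key": "YOUR-API-KEY",        // placeholder
      "x-gw-ims-org-id": "YOUR-ORG-ID",   // placeholder
      "x-sandbox-name": "prod",           // placeholder
      "Content-Type": "application/json",
    },
    body: JSON.stringify(request),
  });
  // The created query runs asynchronously; the returned object describes it.
  return response.json();
}
```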
Query Service also includes a connector so that external applications, such as business intelligence tools and other SQL clients, can connect directly to it and access and visualize the data in the data lake through those tools. Intelligent Services offers out-of-the-box AI services that help with predicting customer propensity, customer journey impact, content intelligence, and more. Because Intelligent Services is built on Experience Platform, the insights it generates are immediately stored in the data lake and in the real-time customer profile, allowing any of your downstream services to take action on those insights.

As part of Platform, we also have segmentation capabilities that operate against the real-time customer profile. This includes batch segmentation, which segments profiles in bulk to, for example, power scheduled campaigns or build audiences of customers. And of course, there is also real-time and streaming segmentation. Streaming segmentation works by evaluating data as it streams into the system and into profiles, while real-time segmentation works on the edge to power real-time experiences such as web and mobile personalization. Once the rules for a segment are met, the customer profile immediately becomes a member of that segment, and notifications can be sent to downstream systems for actioning. This information is stored in the real-time customer profile as a segment membership, so that other applications and destinations also have access to the membership.

We also have audience composition capabilities, which allow us to split and test audiences, onboard external audiences, and manage all of our audiences in one central location.

For business users, such as analysts, marketers, and advertisers, we have several applications that sit on top of Experience Platform, not to mention an ever-growing array of integrations within the Experience Cloud. These include Adobe Campaign, Journey Optimizer, Customer Journey Analytics, Target, and others, which are covered in more depth in application-specific videos and guides. For this video, we will focus primarily on the three application services that are built on top of Platform and shown here.

The first application service is Customer Journey Analytics, which enables cross-channel customer journey analysis. Any data that exists in the data lake, including stitched profiles or other datasets onboarded from various sources, can be ingested and stitched together. We can then visualize, filter, and explore this data to uncover insights about the end-to-end customer journey.

The second application service is the Real-Time Customer Data Platform (Real-Time CDP), which is built on the real-time customer profile and Platform's segmentation capabilities. Real-Time CDP allows activating customer data and audiences to destinations. For example, audiences can be shared with email systems, social ad networks, and other applications. Real-Time CDP allows for activation of not only anonymous profiles but also known customer profiles.

The third application service built on Adobe Experience Platform is Adobe Journey Optimizer. This application enables us to deliver customer experiences across channels at any point of the customer journey. Journeys can include cart abandonment campaigns, registration confirmations, or even location-based mobile messages.
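A journey like the cart-abandonment example above is typically driven by commerce events streamed into Platform. The sketch below shows one way such an event might be sent from a web page with the Web SDK described earlier; the datastream ID, org ID, and product values are hypothetical placeholders, and the field names follow the standard XDM commerce field group.

```typescript
// Minimal sketch: stream an add-to-cart experience event through the Web SDK
// (alloy.js) and the Edge Network. Assumes the Web SDK library is already
// loaded on the page; all IDs and field values are hypothetical placeholders.
declare const alloy: (
  command: string,
  options?: Record<string, unknown>
) => Promise<unknown>;

async function sendAddToCartEvent(): Promise<void> {
  // One-time SDK configuration, usually done once at page load.
  await alloy("configure", {
    datastreamId: "YOUR-DATASTREAM-ID",   // placeholder
    orgId: "YOUR-IMS-ORG-ID@AdobeOrg",    // placeholder
  });

  // Stream an XDM commerce event; once it reaches the real-time customer
  // profile, it can qualify the customer for an audience or help trigger a
  // journey such as cart abandonment.
  await alloy("sendEvent", {
    xdm: {
      eventType: "commerce.productListAdds",
      commerce: { productListAdds: { value: 1 } },
      productListItems: [
        { SKU: "SKU-12345", name: "Running shoes", quantity: 1, priceTotal: 89.95 },
      ],
    },
  });
}

sendAddToCartEvent().catch(console.error);
```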
The profile events streaming into Platform trigger these journeys and initiate customer interactions and outbound messages through Journey Optimizer.

So these are examples of some of the current services and applications built on top of Platform. However, Platform can also be integrated with other partner and customer applications through the Experience Platform APIs. Some customers, for example, are integrating their own customer support application directly with the real-time customer profile to give support agents a real-time view of customer context, such as which campaigns and audiences that customer is a member of, or which product recommendations or offers would be best to provide that customer given the rich context that is available. We will continue to build out native integrations with a number of partner applications in the ecosystem, both to enhance the data and insights in the customer profile and to make the customer profile available within those applications.

With this, you should now have a good understanding of the basic Platform components and architecture. We hope this architecture overview was helpful and gave you a glimpse of what Experience Platform is capable of and how it can help you and your business build effective and engaging customer experiences.
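As a closing illustration of the kind of API integration mentioned in the transcript, such as a support application reading a customer's context, here is a minimal sketch of looking up a profile by email through the Profile Access API. The credentials, the email identity namespace, and the lookup value are hypothetical placeholders for the example.

```typescript
// Minimal sketch: read a real-time customer profile entity through the
// Profile Access API, e.g. from a customer-support application. The token,
// API key, org/sandbox values, and lookup email are hypothetical placeholders.
const PROFILE_ACCESS_URL = "https://platform.adobe.io/data/core/ups/access/entities";

async function getProfileByEmail(accessToken: string, email: string): Promise<unknown> {
  const params = new URLSearchParams({
    "schema.name": "_xdm.context.profile", // request the profile entity
    entityId: email,                        // identity value to look up
    entityIdNS: "email",                    // identity namespace (assumed)
  });

  const response = await fetch(`${PROFILE_ACCESS_URL}?${params}`, {
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "x-api-key": "YOUR-API-KEY",      // placeholder
      "x-gw-ims-org-id": "YOUR-ORG-ID", // placeholder
      "x-sandbox-name": "prod",         // placeholder
    },
  });
  // The response carries profile attributes (and, if requested, segment
  // memberships) that a support agent's view could surface.
  return response.json();
}

// Example call with placeholder values:
// getProfileByEmail("eyJhbGciOi...", "customer@example.com").then(console.log);
```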