Plan your data model
Topics: Schemas
CREATED FOR:
- Beginner
- Developer
This video reviews what to do before you start building your schemas in Adobe Experience Platform. Document your business use cases, understand your Platform license, know the product guardrails, and identify what data to ingest before finalizing your data model. For more information, please visit the schemas documentation.
Transcript
In this video, I want to talk about things you should do before you start building your schemas. Schemas are a critical foundation of a Platform implementation, and so you'll want to build them with consideration. I'll cover documenting your business use cases, knowing what you've licensed, knowing the constraints of your license and the product, identifying what data to ingest, and creating a Platform-centric entity relationship diagram, or ERD. The last topic is pretty large, and I'm going to save it for another video.

Before you start building your data models, it's imperative that you work with your business stakeholders to understand and document their key business use cases. I can't emphasize this enough: like with all Adobe digital experience products, your business use cases should be the driver of the technical implementation. You can document these in whatever format or system works for your company, but do document them. Some of them can definitely impact the design decisions you make as the data architect. So document them, and then review them again with your stakeholders so there aren't any surprises later on when the implementation is up and running. The good news is, these use cases should have already been fleshed out during the sales process, and they informed which parts of Platform were purchased, including any app services and other applications.

One resource we use in both the sales and delivery process is digital experience blueprints. Blueprints are repeatable implementations that can help you solve established business problems. They are available on Experience League, and each one explains the use cases it addresses, lists the products involved, and contains architecture diagrams and links to relevant enablement content. So find out which blueprints your stakeholders are expecting to use.

Your company's segmentation goals are one of the most important topics to discuss with your stakeholders before defining your data models. Let me illustrate why this is important to do upfront. As you probably know, there are two types of data in Platform: record and time series. Record data is for the current attributes of a customer; time series is for the customer's actions. Loyalty system data is typically modeled as record data. A customer might have 30,000 loyalty points, and if you built a segment for customers with 30,000 points, they would be in it. But what if your marketers wanted to be able to personalize content based on loyalty point transactions, say customers who earned 10,000 loyalty points in the last month or spent 100,000 points in the last year? You might be able to patch your record schema with some additional fields in a pinch, but that's starting to sound like the data should have been modeled as time series data. The modeling decisions you make impact every downstream user of Platform, so it's important to know what those users expect to be able to do.
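To make the loyalty example concrete, here is an illustrative sketch (plain Python, not actual XDM syntax) of the same loyalty data modeled both ways. The field names and numbers are hypothetical; the point is that a single record attribute only answers "how many points does this customer have right now," while time-series events preserve each transaction so you can segment on behavior over a time window.

```python
from datetime import datetime, timedelta

# Record data: the customer's current state. One row per customer,
# overwritten as the profile changes. Hypothetical field names.
loyalty_record = {
    "customerId": "CUST-1001",
    "loyaltyPoints": 30000,   # answers: "who has 30,000 points today?"
}

# Time-series data: one event per loyalty transaction, each with a timestamp.
# This is what you need to answer "who earned 10,000 points in the last month?"
loyalty_events = [
    {"customerId": "CUST-1001", "timestamp": datetime(2024, 5, 3),  "pointsDelta": 4000},
    {"customerId": "CUST-1001", "timestamp": datetime(2024, 5, 21), "pointsDelta": 7000},
    {"customerId": "CUST-1001", "timestamp": datetime(2024, 4, 2),  "pointsDelta": -2500},
]

def earned_in_window(events, customer_id, days):
    """Sum points earned by a customer within the trailing window."""
    cutoff = datetime(2024, 6, 1) - timedelta(days=days)  # fixed "now" for the example
    return sum(
        e["pointsDelta"]
        for e in events
        if e["customerId"] == customer_id
        and e["timestamp"] >= cutoff
        and e["pointsDelta"] > 0
    )

# With only the record, this question can't be answered; with the events, it can.
print(earned_in_window(loyalty_events, "CUST-1001", days=30))  # 11000
```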
Be sure to know which Platform package or packages you purchased, which application or intelligence services were purchased, and which other Adobe or third-party applications you intend to use. For example, while many Platform packages and use cases include Real-Time Customer Profile, some don't. There are modeling considerations that must be taken into account for Real-Time Customer Profile, which may or may not apply to you. Specific field groups are required for intelligence services like Customer AI and Attribution AI, which is also critical to understand at the outset. Adobe applications and services that send data to Platform typically have their own specific schemas that are created in your account when they're provisioned. As the data architect, you should be aware of those.
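One practical way to stay aware of those provisioned schemas is to inventory what already exists in your sandbox before designing anything new, either in the Schemas workspace in the UI or through the Schema Registry API. The sketch below is illustrative rather than a prescribed procedure: it assumes you already have a Platform API credential, and the token, client ID, org ID, and sandbox name are placeholders. Confirm the endpoint and headers against the current Schema Registry API reference for your environment.

```python
import requests

# Placeholder credentials: obtain a real access token from your configured
# Adobe Developer Console credential before running this.
ACCESS_TOKEN = "<ACCESS_TOKEN>"
API_KEY = "<CLIENT_ID>"
ORG_ID = "<IMS_ORG_ID>"
SANDBOX = "prod"

# List schemas in the tenant container of the Schema Registry.
response = requests.get(
    "https://platform.adobe.io/data/foundation/schemaregistry/tenant/schemas",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "x-api-key": API_KEY,
        "x-gw-ims-org-id": ORG_ID,
        "x-sandbox-name": SANDBOX,
        "Accept": "application/vnd.adobe.xed-id+json",  # short-form schema descriptors
    },
)
response.raise_for_status()

# Print each schema's title and $id so you can see what is already provisioned.
for schema in response.json().get("results", []):
    print(schema.get("title"), "->", schema.get("$id"))
```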
In addition to which Platform features you have access to, your contract might call out some key parameters that could impact what data you ingest into Platform and how you use it. For example, say your contract limits you to 10 million profiles. You wouldn't want to ingest data from a system that would automatically create 20 million profiles. It becomes even more important to understand these limitations as you move beyond your primary use cases and data sources and consider expanding your use of Experience Platform.
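A rough, back-of-envelope estimate of how many profiles each candidate source would contribute can help you catch this before enabling a dataset for Profile. The source names and counts below are hypothetical; the sketch only shows the kind of check worth running against your licensed entitlement.

```python
# Hypothetical source inventory: estimated distinct people each system would add
# to Real-Time Customer Profile if enabled. Replace with your own counts.
candidate_sources = {
    "crm_contacts": 8_000_000,
    "loyalty_members": 5_000_000,
    "legacy_email_list": 20_000_000,   # the problem source from the example above
}

LICENSED_PROFILES = 10_000_000  # from your contract

running_total = 0
for source, estimated_profiles in candidate_sources.items():
    running_total += estimated_profiles
    status = "OK" if running_total <= LICENSED_PROFILES else "OVER LICENSED LIMIT"
    print(f"{source:<20} +{estimated_profiles:>12,} -> total {running_total:>12,}  {status}")
```

In practice, identity stitching means profile counts from different sources are not simply additive, so treat a sum like this as an upper bound and refine it with your identity data.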
Also, Platform has some guardrails. These change, so I don't want to go into a lot of detail, but there is documentation of them at the moment covering things like how many datasets you can use with Real-Time Customer Profile, how many relationships between schemas we recommend, and the number of segments in a sandbox. Some of these are hard limits, while others are recommendations to keep the system performing quickly. Review this documentation before and while you're designing your data model.

The last important thing I want to cover in this video, since we'll save the entity relationship diagram for a separate one, is to understand your data sources. What data should be brought into Platform and why? Are you ingesting data into the data lake to run machine learning models and use Customer Journey Analytics? Do you need data in Real-Time Customer Profile for marketing activation? Maybe you're doing both. You should be more selective about which data you ingest into Profile because of the license and performance considerations I mentioned earlier. Focus first on the primary data sources needed to address your key use cases, but also start thinking about how you would model data from secondary sources so your model is ready to scale in the future. So those are some things to think about before you start building your schemas. Good luck.