Streaming data ingestion overview
- Topics:
- Data Ingestion
CREATED FOR:
- Beginner
- Developer
Using Experience Platform’s streaming ingestion, you can be sure that any data you send will be available in the Real-Time Customer Profile. This data can be captured from CRM and ERP systems, or from any other source that can communicate over HTTP or public cloud streaming infrastructure. You can then use this data for real-time segmentation and to activate other marketing processes. For more information, please visit the Streaming Ingestion documentation.
Transcript
In this video, I’m going to give you a quick introduction to streaming ingestion with Adobe Experience Platform. At Adobe, we believe people buy experiences, not products. And to deliver powerful and compelling experiences, you need up-to-date information so you can make meaningful decisions in milliseconds. So what does this mean? It means providing the fastest path to update a Real-Time Customer Profile. It means being able to convert hundreds of thousands of signals into actionable insights as quickly as possible. Using Experience Platform’s streaming ingestion, you can be sure that any data you send will be available in the Real-Time Customer Profile. This data can be captured on your website or mobile apps, from CRM and ERP systems, or from any other source that can communicate over HTTP or public cloud streaming infrastructure. Data sent through streaming to Experience Platform is stitched together with existing data in the Real-Time Customer Profile in real time. You can then use this data for real-time segmentation and to activate other marketing processes. With the Adobe Experience Platform extension in the Launch interface, you can quickly configure your websites to send data directly to Experience Platform. If you’re sending personally identifiable information or other sensitive data to Platform, you can configure data collection to be secure and prevent data from untrusted sources.
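As a rough illustration of what "communicating over HTTP" with streaming ingestion looks like, the sketch below assembles an XDM-formatted JSON payload of the kind you would POST to a streaming inlet. The inlet URL, tenant, schema ID, and field names are placeholders for illustration, not real Experience Platform identifiers; consult the Streaming Ingestion documentation for the exact payload contract.

```python
import json

# Placeholder inlet endpoint -- substitute your own connection ID.
INLET_URL = "https://dcs.adobedc.net/collection/<CONNECTION_ID>"

# Placeholder schema reference -- substitute your tenant and schema ID.
SCHEMA_ID = "https://ns.adobe.com/<TENANT>/schemas/<SCHEMA_ID>"
SCHEMA_REF = {
    "id": SCHEMA_ID,
    "contentType": "application/vnd.adobe.xed-full+json;version=1",
}

def build_payload(email: str) -> dict:
    """Assemble a minimal XDM-formatted JSON payload: a header naming the
    target schema, and a body whose xdmEntity holds the actual record."""
    return {
        "header": {"schemaRef": SCHEMA_REF},
        "body": {
            "xdmMeta": {"schemaRef": SCHEMA_REF},
            "xdmEntity": {
                # Illustrative XDM field; your schema defines the real fields.
                "personalEmail": {"address": email},
            },
        },
    }

payload = build_payload("ada@example.com")
print(json.dumps(payload, indent=2))
```

The important idea is that every streamed record carries its schema reference with it, which is what lets Platform validate the record and stitch it into the Real-Time Customer Profile.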
To help make debugging easier, streaming ingestion provides two modes of validation: synchronous and asynchronous. In synchronous mode, streaming ingestion will drop records that fail XDM validation and include the reason for the failure. These failed records will not be sent downstream. Synchronous validation provides immediate feedback to help you maintain velocity as you figure out how to work with XDM-formatted JSON payloads.
In asynchronous mode, where data loss must be prevented, streaming validation will detect bad data and move it to a separate location in the Experience Platform Data Lake.
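The choice between the two validation modes above can be sketched as a choice of request URL plus response handling. The `syncValidation` query parameter and the error-response shape below are assumptions made for illustration; check the Streaming Ingestion API reference for the actual parameter and response format.

```python
from urllib.parse import urlencode

# Placeholder inlet endpoint -- substitute your own connection ID.
BASE_INLET = "https://dcs.adobedc.net/collection/<CONNECTION_ID>"

def inlet_url(synchronous: bool) -> str:
    """Build the ingestion URL. Opting into synchronous validation is
    modeled here as a syncValidation query parameter (an assumption)."""
    if synchronous:
        return BASE_INLET + "?" + urlencode({"syncValidation": "true"})
    return BASE_INLET

def summarize_response(status: int, body: dict) -> str:
    """Interpret a hypothetical synchronous-validation response.
    A failed record is dropped (not sent downstream) and the reason
    for the XDM validation failure is returned immediately."""
    if status == 400:
        # Illustrative error shape only.
        return "dropped: " + body.get("message", "XDM validation failed")
    return "accepted"
```

In asynchronous mode you would POST to the plain inlet URL and later inspect the quarantined records in the Data Lake, rather than reading a failure reason out of the HTTP response.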
Once data starts flowing, you can monitor it in the Platform interface. Experience Platform lets you observe incoming data all the way from collection to consumption in the Data Lake and the Real-Time Customer Profile. All streaming-ingested data lands in the Experience Platform Data Lake and is available with a maximum latency of 15 minutes.
In parallel, the ingested data builds out identity graphs and the Real-Time Customer Profile. To learn more about streaming ingestion, head on over to the Adobe Experience Platform documentation. I hope you enjoyed this overview of streaming ingestion in Adobe Experience Platform.