Adobe Data Collection
Adobe Experience Platform provides a suite of technologies that let you collect customer experience data from client-side sources and send it to the Adobe Experience Platform Edge Network, where it can be enriched, transformed, and distributed to Adobe or non-Adobe destinations in seconds.
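For example, once the Experience Platform Web SDK is deployed on a page, a single `sendEvent` call forwards an XDM event to the Edge Network. The following is a minimal sketch: the XDM payload is a placeholder, and your own schema and field values will differ.

```typescript
// Minimal sketch of sending an event to the Edge Network with the
// Experience Platform Web SDK. The SDK installs a global command
// function (named `alloy` by default); it is declared here so the
// sketch compiles standalone.
declare function alloy(command: string, options?: object): Promise<object>;

// Send a page-view event. The XDM payload below is a placeholder;
// your own schema will define the available fields.
alloy("sendEvent", {
  xdm: {
    eventType: "web.webpagedetails.pageViews",
    web: {
      webPageDetails: {
        name: "home",
        URL: "https://example.com/",
      },
    },
  },
});
```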
The sources integration for Adobe Experience Platform Data Collection lets you access your Edge Network data collection configurations through the sources catalog, including Data Prep for Data Collection and improved support for warnings.
Use the sources workspace to access Data Collection
In the Platform UI, select Sources from the left navigation bar to access the Sources workspace. The Catalog screen displays a variety of sources with which you can create an account.
Select the appropriate category from the catalog on the left side of the screen, or use the search option to find the specific source you want to work with.
Under the Adobe applications category, select Adobe Data Collection, and then select Set up.
The Data Collection UI appears on the Datastreams tab.
A datastream is a configuration that tells the Edge Network where to send your data. Specifically, it determines which Experience Cloud products receive the data, and how the data is handled and stored in each product.
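For illustration, the sketch below shows how a datastream ID is wired into the Web SDK's `configure` command. Recent SDK releases accept it through the `datastreamId` option (older releases called it `edgeConfigId`); both IDs shown are hypothetical placeholders.

```typescript
// Minimal sketch of pointing the Web SDK at a datastream. Both IDs
// below are placeholders; use your own datastream ID (shown in the
// Datastreams tab) and your IMS organization ID.
declare function alloy(command: string, options?: object): Promise<object>;

alloy("configure", {
  datastreamId: "ebebf826-a01f-4458-8cec-ef61de241c93", // placeholder
  orgId: "ADB3LETTERSANDNUMBERS@AdobeOrg", // placeholder
});
```

Every event the SDK subsequently sends is routed according to this datastream's configuration, so changing destinations is a matter of editing the datastream rather than the page code.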
For comprehensive steps on how to configure data collection in the UI, see the data collection end-to-end overview.
Next steps
By reading this document, you have learned how to access the Data Collection UI using the sources workspace. For more information on Data Collection, see the Data Collection overview.