Configure explore specifications for Self-Serve Sources (Batch SDK)
Last update: August 9, 2023
Topics: Sources
Created for: Developer
Explore specifications define the parameters required for exploring and inspecting the objects contained in your source. They also define the format of the response that is returned when objects are explored and inspected.
Explore specifications are hard-coded; you can copy and paste the payload below directly into your connection specification.
"exploreSpec": {
"name": "Resource",
"type": "Resource",
"requestSpec": {
"$schema": "http://json-schema.org/draft-07/schema#",
"type": "object"
},
"responseSpec": {
"$schema": "http: //json-schema.org/draft-07/schema#",
"type": "object",
"properties": {
"format": {
"type": "string"
},
"schema": {
"type": "object",
"properties": {
"columns": {
"type": "array",
"items": {
"type": "object",
"properties": {
"name": {
"type": "string"
},
"type": {
"type": "string"
}
}
}
}
}
},
"data": {
"type": "array",
"items": {
"type": "object"
}
}
}
}
}
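For reference, the sketch below shows where `exploreSpec` sits inside a connection specification. This is a minimal, hypothetical outline: the surrounding keys (`name`, `authSpec`, `sourceSpec`) are placeholders, a complete connection specification contains additional fields described in the other configuration guides, and the `responseSpec` is abbreviated here. In practice, you would paste the full payload shown above.

```json
{
  "name": "example-source",
  "authSpec": [],
  "sourceSpec": {},
  "exploreSpec": {
    "name": "Resource",
    "type": "Resource",
    "requestSpec": {
      "$schema": "http://json-schema.org/draft-07/schema#",
      "type": "object"
    },
    "responseSpec": {
      "$schema": "http://json-schema.org/draft-07/schema#",
      "type": "object"
    }
  }
}
```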
| Explore specifications | Description | Example |
| --- | --- | --- |
| `name` | Defines the name or identifier of the explore specification. | `Resource` |
| `type` | Defines the type of the explore specification. | `Resource` |
| `requestSpec` | Contains the parameters required to explore objects in the connection. | |
| `requestSpec.type` | Defines the data type of the request specification. | `object` |
| `responseSpec` | Contains the parameters that define the format of the response message returned against an explore call. | |
| `responseSpec.type` | Defines the data type of the response specification. | `object` |
| `responseSpec.properties` | Contains information pertaining to how the response message is formatted. | |
| `responseSpec.properties.format` | Defines the formatting of the response schema. | |
| `responseSpec.properties.format.type` | Defines the data type of `format`. | `string` |
| `responseSpec.properties.schema` | Contains information pertaining to how the response schema is formatted. | |
| `responseSpec.properties.schema.type` | Defines the data type of the schema. | `object` |
| `responseSpec.properties.schema.properties` | Contains information on the columns, types, and items held within the schema. | |
| `responseSpec.properties.schema.properties.columns.items.properties.name` | Displays the name of the column. | |
| `responseSpec.properties.schema.properties.columns.items.properties.name.type` | Defines the data type of the column name. | `string` |
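To make the response format concrete, the following is a hypothetical explore response that conforms to the `responseSpec` above. The `format` value, column names, and row values are invented for illustration and will vary by source.

```json
{
  "format": "flat",
  "schema": {
    "columns": [
      { "name": "id", "type": "string" },
      { "name": "createdAt", "type": "string" }
    ]
  },
  "data": [
    { "id": "1001", "createdAt": "2023-08-09T00:00:00Z" }
  ]
}
```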
Next steps
With your explore specifications populated, you can proceed to create a complete connection specification using the Flow Service API. See the Self-Serve Sources (Batch SDK) API guide for more information.