Error “Table not provisioned for dataset” when querying datasets
If you encounter the error "Table not provisioned for dataset" when running a query against a dataset in Adobe Experience Platform or Adobe Journey Optimizer, it means the dataset doesn't contain data in the Data Lake, even though it might contain profile data. The issue is commonly caused by ingestion methods that populate Profile but don't write data to the Data Lake.
Identify the dataset type and ingestion method to confirm whether the behavior is expected or whether corrective action is needed.
Description
Environment
- Adobe Experience Platform (AEP)
- Adobe Journey Optimizer (AJO)
Issue/Symptoms
- Queries against a dataset fail with the error Table not provisioned for dataset.
- The error occurs whether you use Query Service or an external database client.
Root cause
The error doesn’t indicate a provisioning failure. It means the dataset has no data stored in the Data Lake. This occurs when either no ingestion happened or when data was ingested using Partial Row Updates, which writes to the Unified Profile Service but not to the Data Lake.
Adobe Journey Optimizer also uses Partial Row Updates for:
- AJO Consent Service Dataset
- AJO Push Profile Dataset
- The Update Profile action in journeys
Querying these datasets always results in the Table not provisioned for dataset message because the system doesn’t store that data in the Data Lake by design.
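The root cause can be illustrated with a conceptual sketch (not Adobe's actual implementation): Partial Row Updates write to the profile store but never create a table in the Data Lake, so any query against the lake fails. All class, method, and dataset names below are illustrative assumptions.

```python
class TableNotProvisionedError(Exception):
    """Raised when a dataset has no backing table in the Data Lake."""


class Platform:
    """Toy model of the two storage layers (illustrative only)."""

    def __init__(self):
        self.profile_store = {}  # stands in for the Unified Profile Service
        self.data_lake = {}      # stands in for Data Lake tables

    def partial_row_update(self, dataset, record):
        # Partial Row Updates populate Profile only --
        # nothing is written to the Data Lake.
        self.profile_store.setdefault(dataset, []).append(record)

    def batch_ingest(self, dataset, record):
        # Full ingestion lands data in both stores.
        self.partial_row_update(dataset, record)
        self.data_lake.setdefault(dataset, []).append(record)

    def query(self, dataset):
        # Queries only see the Data Lake; a dataset with profile data
        # but no lake table produces the error from this article.
        if dataset not in self.data_lake:
            raise TableNotProvisionedError(
                f"Table not provisioned for dataset: {dataset}")
        return self.data_lake[dataset]


platform = Platform()
platform.partial_row_update("ajo_push_profile", {"pushToken": "abc"})
try:
    platform.query("ajo_push_profile")
except TableNotProvisionedError as err:
    # Profile holds the record, but the lake table was never created.
    print(err)
```

The key point the sketch captures: the error is about the absence of data in the Data Lake, not about a failed provisioning step.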
Resolution
- Identify the dataset you’re querying. If it’s the AJO Consent Service Dataset or AJO Push Profile Dataset, the error is expected. These datasets ingest data directly into Profile, not into the Data Lake, so nothing is available to query.
- Identify the ingestion method used. The error is expected when you ingest data using the Update Profile action in a journey or using Partial Row Updates. No corrective action is required.
- If the dataset isn't a system dataset and none of the above conditions apply, the error indicates that no data was ingested into the Data Lake. Review your ingestion methods and dataflows to confirm that data is landing there.
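The triage steps above can be sketched as a small helper. The dataset names come from this article, but the ingestion-method labels are illustrative assumptions, not official identifiers.

```python
# System datasets that ingest directly into Profile, never the Data Lake.
PROFILE_ONLY_SYSTEM_DATASETS = {
    "AJO Consent Service Dataset",
    "AJO Push Profile Dataset",
}

# Ingestion methods that update Profile without writing to the Data Lake
# (labels are assumptions for illustration).
PROFILE_ONLY_INGESTION_METHODS = {
    "partial_row_update",
    "journey_update_profile_action",
}


def is_error_expected(dataset_name: str, ingestion_method: str) -> bool:
    """Return True when 'Table not provisioned for dataset' is expected
    behavior (profile-only dataset or ingestion), and False when the
    ingestion setup should be reviewed instead."""
    if dataset_name in PROFILE_ONLY_SYSTEM_DATASETS:
        return True
    return ingestion_method in PROFILE_ONLY_INGESTION_METHODS


print(is_error_expected("AJO Push Profile Dataset", "system"))  # True
print(is_error_expected("Loyalty Events", "batch_ingestion"))   # False
```

When the helper returns `False`, the remaining step from the list applies: inspect the dataset's dataflows to find out why nothing reached the Data Lake.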