Databricks connector fails due to expired SAS token in AEP

In AEP, the Databricks connector stops working after infrastructure changes or when the SAS (Shared Access Signature) token in the Spark configuration expires or is refreshed. To fix this, update your Spark configuration with a valid SAS token retrieved from the landing zone credentials API.

Description

Environment

  • Product: Adobe Experience Platform Real-Time CDP
  • Source: Databricks connector
  • Storage: Azure Blob Storage

Issue/Symptoms

  • The Databricks connector stops working after a certain date (typically once the SAS token in the Spark configuration has expired).
  • Data preview works when creating a new mapping, but dataflows fail during execution.
  • The error message indicates an authentication failure with Azure Blob Storage, for example: "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature."
  • The underlying cause is an outdated or changed SAS token in the Spark configuration.

Resolution

To fix this issue, follow these steps (a code sketch of steps 1-4 appears after the list):

  1. Retrieve new credentials for the Databricks source by calling the Data Landing Zone credentials API endpoint.
  2. Obtain the updated SAS token from the API response.
  3. Update your Spark configuration settings in your Databricks workspace with the new SAS token value.
  4. Save and apply the updated Spark configuration to your Databricks environment.
  5. Re-run your dataflow.
  6. Verify that dataflow execution completes successfully and no authentication errors occur.
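
The following sketch illustrates steps 1-4. The credentials endpoint path, the "type" query-parameter value, the request headers, and the response field names are assumptions based on the general Platform Sources API pattern, so verify them against the current Data Landing Zone and Sources documentation before use; placeholders in angle brackets are values from your own organization.

```python
import requests
from pyspark.sql import SparkSession

# Steps 1-2: retrieve fresh Data Landing Zone credentials for the Databricks source.
# NOTE: endpoint path, "type" value, header names, and response field names are
# assumptions based on the Platform Sources API pattern -- verify in the current docs.
PLATFORM_HOST = "https://platform.adobe.io"
headers = {
    "Authorization": "Bearer <ACCESS_TOKEN>",   # IMS access token
    "x-api-key": "<API_KEY>",
    "x-gw-ims-org-id": "<ORG_ID>",
    "x-sandbox-name": "<SANDBOX_NAME>",
}

resp = requests.get(
    f"{PLATFORM_HOST}/data/foundation/connectors/landingzone/credentials",
    headers=headers,
    params={"type": "dlz_databricks_source"},   # assumed type value for the Databricks source
    timeout=30,
)
resp.raise_for_status()
creds = resp.json()

# Field names below are illustrative; inspect the actual response payload.
sas_token = creds["SASToken"]
container = creds["containerName"]
storage_account = creds["storageAccountName"]

# Steps 3-4: apply the refreshed SAS token to the Spark configuration.
# In a Databricks notebook this updates the current session; for a cluster-wide fix,
# put the same key/value pair in the cluster's Spark config field and restart the cluster.
spark = SparkSession.builder.getOrCreate()
spark.conf.set(
    f"fs.azure.sas.{container}.{storage_account}.blob.core.windows.net",
    sas_token,
)
```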

Note:
The "sv" value in the SAS token indicates the signed storage service version and can change whenever the token is refreshed or regenerated, whether automatically on expiry or manually through the credentials API. Always make sure the SAS token in your Spark configuration is valid and not expired so that authentication with Azure Blob Storage succeeds.
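
If authentication errors persist, one quick check is whether the token currently in the Spark configuration has already expired. Below is a minimal sketch that reads the "se" (signed expiry) parameter from a SAS token; it assumes the standard Azure SAS query-string format and is purely illustrative.

```python
from datetime import datetime, timezone
from urllib.parse import parse_qs

def sas_token_is_valid(sas_token: str) -> bool:
    """Return True if the token's "se" (signed expiry) timestamp is still in the future.

    Assumes the standard Azure SAS query-string format, e.g.
    "sv=2023-01-03&se=2030-01-01T00%3A00%3A00Z&sr=c&sp=rl&sig=...".
    """
    params = parse_qs(sas_token.lstrip("?"))
    expiry_values = params.get("se")
    if not expiry_values:
        return False  # no expiry parameter present; treat the token as invalid
    expiry = datetime.fromisoformat(expiry_values[0].replace("Z", "+00:00"))
    return expiry > datetime.now(timezone.utc)

# Hypothetical example:
# sas_token_is_valid("sv=2023-01-03&se=2030-01-01T00:00:00Z&sr=c&sp=rl&sig=abc")  # -> True
```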
