Analyze and visualize omni-channel insights in Tableau using Query Service
Last update: February 14, 2025
- Topics:
- Queries
CREATED FOR:
- Intermediate
- Developer
Learn how you can use Adobe Experience Platform’s Query Service with external data visualization tools using a churn analysis example. For more information, please visit the Query Service documentation.
Transcript
Hello, everybody. In this video, I will walk you through Adobe Experience Platform Query Service and cover how it can help you analyze data and connect to external data visualization tools, such as Tableau. Query Service is a powerful SQL-based analytics tool that helps customers unlock profile, audience, activation, journey, and operational insights on top of readily available unified customer data with high concurrency. Query Service also allows you to query and transform omnichannel big data to the right granularity and power enriched analysis in downstream BI and machine learning platforms, or for ingestion into the Real-Time Customer Profile.

CitySignal is a telecom company interested in analyzing customer churn behavior. They want to understand the primary reasons and factors that drive customers away from CitySignal. CitySignal wants to use Query Service to understand customer churn behavior and leverage the platform to combine multiple data sources into a unified dataset that can be analyzed in Tableau. Here are some of the questions they want to answer. So let’s take a look at how CitySignal can accomplish these goals.

The first thing we need to do is ingest data into Adobe Experience Platform. On the Sources page, I already have a data source created and connected to Amazon S3, a cloud storage system. We can also ingest data in other ways, such as Azure Blob Storage, Google Cloud Storage, or SFTP, to name a few. We have four datasets for CitySignal customers, which contain data for churn, call center, profile, and websites. Each of these data sources is tied to a schema and a dataset. If we go to the Datasets section, we can see the four datasets created based on the four files we ingested using S3. Next, we’ll go to the Query Service UI and run some basic queries to confirm that the data makes sense. We usually call this stage of the process SQL Explorer.
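A query in the SQL Explorer stage might look like the sketch below. The table and column names (cs_demo_churn, churn_date, churn_flag) are hypothetical stand-ins for CitySignal's actual schema, not names shown in the video:

```sql
-- Hypothetical sketch: total churned customers by month.
-- Table and column names are illustrative placeholders.
SELECT date_format(churn_date, 'yyyy-MM') AS churn_month,
       COUNT(*)                           AS churned_customers
FROM   cs_demo_churn
WHERE  churn_flag = 1
GROUP BY date_format(churn_date, 'yyyy-MM')
ORDER BY churn_month;
```

The bill-range query would follow the same pattern, grouping customers into revenue buckets instead of months.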
Let’s click New Query to get a list of total churned customers by month. Click Play to run the query. Once it executes, we can see how many churned customers we have for the last six months. The next query I’ll run shows the monthly bill range and the total count of customers tied to each revenue range. Based on the output, we can see that most customers are paying more than $150 a month. Note that we can run the same query in a PostgreSQL client, which you can see here. In addition, we can also use the Query Service REST API to programmatically write, execute, and schedule queries.

The next step is to consolidate these separate datasets into a single dataset. We call this step of the process SQL Prep, where we write a SQL query and output a brand-new custom dataset that meets your business requirements. Once we write a query, we can either click the output dataset option or do it using a SQL query. In this case, I have written a CTAS, or Create Table As Select, query to combine these datasets. Note that I used an underscore in the alias names to populate the output dataset. Given that I already ran this query earlier, I’ll now go to the Datasets tab, where we can see the new dataset called cs_demo_output. If you take a quick peek at the columns, we can see that all columns have an underscore in them, which matches the aliases we used in the CTAS query.

In the last step of the demo, I will integrate the consolidated dataset with Tableau. We call this process SQL Interactive, or BI tool integration. We need to go to the Credentials tab in Query Service, which has all the necessary authentication information needed to connect to any external platform via psql. Next, in the Tableau desktop app, we enter the psql credentials to connect to Query Service. My Tableau instance is already connected to Query Service.
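The SQL Prep consolidation step could be sketched as a CTAS query like the following. The dataset names, join keys, and column aliases are illustrative assumptions (the video does not show the full query); note the underscores in the aliases, which is why they reappear in the output dataset's column names:

```sql
-- Hypothetical CTAS sketch: combine the four ingested datasets into one.
-- All table and column names are placeholders for CitySignal's schema.
CREATE TABLE cs_demo_output AS
SELECT p.customerid  AS customer_id,
       p.monthlybill AS monthly_bill,
       c.churnflag   AS churn_flag,
       c.churnreason AS churn_reason,
       cc.callcount  AS call_count,
       w.webvisits   AS web_visits
FROM   cs_demo_profile     p
LEFT JOIN cs_demo_churn      c  ON p.customerid = c.customerid
LEFT JOIN cs_demo_callcenter cc ON p.customerid = cc.customerid
LEFT JOIN cs_demo_website    w  ON p.customerid = w.customerid;
```

Running the CTAS query both materializes the result and registers cs_demo_output as a new dataset, which is why it then appears in the Datasets tab.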
Now that we’re connected, you can see all the columns we created as part of the CTAS query that we ran in the Query Service UI. Now we’ll go to the dashboard I created, where you can see that from October 2020 to March 2021, CitySignal had almost 50,000 customers with an average monthly spend of $136. We can also see that about 7,000 customers churned during that time, which gives us a churn rate of 14%. If we drill deeper, we can see a breakdown of churned and existing customers by service. We can also look at the churned customers by month in the form of a line graph, followed by the reason for churn, which is going to be very valuable for CitySignal. In this case, we can see that 29% of customers churned because they found a better deal elsewhere. Finally, we have a breakdown of the monthly bill range, how long customers were members before churning, and the total churn rate across the US West Coast. The dashboard contains a lot of different sheets, which you can see in this Tableau file. All of this ties back to the original goals CitySignal set for this analysis. You have just seen how Adobe Experience Platform Query Service provides customers with a rich set of SQL-based analytic capabilities to surface rich insights on customer data, collected from all channels and devices, in external visualization tools like Tableau. Thanks for watching.
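For reference, the connection that Tableau (or any PostgreSQL-compatible client) makes using the Credentials tab might look like this sketch. Every value here is a placeholder; copy the actual host, database, username, and password from your own Credentials tab in Query Service:

```shell
# Hypothetical sketch: connect an external psql client to Query Service.
# host, dbname, user, and password below are placeholders, not real values.
psql "sslmode=require host=example.platform.query.adobe.io port=80 \
      dbname=all user=EXAMPLE_ORG@AdobeOrg password=EXAMPLE_TOKEN"
```

Tableau's PostgreSQL connector takes these same values through its connection dialog rather than a connection string.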