Why Dashboards Don’t Deliver Insights (And What Actually Does)

“We need a dashboard that gives us insights.” Sound familiar? The truth is, dashboards aren’t designed to deliver insights—they’re just the starting point.

In this session, you’ll learn a practical framework using three distinct tools—dashboards, narrative reports, and deep analysis—to actually uncover the why behind the numbers. Walk away with a clear system to turn data into action and finally escape the endless cycle of empty dashboards.

Transcript

Alright everybody, welcome to the Skill Exchange. Today we’re going to talk about something I’m super passionate about, which is dashboards and especially insights. I have a lot to cover, so we will jump right on into it. Here’s a list of the things that we’re going to cover today. We’re going to first talk about what an insight is, how we should define it, and what ingredients make one up. I have a little bit of a thought experiment to talk about how we should think about insight generation, and then we’re going to discuss how this system of insight generation that I’m proposing functions, and then we’re going to put it all together at the end. But first, a little bit about me.

When I’m not thinking about data, which is a vanishingly small amount of time these days, my true passions really lie in the outdoors. I love photography, I love hiking, and I especially like taking pictures while I’m hiking. I’m currently located very close to the Olympic National Forest, which is my favorite place to be. I have three kids and one dog, and I’m currently the director of insights and experience at BlastEx Consulting. But enough about me, let’s bust right on into this. So I do wish that we were in the same room right now so that I could see by show of hands just how many people have heard some version of the following lament: we just need a dashboard that’s going to get us more insights. And today I want to talk about why this mindset is fundamentally flawed and how you can set up a system that actually will generate insights.

So what is an insight? What even is one, anyway? Before we can really understand what it takes to find an insight, we need to better establish what it is that we’re looking for. So I’m going to go through a series of slides here, and I want everybody to mentally take note when you believe that you have heard something that qualifies as an insight. All right, so: our orders are down by 15% this month compared to last month.

On the first of this month, we launched our new website redesign. The majority of the decrease in orders can be attributed to a 25% decrease in add to cart rate once the new site redesign launched.

The add to cart rate decrease is limited to a specific product type, type A, we’ll call it. The rest of the product types are performing just fine. So does anybody think we found an insight yet? If not, we’ll just keep on going. In the aggregate, average page load time across the entire site has slowed slightly this month.

And while the overall slowdown in page load times is slight, type A products have slowed by an average of two seconds compared to last month. Is this an insight yet? I think a lot of people at this point may have mentally said that we’re at the point where we found an insight, but I think we can continue to go a little bit further.

Product images for product type A were not optimized for mobile within the new redesign. I think at this point, this is where the wheels are starting to turn for a lot of people. I think we’ve got all of the ingredients and we just need to connect some dots here to bring us to the finish line.

So putting it together, we see conversions dropped by 15% this month because mobile page load time increased by two seconds for the PDPs (product detail pages) of a particularly popular product type, because images for this product type were not optimized properly in our most recent site redesign. Boom. Now maybe you raised your hand before this, but for me, this is the point where I feel we have finally reached what I would consider an insight. Now, importantly, this doesn’t mean the previous items weren’t of any value. In fact, they were important data points. They were important observations that, when put together in the right moment with the right context, eventually led us to an insight.

So back to the definition, what is an insight? There are so many definitions out there, but for the purposes of this presentation, I would like to add my own definition to the list. An insight is a shift in understanding that reveals critical patterns or anomalies and, when strategically applied to the right context, drives meaningful change. Let that sink in for a second. At its core, an insight comes from the interaction of two essential ingredients: patterns or anomalies, and the right context. Now, you can have all the data and all the context in the world, beautifully organized and meticulously tracked, but without revealing patterns or anomalies, it’s just static. And conversely, you could find a super interesting blip or a breakthrough or a trend, but if you apply it to the wrong context, it’s not an insight. It’s just noise.

Take Alexander Fleming, for example. When he noticed mold on his Petri dish that was killing bacteria, in the right context that anomaly became the discovery of penicillin, a genuinely history-shaping insight. But that same moldy dish in my refrigerator during spring cleaning? I would have tossed it in the trash without a second thought. It’s the exact same pattern, the exact same anomaly, but a completely different context, and therefore no insight. So without the connection of these two ingredients, we will constantly find ourselves to be data rich and insight poor.

So what do I mean when I say patterns or anomalies? What might you be looking for? I find it extremely helpful to anchor myself on the things that the psychologist and fellow insight obsessive Gary Klein says make up an insight.

So he breaks it down into four categories: connections, contradictions, coincidences, and curiosities. Connections: this is the realization of a relationship between two or more pieces of information that were previously seen as unrelated. Contradictions are represented by an inconsistency between what is expected or believed and what is actually observed or experienced. Coincidences are unexpected alignments or correlations between events or pieces of information. And then finally we have curiosities. These are things that intrigue you or puzzle you. They are anomalies or oddities that don’t really fit into your existing expectations.

So as our definition states, data and observations only become insights when you combine the right set of patterns or anomalies with the right set of contextual information. It seems easy, but this contextual information can come from a huge number of locations. Each anomalous data point has nearly an infinite number of directions you can investigate. You can add filters, segments, audiences. You can compare it to previously established targets or any number of dimensional breakdowns. It can be other correlated metrics. It could be external data points or your own historical experiences or intuition. This infinitely flexible requirement for finding connections and contradictions and curiosities means dashboards are kind of doomed to fail from the start. They’re too rigid and by their very nature they could never satisfy the requirement of flexibility needed to connect all of these different things to find insights.

So when thinking about this, it kind of highlights just how absurd it would be to expect to be able to make the perfect dashboard that was able to deliver the perfect insights every time. I mean, imagine you’d have to know in advance exactly the right segments, drop-down filters, date ranges, dimensions, and the most relevant breakdowns, metrics, and even the right visualizations that would be needed and necessary to find the insight before anything has even happened yet. Before you’ve even collected that data. I mean, forget about hitting a moving target. That’s like trying to hit the bull’s eye on a target that hasn’t even been created yet.

So what does this mean? Does this mean that dashboards are dead? Absolutely not. Dashboards absolutely have their role and a role that I would say is even vital in the process of insight gathering. But ultimately you need a different tool set to find that ever elusive insight. So to illustrate the role that dashboards have in the insight generation process and what other systems are needed along the way, I want to present to you a thought exercise. So think for a second about the following three tools. Binoculars, a magnifying glass, and a microscope. They all magnify things, but they have completely different purposes. And in fact, you’d look pretty silly using one of them in the wrong place. Imagine pulling out binoculars in a situation that necessitated a microscope or pulling out a magnifying glass in a situation that required binoculars. It would be really easy in those moments to blame the tool for not getting you the results that you needed, but at some point you would need to recognize that you’re using the wrong tool for the job.

Binoculars, they are best used from a vantage point. They’re best used while standing still, when you need to look at something in relation to the bigger picture. Magnifying glasses are more nimble. They move around with you. They don’t go too deep.

Microscopes are best when you’ve found a singular thing that you want to investigate more deeply, to break it down to its elements and to understand it, to understand why it is what it is. So binoculars, these are your dashboards. They pick out things from the big picture. They are not meant to be nimble. Think about standing on top of a fire lookout, for example. The binoculars tell you where you should go investigate for a fire, but then when you walk over to that thing that’s worth investigating, you don’t pull your binoculars back out to look at something that’s on the ground. That would be silly. You need something that’s more purpose-built. So that’s where your magnifying glass is going to come into play. It doesn’t go too deep, but it does look at things more closely to explain a narrative. This is going to be our narrative reports. And then when you’ve found something that’s truly worth inspecting and going deep on, that’s where you really want your scientific process. This is your analysis, where you focus deeply on that particular topic and then you come out of it with some hypothesis to test out. So in my proposed system here, each tool has its place, and when you have the system set up properly, each tool feeds into one another.

All right, so let’s start with dashboards. Let’s talk about these tools individually: how should we think about them in this system, and what goes into them? Well, obviously, the first thing you need is clean data. Nothing goes anywhere here if your data is in poor shape. If you cannot trust your data, you’ve got to start there. That said, once you’ve got the foundation of clean data laid, you obviously need some reporting requirements. And I always like formatting these requirements in the form of user stories. So think something like: as a [role], I need to monitor [specific KPIs] in order to [make a certain decision] so that I can [achieve a specific outcome] at [a particular cadence]. And I will know that it’s working when I receive [a specific signal, metric, trend, or alert condition]. So putting this together, an example: as a digital marketer, I need to monitor cost per acquisition, ROAS, and click-through rate by campaign in order to pause underperforming campaigns or reallocate budget efficiently on a weekly cadence, so that I can maximize return on ad spend without overspending. And I will know it’s working when at least 80% of my campaigns are above target ROAS.
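To make those reporting requirements concrete, here’s a minimal sketch of how you might capture each user story as a structured record; the class and field names are illustrative, not something prescribed in the talk:

```python
from dataclasses import dataclass

@dataclass
class ReportingRequirement:
    """One dashboard user story: who monitors what, to decide what, how often."""
    role: str            # "As a <role>..."
    kpis: list[str]      # "...I need to monitor <specific KPIs>..."
    decision: str        # "...in order to <make a certain decision>..."
    outcome: str         # "...so that I can <achieve a specific outcome>..."
    cadence: str         # "...at <a particular cadence>..."
    success_signal: str  # "...and I will know it's working when <signal>."

# The digital-marketer example from the talk, expressed as a record:
requirement = ReportingRequirement(
    role="digital marketer",
    kpis=["cost per acquisition", "ROAS", "click-through rate by campaign"],
    decision="pause underperforming campaigns or reallocate budget",
    outcome="maximize return on ad spend without overspending",
    cadence="weekly",
    success_signal="at least 80% of campaigns are above target ROAS",
)
```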

So when it comes to dashboards, what tools should we use for these dashboards? The beauty of Adobe Analytics is you just have so many options depending on your audience. You can use Report Builder. You can use Power BI, Analysis Workspace. Just make sure that it’s a centralized location so that everybody that needs access to it can get it when they need it. Pay special attention to cross-platform needs as well. Do they need a mobile version? I love that Analysis Workspace has a fantastic option for mobile dashboards for executives. I’ve found that to be quite successful.

So when you’re building these dashboards, though, what are some best practices? We definitely do not want to add every possible breakdown of every dimension that stakeholders could possibly want, with the intent of preemptively answering the whys that may come up. These are not analysis and insight-gathering tools. What you should do, however, is first off, include targets. Every dashboard should be able to answer at least one core question: is this number good? The only way we can know this is by including clear targets or goals beforehand.

And don’t avoid setting targets just because you’re afraid of being wrong. This is a common thing. The worst thing is having to figure out if your numbers are good after the fact. So setting targets isn’t always easy, but it’s essential for this context. If you want some really great practical advice, I would check out the book Analytics the Right Way by Tim Wilson and Joe Sutherland. It’s full of helpful tips on working with your stakeholders to set the right targets. And I can’t recommend this book enough to any analytics professional.

I started reading that after I started this presentation, and there’s a ton of overlap. So be sure to check it out. But in short, no dashboard would be complete without clear targets.
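As a tiny illustration of what answering “is this number good?” looks like once a target exists, here’s a sketch of a status check; the function and its wording are my own, assumed for illustration:

```python
def target_status(actual: float, target: float, higher_is_better: bool = True) -> str:
    """Answer the dashboard's core question: is this number good?"""
    on_track = actual >= target if higher_is_better else actual <= target
    return "on target" if on_track else "off target"

# e.g., a weekly ROAS of 3.2 against a target of 4.0:
print(target_status(3.2, 4.0))  # -> off target
```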

While we don’t want to overwhelm our stakeholders with tons of breakdowns and dropdowns and segments, it is important to work with your stakeholders to set what I would call first-layer breakdowns. These are going to be your first places to look, so to speak. If a trend looks funny or you see an anomaly, what’s the first dimension that you’re going to use to break it down, to investigate it just a little bit more? Another thing to make sure you do is to rebrand your dashboards internally as performance management scoreboards. I know I’m mixing up metaphors just a little bit here, but like I said, these are not insight-gathering machines. They are scoreboards, so that anybody in the stadium, or in this case the company, can look up at the scoreboard and know if we’re winning or if we’re losing. Another pro tip here is to remember to review the usage of these dashboards regularly. If they’re not getting used, they’re not adding value. So go find out why and adjust accordingly.
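Circling back to those first-layer breakdowns: one lightweight way to make them explicit is a simple, agreed-upon mapping from each KPI to the first dimension you reach for when it looks off. The pairings below are hypothetical examples, not recommendations from the talk:

```python
# Hypothetical first-layer breakdowns agreed with stakeholders:
first_layer_breakdowns = {
    "conversion rate": "device type",
    "orders": "product type",
    "cost per acquisition": "campaign",
}

def first_place_to_look(kpi: str) -> str:
    """If a KPI trends funny, which dimension do we break it down by first?"""
    return first_layer_breakdowns.get(kpi, "no first-layer breakdown agreed yet")
```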

So that’s our dashboards and our binoculars. Let’s move on to the next step, which is our magnifying glass, or our narrative reports. To level set, what is a narrative report? A narrative report is just short, contextual commentary that’s layered on top of performance data. So it sounds like a dashboard, but with context and with commentary. It doesn’t just say what happened; it suggests reasons why it may have happened and where to look next.

So when you’re building these, what are the ingredients that go into narrative reports? First, you’ve got your business context. And that context can come from lots of different places. It can come from outside the dashboard’s data, or even outside the data set entirely. It could be a recent product launch. It could be a shift in the market. It could be something a competitor was doing, or a change in ad spend, or just a holiday weekend. Any number of those things are very important context to consider when putting together this narrative.

Another thing to put in here is any important questions that were raised from the dashboard.

These should be answered here if possible. If your dashboard shows a drop in conversion, this report should explain plausible causes, but without going too deep. If there is not a readily apparent answer that you or regular users of the report can identify within just a few minutes of thought, offer some hypotheses worth examining and document them within the report. I’ll explain shortly where these hypotheses go.

Lastly, there needs to be a bit of an established cadence. Narrative reports are going to be the most effective when they’re consistent.

You’ve got those text boxes that you can add directly onto your dashboard. I think those are a wonderful place to start adding context for these narrative reports. And a pro tip: it’s helpful to make a copy of that dashboard, so the copy can serve as the narrative report while your dashboard remains unchanged. And you can set the one with the narrative reports to send out on a regular basis. So what are the outputs of these narrative reports? They’re going to be your deeper questions, the ones that couldn’t be answered quickly, with a quick drop-down filter, or immediately with just a single layer of context added from a subject matter expert. Most importantly, you’re going to get potential hypotheses and observations and ideas coming out of here. These are going to be your candidates for deeper investigation and for A/B tests.

You’re essentially curating as you’re doing this a backlog of things that are worth analyzing.

Some do’s and don’ts with your narrative reports here. First, don’t just regurgitate in sentence form what the charts are saying. And second, as mentioned previously, don’t try to answer all of the most complex deep questions in the narrative reports. If it takes more than a minute to answer, log it as a question to be answered later for deeper analysis.

Some things that you should do: answer those simple questions, and put the text right in the dashboard. And another topic that could probably be its own entire presentation is that I think this is the part of the framework where AI is going to have the biggest near-term impact. The reason is that the inputs for these dashboards and narrative reports, the metrics, campaign calendars, targets, and user goals, are all structured, clear, easily readable, and feedable into an LLM that can automatically generate helpful narratives, flag notable changes, suggest likely causes, and even recommend next steps or questions to ask. So let AI handle this part as much as possible, and let your analysts review and curate the best outputs.
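To sketch what “let AI handle this part” might look like in practice: because the inputs are structured, you can assemble them into a prompt programmatically. The function below is a minimal illustration; `call_llm` stands in for whatever LLM client you actually use and is hypothetical:

```python
def build_narrative_prompt(metrics: dict, targets: dict, context_notes: list[str]) -> str:
    """Assemble structured dashboard inputs into a prompt so an LLM can draft
    a first-pass narrative report (an analyst still reviews and curates it)."""
    lines = [
        "You are drafting a weekly narrative report. Be brief and factual.",
        "Metrics vs. targets:",
    ]
    for name, value in metrics.items():
        lines.append(f"- {name}: {value} (target: {targets.get(name, 'none set')})")
    lines.append("Business context:")
    lines.extend(f"- {note}" for note in context_notes)
    lines.append(
        "Flag notable changes, suggest likely causes, and list any questions "
        "worth logging for deeper analysis."
    )
    return "\n".join(lines)

prompt = build_narrative_prompt(
    metrics={"conversion rate": "2.1% (down 15% month over month)"},
    targets={"conversion rate": "2.5%"},
    context_notes=["Site redesign launched on the 1st", "Holiday weekend mid-month"],
)
# response = call_llm(prompt)  # hypothetical client; swap in your own LLM API
```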

And lastly, within your narrative reports, and to be honest, within the dashboards themselves, we should definitely have some sort of a link to a hypothesis and observations intake form.

So I’ve mentioned that; let’s talk about what that looks like. Where do all of these questions and hypotheses that you’ve been gathering with these dashboards and these narrative reports go? They go into these intake forms. A critical output of the dashboards, like I said, is going to be areas that need further analysis and potential hypotheses to be validated. These intake forms can come in many different shapes, and I’m not really married to any one of them; there are a lot of different ways you can set up a form, whether that be Google Forms or Airtable or Microsoft Forms. But some things I would make sure to have on there. First: what type of entry is this? A couple of options: is it an observation, a question that needs answering, or a hypothesis? An observation is something that may or may not lead to a connection elsewhere. Like, hey, I noticed that users who perform X action have an abnormally low conversion rate compared to the rest of users. It’s an interesting observation that could be used later on. A question that may need answering: is the money we’re spending on Google cross-network ads yielding a positive ROI, or should we deploy that capital elsewhere? It’s a valid question, worth looking at. Or maybe it’s a hypothesis, something that comes with a lot of thought behind it: I believe that incentivizing our users to use a particular feature within their first seven days on the product is really going to increase their lifetime retention.

So you’ve got that: is it a hypothesis, a question, or an observation? And then have them put in a brief description of their entry, kind of like what I just said there. Then there should be a link to some sort of supporting data if possible. This could be a link to the dashboard or the narrative report they were just looking at, but you know what? It doesn’t have to be. It could be any number of things: a conversation they had with a colleague, or their personal experience on the website or a competitor’s site. My goal right now is really to lower the barrier to entry for these analysis ideas. I don’t wanna make it prohibitively complicated to get these ideas in here. I’m really trying to increase the data literacy and the interest of the teams. So right now, the more entries the better. You really just wanna get the juices flowing, then provide instruction later on and course correct where you feel it’s necessary. I find that these entries into these forms are really important data points in and of themselves. They’re a window into the mind of the users of the data. How good is the quality of their questions? What’s their data literacy like? This is a treasure trove of information that you can use internally with change management.
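Whatever tool the form lives in, the underlying record is simple. Here’s a minimal sketch of the fields just described; the names are assumptions for illustration:

```python
from dataclasses import dataclass
from enum import Enum

class EntryType(Enum):
    OBSERVATION = "observation"
    QUESTION = "question"
    HYPOTHESIS = "hypothesis"

@dataclass
class IntakeEntry:
    """One row in the hypothesis-and-observations intake form."""
    entry_type: EntryType
    description: str
    supporting_link: str = ""  # dashboard, narrative report, or anything else
    submitted_by: str = ""

entry = IntakeEntry(
    entry_type=EntryType.HYPOTHESIS,
    description=(
        "Incentivizing users to try a particular feature within their first "
        "seven days will increase lifetime retention."
    ),
    supporting_link="https://example.com/dashboards/retention",  # hypothetical URL
)
```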

So I have included in my deck here some templates that you can use and you can reuse. I’ve got a Google Sheets and an Airtable option here. You can scan these QR codes. I’m particularly a fan of Airtable, but like I said, these can be built anywhere.

Once these ideas and observations and hypotheses have been submitted, it’s the job of your more seasoned analysts to go and look at all of these line items and then prioritize them based on things like: how difficult would it be to answer this question or validate this hypothesis? What’s the potential impact? And what’s the priority based on current internal initiatives, business goals, et cetera? So we’ve got all these things. We’ve begun to prioritize them. We’ve put in a lot of work. Now let’s start talking about the microscope. We’ve got the team aligned. We’ve identified a great, prioritized list of really important questions that need to be answered or hypotheses that should be validated. And before we move on, I want everybody to notice that the path we’ve taken to get here has not been random. It’s been very carefully curated and prepared. We didn’t get here by accident; it was a very deliberate process. But now, like I said, it’s time for the analyst to shine. This is the moment we’ve been waiting for: our analysis.
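Here’s a minimal sketch of that prioritization pass, assuming the reviewing analyst scores each entry 1 to 5 on impact, difficulty, and alignment with current initiatives; the scoring formula and weights are illustrative, not from the talk:

```python
def priority_score(impact: int, difficulty: int, alignment: int) -> float:
    """Higher impact and alignment float to the top; higher difficulty sinks.
    All inputs are 1-5 ratings from the reviewing analyst."""
    return (impact * 2 + alignment) / difficulty

backlog = [
    {"entry": "Cross-network ads ROI question", "impact": 4, "difficulty": 2, "alignment": 5},
    {"entry": "Day-7 feature adoption hypothesis", "impact": 5, "difficulty": 4, "alignment": 4},
]
ranked = sorted(
    backlog,
    key=lambda i: priority_score(i["impact"], i["difficulty"], i["alignment"]),
    reverse=True,
)
for item in ranked:
    print(item["entry"])
```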

So the inputs at this stage, like I said, are very ready to go. You’ve got your hypotheses that are grounded in observed trends. You’ve got your context from the business. You’ve got your questions raised by decision makers and other analysts. You’ve got all the ingredients now that you need to find the insights with intent-driven analysis. All of these steps that we’ve been doing along the way, think of them as seeds that we’ve planted. Now we’re harvesting that value. And when you do this right, what comes out on the other side, your outputs, are going to be recommendations that the business can actually act on. They’re gonna be A/B test ideas that are really gonna move the needle. And most importantly, they’re going to be measurable and attributable improvements to your performance.

So speaking of that attribution, I think at this point we’re at probably the most important step, which is closing the loop on the business impact of your findings and recommendations. So if you, like me, are sick and tired of having to constantly fight for budget for the analytics teams, or you’ve been unable to accurately or measurably show the return on investment in your analytics tools, this spreadsheet is going to be a godsend for you. As you get to the other side of your analysis, your job as an analyst is to come back into this spreadsheet and document the recommended actions that resulted from your analysis. Document whether the action was taken, and if it was taken, what date it was taken on. And then you can go forward and measure the impact of those changes. This could be in dollars, but it doesn’t have to be; it could also be increases to other important KPIs. Wherever possible, though, I find it important to document it in dollars and cents. These are going to be your receipts, the ones you can use to say: this is the value that we as an analytics team have added to the business.
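A minimal sketch of what one row of that closing-the-loop spreadsheet might look like as a record; the fields mirror what’s described above, and the example values are hypothetical:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Receipt:
    """One closing-the-loop row: what we recommended, whether it happened, its worth."""
    analysis: str
    recommended_action: str
    action_taken: bool = False
    date_taken: Optional[date] = None
    measured_impact: str = ""  # in dollars wherever possible

receipts = [
    Receipt(
        analysis="Type A mobile image slowdown",
        recommended_action="Re-optimize product type A images for mobile",
        action_taken=True,
        date_taken=date(2024, 6, 15),      # hypothetical date
        measured_impact="+$120k/quarter",  # hypothetical figure
    ),
]
acted_on = sum(r.action_taken for r in receipts)
print(f"{acted_on}/{len(receipts)} recommendations acted on")
```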

So this is the whole framework on one slide. It’s very simple. We’ve got your dashboards, which are your binoculars. They help you spot the trends, measure progress and flag when something’s off. But they’re not going to tell you why. That’s where narrative reports are going to come in.

These are your magnifying glass. They add context, they answer simple questions, and they highlight where deeper analysis is needed. And then finally, you’ve got your microscope. This is where the real insights come from, where we dig into the data, validate hypotheses, and deliver the business recommendations that are going to move the needle, like I said. And when these three layers work together, insights don’t happen by accident; they happen by design. So, my final takeaways.

First, stop expecting dashboards to deliver insights. Dashboards are not insight engines, they are binoculars. They’re great for scanning the horizon and spotting changes, but they’re incapable of explaining why things are happening. So instead, redefine your dashboards as performance scoreboards. Use them to monitor what’s happening, not to explain why it’s happening.

Number two, insights require both patterns or anomalies and context. An insight is the moment when a meaningful anomaly or pattern meets the right context, and together they shift your understanding in a way that leads to action. This is why no static report or dashboard can reliably generate insights.

Number three: because insights require these patterns, anomalies, and context, if you really want to find more insights, you need to create a system that surfaces anomalies that are relevant to your business (this is gonna be your dashboards), then adds the necessary context through your narrative reports, and then refines and transforms those anomalies into insights via analysis and hypothesis validation.

Then, number four, set up an intake and prioritization framework. You won’t be able to chase down every spike or anomaly, of course. You need a system that is gonna gather, categorize, and prioritize these hypotheses into ones that you can take one by one. I am a big fan, like I said, of using a centralized repository, something like Google Sheets or Airtable, to make sure you’re tracking these ideas, assigning owners, and, most importantly, closing the loop, which gets me to number five. An insight isn’t a valuable insight until it changes something. So if you’re not measuring the results of your recommendations, you’re leaving that value on the table. And these, like I said, are going to be your ROI receipts that you can take to your boss and put on your resume.

All right, thank you so much for listening to this talk. Like I said, I’m super passionate about this topic, so I’d be more than happy to answer any of the questions that you might have.

Thank you so much, Brad. Wow, all of those were great insights and valuable tips on how we can turn data into action. All right, let’s keep the conversation moving along. Send in your questions for Brad, and we’ll get through as many as we can.

I think we’ve already got a handful coming in here. Okay, we’ll get started with this first one. What’s the most common error you see when teams set out to build a dashboard? Yeah, thanks for the question. That’s a great one. You know, I think what it really comes down to is the overwhelming aspect of the dashboards. People really want to build dashboards so that people can see data. That’s the whole point of having a dashboard. People are excited about having data. They’re excited about being able to answer the questions that are important to their business. So why don’t we put it on a dashboard so that we can all see it? And then there start to come all these different questions. Well, what about this view? What about that view? Can we compare this dimension? Can we break it down by that dimension? And there starts to be this level of overwhelm for these dashboards, just trying to answer too many questions at once. I think that’s probably the biggest issue, number one. Number two really comes down to those targets. Like I mentioned, dashboards are built to be these scoreboards. They’re meant to give everybody a really quick idea of how well we are performing. If you can’t look at a dashboard and know the answer to that question right away, is this number good? Is this number bad? Then we’re really not doing our job when we build this dashboard. So one of the things that’s most important when it comes to building those dashboards is those targets. Did we do what we set out to do? And that is so important. I find that a lot of people will skip that step, whether that be because “I don’t really feel comfortable with predicting the future” or “we’ve never done this thing before, so we don’t know exactly what the number should be.” There are plenty of excuses not to come up with targets, but having some level of a target beforehand is gonna be so important when you get to the end and you’re measuring things and you’re looking at that scoreboard and saying, well, did we accomplish it? Did we do what we wanted? And that’s gonna be your step one towards getting to those insights. I think that’s a great point. Kind of holding yourself accountable, right? Having, here’s the target, this is what we’re actually aiming towards so you can track to it, I think is so important.

So here’s a question. I know it depends on company toxicity, a very good point, but any tips on turning insight into actual action, meaning navigating politics? Please, Brad, tell us about politics. Yeah, man, I love to talk about politics. No, this is a really solid question because it’s something I find I run into quite a bit. You will find some level of an insight, something that you find to be really important. I’ll just do a quick anecdote here. I remember I had a client, a resort company in the South Pacific with two resorts on two different islands. And what they were doing, unfortunately, was sending Google Ads clicks for one resort to the landing page and the booking page for the other resort. So people were getting to the very end of the booking page, hitting the “I’m ready, I’m gonna book this” button, and then they were really upset and had to go back and cancel it, because they meant to book for the other island.

And, you know, that was an insight that I thought was quite important: we should make this change to the client’s website. Sounds important, right? But that insight did not get heard. It did not get moved on. The thing did not change even several weeks later. And so navigating the politics is difficult, but the thing that I would say has helped me be the most successful there is you really have to understand your stakeholders really well and understand what makes them tick. And the key that I have found is to take your insight and not just give it to them the way I did at that time, which was: this is happening and it’s bad. The next level is: this is happening, it’s bad, and it’s impacting conversion by this much. And then go one level deeper and say: this is happening, it’s bad, people are upset, it’s impacting conversion by this much, and therefore it’s impacting our bottom line revenue by this much. People talk and think in dollars and cents. And so if you can take your insight and make that connection to what matters most to them, most of the time being dollars and cents, I find that will get people to move more often. And if they don’t move, then it will at least start the conversation that will get us there. Maybe they didn’t take the action because they’re not in charge of it and don’t have the ability to make the change. But if you talk in the language that they understand and that they care about for their job, then they’ll go find the person who can make that change. And you can start to have that conversation.

Perfect. Yeah, no, I think that’s super good insight there. I want to follow up with a question that someone else asked that I think kind of goes off of what you’re saying. First of all, they said amazing presentation. I agree. Thank you.

Do you have any recommendations to increase stakeholder engagement? So, you kind of talked generally there, but I’d love to get your thoughts on how you get people who are pretty high up the chain to engage with it, as well as actually use it to make some changes. How do you get them more involved? When we say engagement, is that engagement with dashboards or engagement with the reports? Yes, like how do you get them to actually use the dashboards that you’re spending all this time creating, right? I’d love to get your thoughts on that. Yeah. I think it can be overwhelming to try and build a lot of dashboards at once, or try and solve all of the problems at once. So something that I might recommend is start with one really important KPI. Start with something that, again, speaks the language that they understand, the language that is going to make them make decisions. So if it’s one very specific KPI in a part of the customer journey that’s very important to that stakeholder, then start there, start with that scoreboard, and start to build out the targets. What are we trying to accomplish here? What is this person trying to accomplish with their role? If we can understand that, and we can show how well we are doing those things, then that’s great. We’ve got that scoreboard. And then just walk yourself through the process I put in this presentation. Start with that scoreboard, start to understand whether we are succeeding or failing, start to build that narrative, and start to find those insights based off of the narrative that you’re building from the narrative reports and things like that. Because if you do that, if you find that value, people are gonna start to care, and they’re gonna have a hard time ignoring it, I would say.

Okay, so this person has said: I see your final recommendation crossing several analytics teams. Which teams are responsible for each of the three phases, and do you have a recommendation on minimum required staffing for each, or where to invest with limited resources? That’s a full question. A meaty one. Yeah, yeah. And unfortunately, I’m probably gonna come back with an “it depends” on that one, because I think it will change depending on the size of the company and the number of resources that they have.

A lot of times, honestly, the vertical that we’re talking about, that can change as well. So it will depend, but let’s see.

The other question was where to start with them. Remind me what the crux of the question was, Amber.

Where do you start by investing with these limited resources? So across the three different phases, what’s the recommendation? Yeah, yeah, yeah. So again, I would start small, start with one KPI. And what I like to do is build everything around the customer journey. I think the customer is going to be the most important part, and the more that you can do to center around that, the better. So let’s take our most important and influential audience and the most important customer journey that they’re going on, and let’s identify those moments across that customer journey that are the most pivotal.

What is the step in the customer journey where, you know, it’s make or break? I’d start there, I’d start with that KPI, and I’d start to build one dashboard on that and see if you can get consensus around those KPIs. I’ve had a client in the past where we did something like that. You know, it’s a lot of teams trying to wrangle a lot of people and get everybody on the same page. And we started with just the most basic KPIs that they had, for example, these dashboards that they were manually cobbling together in Excel files. And we just said, you know what, why don’t we find a way to build these targets in Adobe Analytics? And we put that together. And having that centralized location where everybody could show up, everybody could look at it and speak the same language, became extremely valuable. And we really started to see a lot of benefits come from just having that scoreboard, from everybody rowing in the same direction. Everybody knew that when they did something, they could go to that dashboard and see how it performed later on.

And we started to build from there. More people started to say, wow, this is really great. I would really love to see this for my department. I’d really love to see this for this lower part of the funnel or this upper part of the funnel. And we started to grow from there and build out these practices. And again, it started small, but as it grew, we started to get more people and more KPIs on board. So don’t try to do it all at once. Don’t try to eat the whole elephant. Pick one small thing at a time and move on from there.

You gotta eat that elephant one bite at a time. Yeah, one bite at a time. Great point.

So here’s a question around inputs: the input is so important. Garbage in, garbage out. Yes. Very true. Any advice on staying on top of data quality? Yeah. I only had like one small slide on this, but data quality is so important and so pivotal. Because if you don’t have good data quality, literally nothing else that I just said in this entire presentation matters. So you really have to start there. If you feel like your company is not quite at the maturity level where you’re making these dashboards that are getting you to these insights, or you don’t really trust your data, start there.

Get strong with your data quality and the trust in the data. Now, one of the things that I like to do there is set up KPIs around the quality of that data. You know, everybody’s seen the sign on the wall in a cartoon: it’s been zero days since the last incident. Have something like that for your team: it’s been zero days since the last data quality incident. How long can we go between issues with data quality? Or maybe, what’s the percentage uptime of our most important KPIs over a certain period of time? Start to establish those KPIs and get people to rally around them.
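Those two data-quality KPIs are easy to compute once incidents are logged; here’s a small sketch, with made-up dates:

```python
from datetime import date

def days_since_last_incident(incidents: list[date], today: date) -> int:
    """The cartoon-sign counter: 'it has been N days since the last incident'.
    Assumes at least one incident has been logged."""
    return (today - max(incidents)).days

def kpi_uptime_pct(days_healthy: int, days_total: int) -> float:
    """Percentage of days a key metric was collected and trusted."""
    return 100 * days_healthy / days_total

print(days_since_last_incident([date(2024, 5, 2)], date(2024, 6, 1)))  # 30
print(f"{kpi_uptime_pct(88, 90):.1f}% uptime")  # 97.8% uptime
```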

And it can’t be said enough: things like alerts, you know, let anomaly detection do its job there, let AI do its job there. And then, I’m just old school, I love the health dashboard. The Adobe Analytics health dashboard gives me a bird’s eye view of everything that’s going on. Check that regularly. But you really just have to start establishing best practices, have a high quality data layer, and get those things in shape. And then you can start to move on to the more fun stuff.

Okay, Brad, we are close to being done here. Any final thoughts that you’d like to share with the group? Oh man, you know, this is just something that I’m super passionate about. I just love insights, the idea of them, to the point where it can get a little bit nails-on-a-chalkboard for me if I see people using that word too much, like “I found these insights,” when it’s really just page views. Okay, but that’s not really helping me get anywhere. The entire purpose of capturing all of this data is to do something with it. And that’s really all that we want to emphasize here. That interfacing between the context and those connections, contradictions, and curiosities, that’s where a true insight comes from. And the most important thing, once you’ve found those insights, is to close that loop.

Everybody listening on this call, I’m sure, is probably tired of having to justify the value that their analytics team brings. If you want an easier way to measure your value, follow through with this process and you will have those receipts to come back and say: this is how much value we added to the business. And it’s really fun to get to that point.

Perfect. Okay. We are at time. Brad, you’ve been amazing. Thank you so much for spending time kind of walking us through that and for having a fun Q&A with me. It’s always great to chat. So I appreciate it. Thanks for having me.

Transforming Data into Meaningful Action

Unlocking the power of data requires more than just dashboards—it demands a system that turns observations into actionable insights.

  • Insight Defined: An insight is a shift in understanding that reveals critical patterns or anomalies and, when applied in context, drives meaningful change.
  • Three-Layer System: Dashboards (binoculars) spot trends, narrative reports (magnifying glass) add context, and deep analysis (microscope) delivers recommendations.
  • Key Ingredients: Patterns, anomalies, and context must intersect for true insights to emerge.
  • Actionable Outcomes: Prioritization frameworks and closing the loop on business impact ensure insights lead to measurable improvements.

This approach helps users move from static data to dynamic, business-changing decisions—making analytics truly valuable.

Defining True Insights

  • Insights are not just data points—they require both a pattern/anomaly and the right context.

  • Four key triggers for insights:

    • Connections: Uncovering relationships between previously unrelated data.
    • Contradictions: Spotting inconsistencies between expectations and reality.
    • Coincidences: Identifying unexpected alignments or correlations.
    • Curiosities: Investigating anomalies that defy expectations.
  • Without context, even the most interesting data is just noise.

  • Actionable insights drive change and must be measurable to demonstrate value.
