Building a Data-Driven Culture

Join Adobe Analytics Champion Gitai Ben-Ammi as he discusses what a data-driven culture looks like and shares strategies for using Adobe Analytics to build one.

Transcript

Hello, everyone. I hope you’re all having a wonderful day. I want to welcome you all to our Adobe Analytics webinar, Building a Data-Driven Culture. Today, you’ll hear from one of our expert Adobe Analytics Champions, Gitai Ben-Ammi, as he discusses what a data-driven culture looks like and shares strategies for how you can use Adobe Analytics to build one at your organization. My name is Alyssa Magruder, and I’m a Customer Marketing Manager here at Adobe, and I’ll be your host today.

Before we jump into our presentation, I want to go over a bit of housekeeping. While I do that, I invite you to say hello to one another in the chat. To start, this webinar is being recorded, and after the webinar, we’ll be sending out a copy of the recording, which will also include the slides. There are also a few things on your screen that I’d like to point out. First, the console that you’re seeing in front of you is completely customizable, so you can resize or minimize any of the widgets on your screen. Feel free to make the slides or the video larger or smaller based on your preference. Second, we’ve shared several resources related to today’s webinar, and you can find those in the Related Content panel on the top right side of your screen. Throughout our session, if you have any questions for our presenter, simply type your question into the Ask the Presenters box on the bottom center of your screen. We’re going to do our best to get to all of the questions at the end of the session, but if we don’t, we’ll follow up with a discussion thread on the Adobe Analytics community, which is linked in the Related Content section. If you’d like to talk to other attendees, you can use the attendee chat panel. This is a great way to hear from others and share your own experiences during our session. If you haven’t already, open that up and drop a quick hello. And finally, at the bottom, you’re going to find your webinar console. Here you’ll find additional information about our speaker and a survey. Please be sure to take the survey before you leave. That’s how we pick topics for future webinars as well as presenters for other sessions. There’s also a reaction button, which can be used to give our presenter a bit of love. Click on the emoji button and feel free to try that out now.

Now onto our agenda. Today we’re going to be discussing the importance of building a data-driven culture. How do you know if your company has one, and what are some strategies you can use to build one and ensure your teams are empowered through data? As always, the session is live and we’re going to be taking questions. So please type them into the Ask the Presenters box if you want a question answered, or type into the chat to hear from others who are attending. With that, I will pass it over to Gitai to introduce himself.

Hi. Hello. I am Gitai Ben-Ammi. I am a principal digital strategist for Concentrix. So data is my lifeblood, and I use data all the time in order to improve customer experiences. I’ve been doing this for a bit over a decade. And this is a picture of my beautiful little corgi, named Perseus because my kid was really, really scared of Medusa after I showed him the 1981 Clash of the Titans movie.

So I want to talk about the difference between a data-obsessed culture and a data-driven culture. Data-obsessed cultures talk about data constantly. They love the idea of data. They’re always asking for data, but then they don’t go where the data says they need to go. Let’s look at some of these differences.
If you’re in a data-obsessed organization, there will be constant requests for data, but they’re repetitive. They’re the same requests. They’re the same dashboards. Whereas in a data-driven organization, all of the basic reporting has already been built and is readily accessible. In a data-obsessed organization, there’s A/B testing. They talk about A/B testing. They say, we’re going to test everything. But if the test results are not what they want, something gets pushed to production anyway. Whereas in a data-driven organization, A/B testing is the guide for what goes out. In a data-obsessed organization, there’s only a handful of people who can actually use Adobe Analytics. I refer to them as the data monks. They’re cloistered away, getting data requests, sending out data, and nobody else knows the data very well. Whereas in a data-driven organization, most people who need the data are able to access the tool and use it to do at least basic analysis. And finally, in a data-obsessed organization, leadership only cares about final KPIs. But in a data-driven organization, when leadership sees those final KPIs, whether they’re revenue or engagement or whatever, they ask, what’s the data that’s driving this? And where is your data that shows that that’s the case?

We need to get people moved from being data-obsessed to being data-driven. Why? Because data-obsessed organizations create so much waste. There is waste of effort on the part of the analyst, on the part of marketing, product, and everyone else. There is waste of resources as development is driven forward on things that the data shows aren’t working. And worst of all, there is waste of time. We can always get more of just about anything. We can never get more time. So every development cycle that could be spent productively based on where the data is leading us is wasted, and we don’t want that. We want to be a data-driven organization that goes where the data is leading us and comes to the best possible conclusions.

So let’s talk about what we think data-driven organizations look like. When a lot of us in analytics start out, we think this is how it’s going to work. Marketing or product requests the data. We give them this beautiful dashboard, this thoughtful analysis with clear, obvious conclusions. We’re using all of our data storytelling skills, and it’s right there. And then, of course, marketing or product changes course based on the data that we gave them. That’s what we really think is going to happen. But too many of us end up finding this instead: the request comes in, and we provide all of that. And again, the results are really obvious from the data. We can see it. We’ve provided all the evidence that we need. And then there’s a decision point on the part of the requester: does it match what they already wanted to do? If yes, they say, look, the data shows I’m right. Let’s do what I was going to do anyway. And if it doesn’t, they go, I don’t know, that was a weird month. Are we sure that’s right? Let’s just go ahead and do what I want to do anyway.

And that leads to what I call the circle of data despair. After a while, you start despairing, because you really do know what’s right, because you’ve got the data, and you want to check out a little bit. I don’t blame you. I’ve been there. I think we’ve all been there, unless you happen to be at the best organization ever. But then you see a chance for redemption, a chance to show what data can do.
You see the disaster coming. It’s right there. The data is ready for you. You know it’s coming. And so you run and you talk to somebody and you say, look, look, the disaster is coming. And then nothing happens. They ignore you. And the disaster happens. And then, most galling of all, someone asks loudly, how did no one see this coming? And the circle repeats, because you saw it coming. I have this as a circle only because the slide is two-dimensional. This is really a spiral, as you keep sinking further and further down. This is not what we want. So let’s look at how the data really flows in a data-obsessed organization.

So we have the analyst in the middle. We’ve got stakeholders, marketing, product. Everybody’s sending data requests in and getting data out. The thing I really want you to notice about this slide is the lack of connection between the decision makers, and that data is only flowing in one direction. This is what I’m talking about when I talk about the data monk. When you’re in this kind of organization, only you know the data. And that’s a problem. There are problems that arise when you are the only one who understands how the data is created, what the data means, and how to get the data. And that issue really revolves around two things. The first one is trust in the data. And the second one is data empowerment. I’m going to talk a little bit more about both of those later. So how does this happen? As I said, no one knows the data as well as you. And leadership is not really driving the data-driven culture. I’m going to talk about the second one a bit later. For right now, I’m going to talk a lot about the first one.

So here’s a question for you. What does this look like to you, someone who knows Adobe Analytics? To me, this blank dashboard is exciting. I’m getting ready to use this amazing tool with flexibility, customizability, all of these wonderful, wonderful things. And I’m going on an adventure. I am getting ready to uncover mysteries, to find one clue after another, to dig in deep to the data and find clues and answers. And eventually, I am going to get to the root cause. This is me going out to slay dragons. And I am so excited every time I open one of these up.

Do you know what this looks like to your stakeholders? This is terrifying to them. The same flexibility, the same customizability, the same ability to use this tool in so many ways with such depth has a steep learning curve. And there’s no denying that. Once you’ve mastered it, this is the most amazing data visualization tool out there, bar none. And I’ve tried a lot of them. But for them, this is not the tool with which you slay a dragon. This is the dragon that they need to slay.

So let’s talk about a couple of reasons why that is. Up there on the slide, I use the phrase user enablement. Let’s talk about that. How many people in your organization actually have access to the tool? But more importantly, how many of them are actually logging in? You can see that data. How many people have logged in recently, in the last month, two weeks, week? How many have logged in today that should have? If you’re in a data-obsessed organization, I’m willing to bet that the answer is not nearly as many as should have. That’s why we have to stop talking about user enablement and start talking about user empowerment.

Some more data-driven culture questions. How often do you offer training? How long is it? How do you onboard your new hires? And most importantly of all, how are you supporting them after training is done? I think that training has to be long and in-depth. Twelve to sixteen hours is what I do. I have to make sure that by the end of that training, everybody in my training can put together a basic dashboard all on their own. They need to be able to create some basic segments, do calculated metrics, really basic ones, nothing crazy, and be able to do basic analysis. Why do I want them to have that ability? Because it takes the mystery out of it. Rather than this just being some black box that data comes out of and they don’t know how, they’re able to access it themselves.
That increases their investment in it, and that increases their ability to trust it, because they know where it comes from. It needs to be really hands-on, with lots of exercises. But also, let’s talk about what happens afterwards. You can’t just say, go out there, you have the tools, because what will happen is they’ll get frustrated, and then they’ll give up. Most of them. There’s always that one product manager or marketing manager you really vibe with, and I love those guys.

So create a space. Use Teams, Google Spaces, whatever you’ve got. Make sure they can chat with each other, and fill it with files, recordings of your training, and links to the important stuff. And then every day after training, for two weeks, you put an exercise into that chat room. Start out real basic. Ask how many visitors came during a particular month, something really simple like that, and then let them figure it out with each other. They talk to each other. They support each other. They build each other up. It is amazing to watch as they up their skills on their own through each other and build community. It is so, so helpful. You are the resource of last resort. At the end of each day, they come into your office hours, which you hold every day for those two weeks, and they show you how they solved the problem. This is really great because, for one thing, they see there are lots of ways to solve the same problem, and that’s so helpful. And best of all, they make mistakes, and I love it. When they make a mistake, and I’m trying, I’ve got my little note here that says to slow down when I’m talking because I get really excited sometimes and I talk really fast. But when they make a mistake in one of these office hours, I lose it. I go, oh, my God, that’s a great mistake. That’s a wonderful mistake. I’m so happy you made that mistake. Let me show you what that means. Let me show you how that came about, and let me show you all the things you can learn from that mistake. And I just kind of geek out really hardcore. And it’s great for another reason, because they realize that it’s a supportive learning environment.

And this is when you build your community of practice. They’ll stay in there. They’ll continue asking each other questions. They will continue to build each other up. And suddenly you don’t have one or two people you vibe with because they care about the data. You have a whole group of people that cares about the data, uses the data, and answers their own basic questions. And that is so, so powerful. Suddenly, your flow of data looks like this. The data is flowing between them, among them, and back and forth to you. So data should be coming to you. They look at a dashboard and go, hey, I noticed this thing. I need some answers about that. They’re asking questions about data that they have retrieved themselves, and they are now empowered. This is a situation where, rather than getting the really basic requests, you are going to start getting the good stuff. And I don’t know about you, but if I have to make a 50th really basic SEO dashboard, I’m going to go crazy. If instead they say, hey, something insane happened that I can’t understand, and it’s going to take you two hours of in-depth investigation, yes, give me that request. Give me that request. That’s the one I’m going to be happy with.
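As an aside, that first starter exercise, pulling unique visitors for a month, doesn’t have to go through the Workspace UI. Below is a minimal sketch of the same question asked through the Adobe Analytics 2.0 Reporting API, for the analysts in your community of practice who like to script their checks. The company ID, report suite ID, credentials, and date range are all placeholders, and the exact response handling is an assumption to verify against your own environment.

```python
# Minimal sketch: answer "how many unique visitors came in March?" via the
# Adobe Analytics 2.0 Reporting API instead of the Workspace UI.
# COMPANY_ID, RSID, ACCESS_TOKEN, and API_KEY are placeholders; verify the
# response shape against your own environment before relying on it.
import requests

COMPANY_ID = "examplecompany"    # placeholder global company ID
RSID = "examplersid"             # placeholder report suite ID
ACCESS_TOKEN = "<bearer-token>"  # placeholder OAuth access token
API_KEY = "<client-id>"          # placeholder client ID

def monthly_unique_visitors(start_iso: str, end_iso: str) -> int:
    """Return unique visitors for the date range [start_iso, end_iso)."""
    response = requests.post(
        f"https://analytics.adobe.io/api/{COMPANY_ID}/reports",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "x-api-key": API_KEY,
            "x-proxy-global-company-id": COMPANY_ID,
            "Content-Type": "application/json",
        },
        json={
            "rsid": RSID,
            # One date-range filter covering the month in question.
            "globalFilters": [
                {"type": "dateRange", "dateRange": f"{start_iso}/{end_iso}"}
            ],
            # One metric column: unique visitors.
            "metricContainer": {
                "metrics": [{"columnId": "0", "id": "metrics/visitors"}]
            },
            "dimension": "variables/daterangemonth",
        },
        timeout=30,
    )
    response.raise_for_status()
    # summaryData.totals lines up with the metric columns requested above.
    return int(response.json()["summaryData"]["totals"][0])

print(monthly_unique_visitors("2024-03-01T00:00:00.000", "2024-04-01T00:00:00.000"))
```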
So let’s keep talking about user empowerment. How often do you offer intermediate or advanced training to level up your users? This is just some stuff I think is intermediate or advanced. I’m not going to go over it. But the reason I think this is important is, one, in and of themselves, these skills are useful. You want people to be able to do these things because it’s helpful. Creating some of this stuff is really, really going to be beneficial for their own investigations. But since we’re talking about culture, what it says is that this is a culture where data is important. My organization cares enough about data that it is willing to put aside resources and time to get us to level up our skills. That sets a tone throughout the organization that will drive adoption and drive increased skills. It really is about setting an expectation that data matters enough that we will give you time to get better at working with it.

So let’s talk about your execs. Do they have access? Do they use it? They should. For me, a worst-case scenario with data for executives is as follows. Data is exported from Adobe Analytics into Excel. A chart is made in Excel, and that chart is copied to a PowerPoint that your executives see once a month. That is a bad situation for a number of reasons. First off, it’s a snapshot, a point-in-time view. They get to look at it at the end of the month, well after there has been any opportunity to take action on it. This means that instead of taking five minutes to look at a dashboard every day or every couple of days, and seeing when something is going awry so they can start to intervene, it’s only after it’s too late that they actually learn about it. The other thing is that the data is completely abstracted. They don’t know where it comes from. They don’t have any relation to it. And that same issue of trust I was talking about before is even deeper, because executives are busy. Now, I know executives are far too busy, with far too many demands on their time, to go to that 12 to 16 hours of training. If you’ve got the kind of exec who will, you’ve already got it made. You don’t have to worry about anything else. But you can get them in there for a couple of hours, one to two hours, and teach them how to read a dashboard, apply some filters, and maybe do a basic drill-down and change some date ranges. That’s all you want them to do. And they will do it. In my experience, they love to actually have access in real time, because they want to feel empowered too. It sounds funny that the executives, the people in charge, need to feel empowered. When it comes to data, they do.

And they will use the tool, but not if it looks like this. This dashboard is great for me. If I look at this, and mine are even messier than this, all my pattern recognition skills are going to pop up. I’m going to start going in. I’m going to have a great time with this. I’m going to look at this and start digging in. I’m going to make this even messier. But an exec will not. In fact, most normal people will not. Most of the time, their eyes will cross, and they’ll start hallucinating, because there’s just too much data on here. So make it like this: executive summary dashboards. I love these new KPI summary visualizations. These are so nice. It’s such a happy new feature for me. But yeah, top three KPIs, a delta, and some trended data. Then underneath, put a little more detail, something where they can actually do a nice little drill-down.
And then they can really get an idea of how to dig into this, and they can ask a really good question. So if on the 13th of the month they pop in and ask, why did organic go down 12% this month? That’s a great question for them to ask. And they can go to the SEO team and start asking it, and everyone can investigate while there’s still time to save the month. That’s where you want them to be. And at that point, we have this. All I did was add executives in there. But yes, it’s a really good thing when the executives are sending you questions about the data that they pulled. That’s a great thing. It depends on the executive. I’ve had some who were inquisitors, and that’s always fun too, but I’m prepared. But yes, now you have data flowing among all of your stakeholders, to and from you. And the really cool thing is, at this point, when the data starts going up to executives and between everybody, everybody’s really involved in it, and everybody’s really aware. It’s very easy for data to get siloed in one place, which is why we have to ask some more data culture questions.

Do you have an inventory of source of truth dashboards? Were these all agreed upon? And does everyone have access? These are really, really key. And getting these agreed upon, particularly by executives, is super important. When you have data flowing to everybody, and everybody can make dashboards, that’s cool. That’s great. We want that. But because everyone can add their own little perspective, it can kind of be like when everybody in the room is talking at once. Having a set of source of truth dashboards that are agreed upon and have been approved by your executives, which gives them that stamp of approval, means that everybody’s reading from the same page. And while they may come to lots of different conclusions, it’s all going to be from the same data, and we can all be talking about the same thing. It’s really important that those are out there. That prevents things from being really siloed and makes sure everyone has access to the same sources of data.

Now I’m going to talk a lot about test and learn programs, because test and learn is where it really comes down to the most important stuff. First off, do you have a test and learn program? And I’m not asking, do you A/B test? I’m asking, do you have a program? If this is somebody’s side-of-the-desk thing that they do when someone asks them, that’s not a program. Is there structure around it? Are there intake processes? Are there processes for rejecting, requesting, updating, exporting, all that stuff? Do you have an actual program? What’s your process? How are you evaluating these things for feasibility and priority? Prioritization is so huge.

Ask yourself these questions. And some of these can be double-edged swords. How many tests are requested? If your answer is none, it could be that you just don’t have a program that’s valued within your organization, or it could be that your process for requesting them is too hard. Take a look at that. How many are rejected? If the answer is none, is it that everybody is asking for exactly the perfect test and there’s capacity for all of them? Or are you just not getting enough requests? And this is a big one for me too. How many changes that are not required by law, regulation, or compliance are going out without being tested first? I will sound like a fundamentalist on this, but I believe the answer should be zero percent. I think that any substantial change going to your site should be tested prior to being released to production. Why? Because I have been humbled way too many times by test results that I thought were slam dunks. This is going to be the winner. There is not a bit of doubt in my mind. And oh my God, how did it perform that poorly? We don’t know. We have a general inability to predict the future. It’s just one of those things about being human. And so that’s why I think everything needs to be tested, because I have seen some things that have made my hair curl.

The next thing I want to ask is, how are you drilling down deeper into your test results for deeper insights? I’m going to go over a little case study on this. This is a real thing that happened, from a real test that I carried out at one of the best data-driven cultures I’ve ever worked at. This is a company that has a large online retail store, and they sell everything from five-dollar V-Bucks cards all the way to high-end hardware. This was going to be a basic design test. All we were doing was changing the checkout process to be collapsed drawers instead of all expanded. And we thought this would increase conversion, because our customers would easily know where in the checkout process they were. Straightforward. So, speaking of results that surprised us, we saw a massive decrease in revenue. Huge. Conversion dropped, revenue tanked, and it was insane. So, great, we get a basic insight: collapsed drawers reduce conversion. In the most basic A/B testing program, we’re going to know that this is a design we don’t want to put out there. Great. But are you going beyond that? I hope so. Let’s go to the next level. So we looked into why conversions decreased. We looked at some heat maps and found that the view cart button that was beneath the checkout funnel had been drawn up above the fold, whereas previously it had been below the fold. The collapsed drawers pulled it up, and people started clicking on it like crazy. But it was kind of weird. They’d always had view cart options. In the right rail, above the fold, up nice and high, there was always a view cart link.
And of course, in the menu in the header, there’s always been a view cart link. So why on earth were they clicking more on that one just because it was a little bit more visible to them? They were primed to abandon the cart. There really was a mindset in them that they were ready to jump ship as quickly as possible. But why? Why did having just that slightest prompt allow them to doubt their purchase? We dug into the data a little bit, because we could see the size of the carts and the items that were in the carts of people who abandoned. We found that it was not concentrated among the people with their $5 V-Bucks carts. It was very much concentrated on the high-end hardware, the really expensive items. And we started figuring out that they were not confident in their purchase. This is a root cause. The simple answer would be, abandon that design. Great, we’re not going to do that design. That’s really easy. But what are we doing about the root cause of the problem? How are we showing the value and the benefits of the purchase? How are we letting the customer know that their high-end hardware purchases are actually really good purchases, so they can purchase with confidence? That’s what I’m talking about when I talk about making sure that you have the deep insights. This is one example. There are a lot of them. And sometimes it just depends on your audience. But each one of these tests that you run gives you insights into your audience if you look a little bit deeper. And when you start looking a little bit deeper, you can start understanding your audience way, way better. Those insights are gold, and they can help drive a design of your site that will be perfect for your customers and will really help conversion. And also, it’s just always a good thing to understand your customers, because then you can serve them better.

So you’ve got those cool insights. Well, what are you doing to socialize them? And more importantly, how are you institutionalizing them? Now, socializing, I think we all know: you have your regular meeting where you get together and you say, these are the tests we ran, and here are the results. Those happen every couple of weeks or every month. Great. You need to have those. Those are really valuable. Those really do help propagate the insights throughout the entire organization, and they will help prevent repeats and things like that.

But that’s not really institutionalizing them.

Not all of the people who need to be at that meeting are going to be there. And if not all the people are there, there’s still a chance that some of those insights aren’t going to go where they need to go. And there’s also a pretty good chance that those insights are not going to end up baked into your site. So we need to institutionalize them.

Here’s where I talk about a quarterly insights conference. You’ve got all those results, all those insights, those deep insights. You know your users better after a quarter of running all these great tests. You have all this information that needs to go to the people who actually design the site. Get your product team there, get your designers there, get your copy people there, and go over the results and talk about the deep insights. Show the data that backs up your hypotheses, and then ask them for alternative hypotheses. You’re not a designer, typically. You’re an analyst. But the designers, this is their job. This is what they do. They know better than anybody the impacts of design on users. So hopefully they either agree with your hypothesis, or they have a better one, and then they back it up with data, together with you. At that point, revise the whole readout and give them credit. I think it’s so important to make sure that credit is given to those designers, those copywriters, those product owners. For one, I think people just deserve credit. But two, it also shows the investment in data across the entire organization, shows where it came from, and shows that different parts of the organization are using the data to come to these good conclusions. It’s really, really powerful.

Then they have a task. And that task is to update the design standards.

If you have these insights and you’ve shown them with data, that means that the design of the site needs to change. They are going to update their web or app design standards to reflect what you’ve got. And that is going to go out and be baked into the site. Yes, I said test everything. But once you’ve tested and you’ve got those design conclusions baked in, build them into your site and make sure that that is where you start from. It’s not something we test towards. It’s something we’ve already tested. And now that is the baseline for the design of the site.

Now, finally, we get to our data-driven organization. This is what it looks like. We have data flowing, marketing, other stakeholders, executives, and product, all flowing to and from our analysts. They are finding things on their own. They’re checking up on their dashboards. They’re looking at the source of truth dashboards. They’re finding anomalies. They’re asking you, the analyst, questions. And then you, the analyst, are sending data back to answer the easy questions. But out of all of this analysis, which is now collaborative and cooperative, you are getting hypotheses. You are saying to the test and learn team, I think these are great things to test. Your test and learn team, which is running a well-designed, thoughtful, rigorous program, is now testing those things. And out of all of those tests come beautiful insights, deep insights, understanding of your users and of the ways in which the site can be made better to improve their experience. Because yes, we’re in this business because it is a business, but we can’t do it unless we care about our customers and make the experience better for them.

Those insights go to the design team. The design team then brings out new standards that go to marketing and product, that are baked into the design of the site, that are baked into their campaigns, and that are going to have better conversion. And then, wonderfully, as those are baked in, as we gain new understanding, as those things propagate, new data comes out again. We get a new understanding of everything, going back and forth again in this virtuous circle of data, hypotheses, insights, and standards. At that point, your organization is truly data-driven. The design, the decisions, are all being driven by the data, being tested, being very rigorous, and this is where we want to get to.

So how do we get there? How do you build this? There’s one easy answer, and that’s buy-in from leadership. If leadership really cares about this and is willing to make the investment, this whole process becomes a lot easier. I’ve given you a nice little outline, and if this is coming from the top down, it takes a little bit of time, but you can definitely build it.

If you don’t have that, let’s talk about building it up from the grassroots. This is harder, and I won’t lie to you. I’m not going to pretend it’s not. But it is possible, and I’ve had to do it. So you have to go in and, as I say on the slide, be a helpful pest. Earlier I talked about that one product or marketing manager you really vibe with, that great person. You’ve got to start with them. They’re the ones who are already interested in data. Help them out. Use the data with them to make them look good. Use your data storytelling skills. Start putting that data out there. Make the slides for them, and really make it look odd when someone else is not coming with the data. And then, for those people who aren’t, offer to help them too. It’s a lot of work. I won’t lie. You’re going to do a lot extra, but it’s okay, because what you’re slowly doing is building up the expectation that data is going to be present. And as much as people like looking good, they hate looking bad. So when you see the disaster coming and you’ve got the data to prove it, if you can help someone stop something before it goes out and the disaster happens, that’s a really great thing. They will be grateful, and they will think twice before they put something out without asking you first. A little bit at a time, that builds up trust between you and the people who should be using the data, and starts building up that expectation. At a certain point, an exec starts looking around and going, that’s weird, that slide didn’t have any backing data. I wonder what’s up with that. And that’s one of the ways you can tell that the culture is starting to bubble up to those higher levels.

I also really think it’s important that you put dollar signs on test and learn. Again, at the best data-driven organization that I’ve ever worked for, that’s what we did. We had a little model, and we caveated it, because it wasn’t a perfect model. We showed how much revenue was added by the tests that succeeded, and we showed how much money was saved by the tests that failed. You put all that together. There was one quarter where it came to about $30 million. We caveated it, as I said. I think as long as you’re honest about the limitations of your model, it’s great. It’s also very useful for tracking the success of your program quarter to quarter. A really good sign, and it sounds funny to say it like this, is when an exec goes, well, you know, last quarter your program generated $28 million in revenue, and this quarter it only generated $20 million. You need to step it up. That’s a great sign, because instead of seeing test and learn and analytics as a cost center, they start viewing it as a revenue center. At that point, you know it’s really valued.

All right. So what are the signs you’ve succeeded? Well, as I said, any request that the analyst gets should be unique. It should be about something new, a product that’s just coming out rather than something that’s been around for a while, or it should be advanced analysis.
Again, they’re coming to you with the hard questions or the new questions, nothing basic and nothing that’s been asked before. Those source of truth dashboards, decision makers should be bugging you about them, not because they dislike them, but because they have found where they’re useful to them and where they’re not, and they need changes. Why is that important? Because it means that they’re using them and they rely on them. At that point, that’s a really good sign that says, hey, you did some good work, and they want you to improve on it. That’s a huge thing. That’s a huge thing.

A healthy backlog of tests. It’s everybody’s dream in test and learn to be perfectly resourced, so they never need another resource to do things, but you should always be at capacity and have a backlog ready to go. That says that everybody is taking it really seriously. Everybody cares about it. And when you have to really prioritize, that means that it is succeeding. It is seen as a vital and useful part of your organization that has to be used to drive the organization forward.

This next one, I love when this happens. I love when this happens. As analysts, as data people, nobody knows the data better than us. We’re the data people. It’s what we do. We know the data better than anybody, but we can’t know every product or campaign as well as the people who actually run it day to day. When one of those people who knows the product or campaign or page better than you corrects you, and uses the data to do it, oh my God, I want to jump for joy. When that happens, I’m like, yes, yes, they know the tool well enough to correct me. I don’t get mad. I don’t get embarrassed. I want to run over and hug those people, give them a high five and say, wow, you’re right. You’re 100% right. Thank you for telling me that. That’s amazing.

And this last one, this is the biggest possible one. The sunk cost fallacy, throwing good money after bad, has the strongest stranglehold on humans that I have ever seen. It is one of the hardest cognitive biases to break. When you have done everything you can to optimize, you have tested, you have focus grouped, you have used all the data, and this thing just isn’t working, and someone says, you know what, we’ve done what we can, kill it, let’s put those resources to something better, you have won. The greatest sign of a data-driven organization I can think of is when the sunk cost fallacy has been slain by data.

And finally, the CEO shakes your hand and tells the company it would fall apart without you. It’s a bit of a joke, but the reality is, I did work for an organization where I would be on the elevator, not every morning, but often, and the president of the company would step in. And he did know my name. And we would talk, and we’d talk about data. And what I would say was so interesting to him, and he would be so concerned about it, that he would always get off on the wrong floor with me, and we’d keep talking until he realized what had happened. And he’d have to say we’d talk more later, turn around, and get back on the elevator to go to his office. So I really do think that execs interacting with analysts is a key sign that your organization is data-driven. And that is what a data-driven culture looks like, how organizations get there, and how to build it. And I am looking forward to your questions.

Thank you, Gitai. That was amazing. And as you can see from the audience chat, everyone also agrees with me.
There were a lot of great learnings that you shared throughout this. As mentioned, we have a few questions that have come in from the audience, so we will take those now. If you haven’t had a chance to submit your question, you can do so using the Ask the Presenters panel. And if you have to leave early, please don’t forget to take our survey. It’s three questions, and it helps us select the best topics and presenters for the future. So for our first question from the audience, we have: how do you encourage data-driven thinking for product managers when most product managers in the organization are already the data-obsessed type of people?

So, yeah, that’s a hard one, because they already think they know the data well. I think that one of the ways to do that is really to work with them. You can go to them and say, fantastic, you’ve got this data, and I see how you’ve reached that particular conclusion. Let me show you a little bit of additional data that may add a little nuance to that conclusion. And then you really help them get into a testing situation. I think that most product managers, because they are data-obsessed, really do think that data is important. And so if you’ve got them in a situation where you say, oh, fantastic, let’s test that, that’s really where you get the opportunity to show them that maybe the data isn’t what they think it is. And here’s the thing that I think really drives that home and really makes them a lot more comfortable with where the data takes them: I get so excited when a test fails. Most people think about a test succeeding as, this is the result we want. Yeah, that’s the result we want, but when a test fails, there’s probably more learning you get out of it than out of one that succeeds. Where I get sad with testing is no statistical difference. If there’s no difference between test and control, I’m like, oh, man, we wasted all this time and all this effort, and there’s nothing that we got out of it. If you walk up to that product manager and go, I have fantastic news, the test failed, the variant failed. Look, it did 6% worse. Look at all these really cool insights we’re getting out of that. Isn’t that amazing? Now, they’re probably going to look at you like you’re crazy at first, but that’s kind of okay, because I think that data enthusiasm can be infectious. I will tell you, I can get people in a training to jump up and down and be excited about using Adobe Analytics pretty quickly, because just look at the smile on my face when I talk about failure. I project that onto people. Because when you say, this is great news, this failed, look at these deeper insights, let’s talk about how we can incorporate that into the next test, well, boom, you’ve brought them along. It’s not a situation where they can just release it, for one, because you’ve got the data that says they can’t. And two, you’re talking about next steps almost as a fait accompli. It’s like, hey, look, these are the next steps we’re going to take. Let’s talk about these. Let’s work together. Suddenly, you’re a partner with that product manager rather than an obstacle in their path. I’ve been there. I know how they can be. But yes, you might have to drag them along a little bit. Enthusiasm, encouraging testing, and helping them carry that forward, that’s what I think helps move them from data-obsessed to data-driven.

That’s great.
And I love how you said being data-enthusiastic can be infectious, because I think we’ve definitely caught that from your presentation, and I know I’m very excited about the data. We have another question, from Nina: in your experience, how long will it take to create a data-driven culture, going from data is scary and I don’t understand analytics to what you’ve mentioned? Months? Years?

I’d say if you’re doing it fast, with a lot of dedicated buy-in from leadership, about a year. About a year is the minimum, I think. The reason for that is that the first three months are really going to be about building up data proficiency in your community of practice. You’re going to have to get them all upskilled and comfortable. And as much as you’ve got to support them afterwards, there are still going to be those times when they’re building that up. And then the other thing is that a bunch of the things I’ve talked about are cyclical. So I talked about a quarterly insights conference. That first one is going to be really weird for them, and, not to stereotype designers too much, but they are artists, and sometimes the data can be scary to them. So there is a bit of a process to build up their level of comfort with the data as well, and their understanding of it, and to see that it’s a good process that’s helping them, that’s empowering them to change the standards in a really great way. You’ve got to repeat that process a few times. The first one is scary as heck for everybody. And these new standards going out, well, a lot of times web standards don’t even get updated as often as they should, so redoing that on a quarterly basis is a lot of effort. It’s a lot of work. But the first time around, it’s really scary. The second time, less so. The third time, well, we’ve already done this a couple of times. And by the fourth time, this is just what we do.

So it’s going to take about a year if you have really good buy-in from leadership. If you don’t, it’s going to take longer, and it’s going to be a struggle. And no matter what, there’s always going to be some backsliding, because people are set in their ways. But if you have really good leadership help, about a year. If you don’t, give it a couple of years. I know it’s a long, hard slog, but it’s worth it, because every single individual that you get buy-in from drives it forward a little bit more, reduces the waste, and increases the productivity and the user experience just that much more. It’s hard work, but you build it up, get the allies. It’s guerrilla warfare, but you can win.

That is great, and a very helpful answer, I’m sure, for most. We have a question coming from the designer side of things. How can I participate in data analysis as a designer inside a company, to bring new opportunities and co-create them with organization stakeholders? Not being a data analyst, but a designer needing data to improve strategies, designs, and processes, and not only quantitative data, but mixing in qualitative data from design research.

Okay, oh, I love that. I love that. Thank you. You can tell I’m excited. I actually started clapping like a three-year-old who’s getting ice cream. So yes. I love qualitative research as well, because qualitative research really is a great place where you learn the why behind the data. I’ve got all sorts of quantitative data, and sometimes it is just a mystery to me. I know something’s wrong. I know the behavior is weird. Why? I don’t know. But if I’ve got some qualitative research, userdata.com is one that I used to use, where people would just talk through a process, that was so helpful, because I’d be like, oh, oh, now I understand exactly what’s going on. This wording is completely incomprehensible to the user, and I can play with that. So as a designer, I don’t think that you have to have the craziest data analytics skills to use the data that you’ve got. For one, you can understand the conversion rate as well as anybody else, and you can see changes in conversion rate as well as anybody else. So go ahead and get that training on Adobe Analytics. Get the good training. Learn to understand the tool so that you can pull the data. That’s fantastic, because at that point you can just start looking at conversion and the changes in conversion. You’re the designer, so you know the design of the site better than anybody, and so you know when the changes happen as well. So when you see those changes in conversion happen, that’s just an opportunity for you to ask some more questions. At that point, there’s some collaboration that can go on with the analyst. Hey, I noticed that the conversion rate fell during this time. The design hasn’t changed. Did something else change? Fantastic. Yes, there was a big campaign that went south, or we turned off display advertising during that time, so conversion just went down. Okay, good. You know it’s not the design in that case. Or: hey, the design did change, and conversion went down. What the heck? Is it something else? Talk with your analyst, and all of a sudden you’re removing all the other sources, and you know it’s the design in that case. Well, cool. That’s fantastic news. Then you work with the analyst to figure out what that is.
You bring in your qualitative research, because you talk to people. And again, I think design is such a human-centered discipline, and I think analysis should be as well, but sometimes we get bogged down in the numbers because the numbers are very friendly to us. So having that human-centered discipline that is design, combined with the qualitative data that you have, that’s an amazing collaboration. That’s wonderful. So if you just have the ability to ask the question, that alone is enough to get you in with the analyst, and then you bring in that qualitative research together. That’s a beautiful collaboration. Just the fact that you’re able to ask the question is really going to help drive your design, from both a data point of view and an analytics point of view and all of that. It’s wonderful. That makes me so excited. Excuse me, too excited. Gotta use my slow-down note. I’ve had really great designers I’ve worked with, who I really enjoyed working with, who were at my desk every day. They were asking questions. They were like, hey, how did this work out? And working between the design team and the test and learn team, all that collaboration, I think, is so powerful. I’m so excited that a designer is asking that question. Again, get the basic training, just enough to be able to ask a question, just enough to be able to see the change. You don’t need a degree in data science. You’re not going to have to run a complicated algorithm or model. You just need to know it well enough to say, something’s not right, can you help me look further into that? Make it a collaboration. Play to both your strengths.

That was great. Thank you. And our designer, Juliana, says thank you for the excitement in the chat. We have another question coming through: many a time, I have seen a lot of ambiguity in the data itself. Marketing or category teams are not even sure of the business use case that they’re trying to solve. How do you propose we create a culture where stakeholders think through before approaching the data team?

So that’s where we start talking about having a hypothesis. If someone is proposing a design, they have to have a business case. There should be a question that we can answer with a simple change in conversion rate or click-through rate or something. So the way I think about that is, they come through and say, we want to make this change. Asking, awesome, what do we expect that change to do, and how will we prove that with numbers, is the basic question to ask there. Once they’ve got that, I think that’s helpful, basically because it clarifies their thinking about why the change is necessary and why they’re willing to prioritize it. But it also does help to put numbers on it. Data sometimes is ambiguous, and that’s frustrating. As I said, if I run a test and I get no statistical difference, that’s the most frustrating thing in the world for me. About the best thing I can say is that the change will do no harm. In that case, if there’s a brand reason or something like that, we can say go ahead. So if we’re in a situation where the business case is not well developed, that’s a really awesome opportunity for you to step in and say, well, cool. Why? This is what we’ve seen. How do we expect the numbers to change when this happens? How would we prove that this change is working the way that you intended? I think that’s a really good question to ask, and I think that they’ll appreciate it. Well, most of them will appreciate it. Some won’t. But provided you go in with an attitude of collaboration, it’s better received. It’s a really good question to ask to sharpen the business case, and also to put some stakes on the table: if this happens, we know you were correct. If it doesn’t, we can roll this back.
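To make “prove that with numbers” concrete, here is a minimal sketch of one standard way to check whether an observed conversion-rate change is real rather than noise: a two-sided, two-proportion z-test. The visitor and order counts are invented for illustration (loosely echoing the “6% worse” variant mentioned earlier), and this is one common approach, not necessarily the exact method used by any team in the webinar.

```python
# Minimal sketch: two-sided two-proportion z-test for an A/B test result.
# All counts below are made-up illustration values, not webinar data.
from math import sqrt, erfc

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Compare conversion rates of A (control) and B (variant).

    Returns (z, p) where p is the two-sided p-value under the normal
    approximation with a pooled conversion rate.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # rate under H0: no difference
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))                  # two-sided normal tail
    return z, p_value

# Control: 100,000 visitors, 4,000 orders (4.00% conversion).
# Variant: 100,000 visitors, 3,760 orders (3.76%, about 6% worse relative).
z, p = two_proportion_ztest(4000, 100_000, 3760, 100_000)
print(f"z = {z:.2f}, p = {p:.4f}")
# Here p is well under 0.05, so the drop is unlikely to be noise; a large
# p-value would be the "no statistical difference" case described above.
```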

Great. Thank you for that. And I know we have a lot of questions in the Q&A, and unfortunately, we only have time for a few more, so we will see how many we can get in. What would you do if an analyst believes a canned report or dashboard is enough, like a silver bullet, where it can help and provide a full story and/or insight to the marketers?

Oh, prove it. Take that dashboard and prove it, every single time. I’ve been there. I’ve got those dashboards. Those dashboards are great. I’m good at making dashboards, and my dashboards are pretty. So take it to them and show them exactly how you are using it, how it is a silver bullet. Data storytelling is a skill, and a lot of people with really strong analytical skills also have to build those storytelling skills, because a lot of times the answer is glaringly obvious to us. I don’t even need to make a chart half the time. I just look at it and I know what the answer is, because the data is in my bones. That’s not always the case with everyone else, because that’s not their job. So use the data storytelling skills that you have developed and show them exactly how this dashboard works. Show them that this is the silver bullet. Here is exactly how we are going to prove out everything that I’m saying using this dashboard. Once you show them the value of that, that’s really all it takes. But again, that is us as analysts having to fight our own confirmation bias, because, again, a dashboard can be stunningly obvious to me. I’ve had enough times in my career where I go, well, this is the answer, and someone goes, why? And I go, look, look, it’s right there. And then I have to back up through five different steps of reasoning to show them that that really is the answer, and show them the five other pieces of data that back it up, because that sort of intuitive, just-pops-into-my-mind thing is not inherently obvious to everyone else. So use your data storytelling and prove it out to them.

That’s great. And that ties well to the example you shared during your presentation, with the very overwhelming freeform table and then the much prettier, as you mentioned, version of it. So we have time for one more question, and we will share this one from Rose: how do you prevent people from sharing out information they interpret incorrectly and people starting to freak out?

That is where the source of truth dashboards are really, really, really important. So yes, you want everyone to have access to that information. But if it’s not coming from the source of truth dashboards, you don’t really want it broadly disseminated. These are the ones that we’ve agreed to. I’ve certainly been there and had to explain many times: well, yes, your data is correct, with the following 16 caveats. Here’s the filter you put on. Here’s the date range you put on. You’re in the wrong report suite. Also, that’s an old metric that’s been deprecated. And by the way, you’re also looking at the wrong geography. I hate having that moment, because it’s not a collaborative moment or a cooperative moment. It’s a corrective moment, and it doesn’t feel good. But sometimes you need to have it to stop the freak out. When instead you can say, well, hey, I see how you’ve got that data. Let’s take a look at the source of truth dashboards. And we call them source of truth for a reason, because that’s the source of truth. And you say, look, this is actually the situation. Let’s calm down a little bit. I see the points you’ve brought up.
Let me look into those. And then you sort of tease it out in a more cooperative, collaborative way. I think that helps with the freak out. There’s always that opportunity. I’m not going to lie. You’re going to have to put out those fires occasionally. That’s one of the downsides of everybody using data. But it’s also an opportunity to teach them, again, why the data is the way it is. And it’s a learning opportunity for them as well. You go, oh, that’s fantastic, you used the data. Let me show you how to take that data to the next level. Instead of saying, this is terrible, let’s talk, you take the opportunity to say, let’s look at it this way. Let’s enhance that perspective.

That was great. And I just want to say thank you again for such a great webinar. I know there were tons of bits of knowledge that I can take away from this immediately. And from the looks of the chat, a lot of people agree with me as well.

So before we go, I just wanted to remind everyone that this webinar was recorded, and we will be sending out a link to the on-demand recording tomorrow. We also wanted to share a few upcoming events. The Experience Makers Skill Exchange is now going to be at this year’s Summit. So join our team on March 23rd for the Skill Exchange customer learning event, with sessions focused on sharing best practices and advanced tips and tricks, like today’s, from more of your fellow expert users. We’ve shared the registration link in the Related Content section of your console, so feel free to check it out. And as always, for more information about upcoming Adobe webinars similar to this one, you can check the events page in Experience League. I want to say thank you again to our presenter, as well as all of you, for joining today. I hope you have a wonderful day.

Thank you.
