Establishing an Effective Measurement Strategy
Adobe Analytics Booster Pack opening session hosted by Adobe experts where best practices for a measurement program strategy are shared.
- Elements of an Effective Measurement Strategy
- Common Challenges & Solutions
- Examples of Strategic Frameworks
Hello and happy Tuesday everyone. Welcome and thank you for joining today’s session. My name is Katie Yoder, Senior Customer Success Strategist with Adobe’s Integrated Architecture team, and I will be your host for today’s lead-off touchpoint for the Adobe Analytics Booster Pack series. Just to note, this session will be recorded and available for all who have registered for the Analytics Booster Pack series. So thank you in advance for attending today’s session focused on establishing an effective measurement strategy. Before we begin today’s presentation, feel free to post any questions in the chat pod. We will address what we can within the chat as well as discuss broader questions at the end of our session. Our presenter today is SK Nicholas, Principal Strategist within Adobe’s Integrated Architecture team. SK has a background in strategic consulting and supports both Adobe’s customer success and integrated architecture teams, helping clients achieve value from their Adobe product solutions. So at this time, without further ado, I will turn the meeting over to SK to take us through her recommendations for establishing an effective measurement strategy. Hi, thanks Katie. I’ll go on video real quick and introduce myself. I’m SK. I’m really excited to talk to you all today. Thanks for being here and making time. Let me go ahead and share my screen first. Here we go. Hopefully everybody can see that. Looks good. Can you confirm? Awesome. Thank you. Thanks everybody for making time. Today we are… This isn’t working. There we go. You know, it always works like that: when you have a bunch of people on, it seems to decide not to work. Today’s agenda: first we’re going to talk about what we mean by measurement strategy and how it’s a valuable foundation for business goals. Then we’ll go into the elements of an effective model, those foundational pieces, and prioritizing the customer through a shared vision and a culture of data. 
Then we’re going to get into some of the finer details on how to build it, including action steps and considerations for you. At the end, we’re going to talk through a few resources that we have for those attending today. Everyone will get a copy of this and the recording. First, what is a measurement strategy? An effective measurement strategy is a collaborative, objective-based and customer-centric plan that establishes a culture of data and continuous learning and creates actionable insights. It’s the foundation of personalized customer experiences, and businesses that prioritize this measurement strategy have key competitive advantages in customer experience, business agility and operational efficiency. I know that everyone is here specifically to hear about analytics. Today we’re going to talk a little more high level about what really needs to be in place from a strategic perspective to make sure that you can really get the most out of your Adobe investments, including talking about vision and the foundational pieces of a measurement strategy as listed here. When we talk about effective measurement strategies at Adobe, it’s really a cyclical process. We’re definitely not thinking about this from a waterfall perspective. This isn’t something that starts with one team and moves through to other teams. This is a continuous cycle of learning, optimizing, understanding our customers and engaging with them. So you’re going to learn by gathering data. You’re going to build an understanding of your customers and optimize based on that. You’re going to respect your customers by delivering them experiences that are powered by AI and ML to process learnings and take informed action. And you’re going to impactfully engage with them. You’re going to be able to deliver better experiences to your customers across channels once you have a really foundational measurement strategy in place that prioritizes that customer experience. 
I can’t overstate how important having a measurement strategy is to a business overall. And like I said, I know that we’re here to talk about analytics and get the most out of those tools, but this is where we see some of our customers have real challenges in getting the most out of them. So when you have that foundational strategy and everyone is aligned, you’re going to optimize customer experiences. You’re going to create measurable business value and improve your team performance. The way that I like to think about this slide is that the first column is really all about your customers. You’re going to be able to optimize that experience for them. The second is really all about your overall business and creating value for that business. And the third is perhaps a little bit more selfish of a goal, but also very important, which is improving team performance and individual performance on teams that interact directly with Adobe Analytics: having that efficiency internally and also building confidence on teams with skill enablement and helping our teams get the most out of tools. The reason that we really need this foundational measurement strategy overall is that it is the key to personalization. We are living in a world where personalization is fueling every business, and measurement strategy is not just tracking and reporting. It unlocks the ability to deliver that personalization at scale that customers don’t just want but expect. So the bottom rung of this, which is digital infrastructure, is where you get the tools in place to create, manage and distribute personalized experiences. That’s the basics. You have to have the tool that enables you to do it. Then you’re going to create this measure-and-learn strategy, an objective-based measurement strategy to learn motivations, wants and needs. We’re really moving away from the persona-based delivery method and basing this off of behaviors and actions that your customers are taking. 
And then the top is where you’re actually delivering those personalized insights and engagement: acting on the insights to deliver real-time personalized experiences for every customer on every channel. What are the risks if you don’t have a strategy? This is something that we see a lot. We see people coming to us with these challenges where they implemented technology without the appropriate strategy in place. These are some of the largest risks that we see and what we help to mitigate with our customers. So inefficient practices are going to put you and me and lots of other people in extra meetings, which none of us want. They’re also going to create a lot more internal overhead and lead to less valuable insights, which creates higher costs overall for the business. Lack of actionable and accurate insights from the customer: this one is relatively straightforward, which is unhappy customers. And the third is that without everyone moving in the same direction and understanding what that North Star is, use case development gets a lot slower. It’s harder for you to prioritize which use cases need to be enacted. It becomes more difficult for people to align on feature activation. And it really lengthens that path to value out of analytics; it makes it more difficult to enable those capabilities that lead to you getting value back out of the tool. Next we are going to move into the elements of an effective measurement strategy. This is where we’re going to get into some of the details of what makes a measurement strategy either very good or perhaps lacking in some of the foundational elements. I would argue the most important here is that it starts with a shared vision that is across the organization, not specific to a business unit. Across the organization, there needs to be an understanding and North Star of where you’re headed and why. That will then guide implementation and prioritization. 
So we’re starting with that shared vision and breaking it down into smaller pieces. One thing that we see a lot with measurement strategy, especially in tools like analytics, is that across different teams, they’re really starting all the way down at the KPI level, which leads to some vanity metrics. It leads to KPIs being used to measure the success of teams rather than the success of an organization or as an indicator of creating really valuable customer experiences. Another thing here is that when you start with that vision, it ensures that different business units or different channels or different groups, especially at really large organizations, don’t have competing objectives. I think we have all probably been in a room where perhaps the marketing and the technology team aren’t seeing eye to eye on the path forward to solve a problem. Making sure that they are sharing a vision and understanding what experience they’re trying to unlock for the customer really helps with that. The second here is customer focused. The foundation of a great measurement strategy is just being obsessed with your customers: taking the time to prioritize customer experience, metrics and value for the customer. So when you’re doing your roadmap planning, when you’re understanding what’s coming up, make sure you are prioritizing what it unlocks for the customer and how it’s going to impact them. Now, of course, there are still going to be operational business metrics. We can’t overlook that. They are always going to be included in the measurement strategy. Things like time to market or the cost of assets are what I mean by operational business metrics. And those, of course, need to be included to make sure that we’re making the most of the investment, but not at the expense of customer experience. They need to be balanced, and you need to think also about what experiences might need to be prioritized. 
This really leads to the idea that experience should be driving technology and not the other way around. When we’re mapping capabilities and features, we need to understand that those should be prioritized based on an experience. The capability shouldn’t drive how you’re interacting with your customers. That’s really important because that’s what gives you your competitive advantage with your unique group of customers. The third here is around a culture of data. This is where the processes and roles and responsibilities and every aspect of your strategy should go in the direction of establishing a culture of data. People should expect that accurate and actionable data will be delivered on a recurring basis, that there will be proactive data insight reviews, and that effective storytelling will help them understand how it impacts customers. This is also where people, if it is not effective, fall into the trap of reporting KPIs and reporting success in terms of teams and individuals and not necessarily on the experience that was created. The reviews should focus on action steps and iteration. And when you’re sharing out data, people should understand where it’s coming from and be aligned on the naming of the data, what these different metrics mean and how it impacts customers. All right, we’re going to dive into some details around what I mean by shared organizational vision. I think there are lots of versions of this type of chart where we’re tracking and mapping value all the way down from an organizational vision to KPIs. But I think this is really important, especially when we’re talking about the application of specific tools, to see how these things really track down all the way to an individual experience and then an individual metric. So the vision is the North Star for the organization. It establishes where the business is headed and why they’re headed that way. I think that’s very important because that why is crucial to being able to establish the strategic pillars. 
So these are the directional goals of the business. These are getting a little bit more specific about how we’re going to achieve the vision. On the next slide, I’ll take you through examples of each one of these. That’ll make it a little bit more specific, which might help, especially on these top two that are a little high level. The vision and the strategic pillars should really be something that’s coming from executive teams and product owners. This should be given to working teams. It should be given to solution leads so that they understand how to break that into objectives. Understanding roles and responsibilities around this is super important. Otherwise I think it can become a little bit of a game of hot potato where people are expecting more information to be coming their way to act on. Then we get into objectives, which are the levers that need to be moved in order to support those strategic pillars and vision. This is really important because this level is what informs capabilities and feature adoption. Only once you have these objectives can you really appropriately roadmap and plan use cases. Now this is an area that becomes a little complicated with the implementation or the growth and maturity of any technology, because a lot of times, even us, we tend to jump right into use cases. That’s really an application of technology. But you need to understand what that technology is trying to unlock and be aligned across the organization on the vision and pillars and objectives in order to appropriately prioritize those. And then after you have roadmapped those use cases is when you get into the level of KPIs. And KPIs should strictly be metrics that help you measure whether you’ve successfully impacted your objectives and vision through a use case. 
KPIs should not necessarily be used to measure the success of a team or individual in the way that we’re talking now, because even if the KPI is low, even if it perhaps doesn’t achieve what you thought it would or isn’t as high as the number that you wanted, it is still giving you really useful information about where you need to iterate on that use case. How can we take it even further, or what prevented it from reaching that number? And that’s where you get back into that learning circle that we were talking about earlier: these should be used to inform next steps and iteration on the success of an application of a use case. So that was all very strategic. Let’s get a little bit more specific. This is completely made up information about a sporting equipment store. So the vision that came from executive leadership is that they want to get the largest market share for purchases of sporting equipment. Again, this is a retail example because I think it’s the simplest to map to this for understanding. And then also coming from executive leadership would be that in order to get that largest market share, we really need to create a best-in-class customer experience. That’s where we’re getting a little more strategic, a little more directional. There would absolutely be more pillars here. This is one single example. There would likely be three or four. Perhaps the groups that you’re in won’t actually touch all three or four of them, but it would be identified which ones they are expected to impact. And then across working teams, probably marketing teams with an objective, you would break that best-in-class experience into, well, what’s a way that we can create a better customer experience? In this case, this customer wants to improve reliability across product recommendations and the required accessories for a product. So this is something very specific that they can do that will create a best-in-class experience in order to grow market share. 
And then it becomes a use case. Then it becomes, how are we going to apply this? The specifics of the use case are, well, in order to do that, we need to gather information on products that are frequently purchased together. That’s a report that’s already available, or perhaps it’s something that you’ll need to go get. You’ll also be looking at required accessories for certain products and adding a reminder in the cart before a customer checks out, something like, are you forgetting something, or, these products are usually purchased together. And how are you going to measure whether that works? Because there are lots of other ways to improve reliability across product recommendations; this is one possible way to accomplish that. How are we going to assess whether that specific application worked? Well, we’re going to see if it increases order size. We’re going to see if people add the recommended products to their cart or if they have to make follow-up purchases of those related products. We can look at customer satisfaction. Another one here you could look at is returns. And then you’re going to track that behavior and see where they’re clicking when that messaging is displayed. Do they leave checkout and go add the other product? I think this is also where, in the use case, you could get into testing. You could understand if the checkout is the best place for that, or whether it should be added to the product page. There are lots of other options. And then the KPIs are going to tell you if that application is successful.
The next pillar we’re going to talk about is around being customer focused. There has really been a tremendous shift in the last few years in what it means to be customer focused in the world of personalization. As we’re moving towards truly one-to-one personalization with unique user profiles, we’re not relying on demographic-based personas to inform messaging. It’s not necessary, and personas tend to be a little light; being based on demographic information, they can easily lead to biases and stereotypes. Instead we’re going to flip that around. We’re going to look at the specific behavior of a user and a customer across all of the channels and create a comprehensive profile that establishes what their motivations are. That can tell us what they’re looking for in order to deliver targeted messaging. So where previously personas would dictate a user group and dictate messaging, we’re really flipping that around now. We’re looking at actual behavior from customers that is then used to form segments and create targeted messaging: to create more accurate groups, to create more targeted messaging, and to make sure that we’re really, again, striving towards that unique customer experience. One application of that: this is a feature and capability prioritization maturity model that is based around the idea of customer centricity. As we’re looking at roadmapping new feature adoption or capability prioritization, are we prioritizing the customer first? It makes sense when you have a new tool, or you’re still early in adopting a tool, that you would prioritize those features and capabilities based on internal metrics, things like feasibility and desirability and speed to market. But that does give you a limited ability to really impact customer experience, because you’re focused on just getting the technology stood up. And then there’s real risk if different internal groups are using different metrics. That’s where you get some of that push and pull between marketing and technology teams. 
As you grow in maturity with these tools, you want to prioritize that adoption more and more based on customer centricity. So at the crawl stage, you might still be working mostly with internal metrics and value, but focusing on maybe really high value customers. You still have a limited ability to impact that customer experience, but you’re starting to add a little bit more clarity towards the customer. And as you move through walk and get to run, you’re going to create a lot of consistency in experience for customers as they move through their journey. It’s going to be more targeted to them. And these capabilities are going to be easier to map as you move forward because there’s alignment across teams about the crucial pieces that you’re trying to impact.
The last pillar here comes with a little bit of a joke, a Mark Twain quote for you, around the culture of data: data is like garbage. You better know what you’re going to do with it before you collect it. I love this quote. I think it’s very funny, but also very accurate for the world that we’re working in. We want to be thoughtful about where the data is going, how an organization is accustomed to seeing it, and where it’s intended to be used. This is, again, going back to that point around making sure that your culture of data is focused on next steps and innovation and iteration on customer experiences. That’s where it’s going to go after you collect it. If you are just gathering KPIs or getting a series of metrics for KPIs and it becomes a kind of list report, it’s very difficult to make actionable. It becomes more of a success metric for teams and individuals rather than directional information about what’s working and not working in the way that you’re building your business. Next we are going to get into how to actually start building a successful measurement strategy. We’re going to start with some of the common blockers that we see in implementing measurement strategy. I think these are true across the board in organizations that are trying to move in this direction. The first, I’m sure, will not be a surprise after I talked so much about vision and metrics and goals: organizational misalignment. Without a shared vision, you’re going to have silos. You’re going to have competing priorities. There are always going to be different ways to solve a challenge. Make sure that you’re moving in the same direction, that you understand what you’re trying to solve when different groups come to the table together, so that you can truly sift through those different ways to solve the challenge rather than arguing about what the challenge is. 
I think that’s so important because it redirects the conversation to solving problems together rather than disagreements on what the problem is. The second is gaps on the team. I think this has become an even larger challenge and more prevalent recently because there’s been a lot of turnover on data and insights teams. That’s just the culture that we’re in right now. Without proper documentation, and with a lot of people changing roles or changing teams and growing in their careers, there are a lot of customers that are missing out on capabilities and value. Make sure that you’re giving your team the tools and resources they need to feel confident in the solution as you get new team members or as you identify gaps in what is needed to complete the task. Make sure that you’re documenting that, curating enablement, and getting the resources that are needed. The third is everyone’s perhaps least favorite phrase: governance and documentation. Without governance and documentation, things become very disconnected. People are going to use tools in different ways. They’re going to use reports to mean different things. They’re going to understand the definitions of metrics differently. It becomes a lot easier to actually act on these insights if some of those details are sorted out upfront. The fourth here is around data storytelling, which is an effective framework for sharing insights. Without that human aspect of telling a story, data can fall flat or be hard to act on. Executives or marketing teams or any other business group might see a report or a data visualization without key insights pulled out or without understanding the impact, and not really know where to take it. It can fall flat. They have different priorities. They’re looking at different things. So ushering people through that story can have a lot of impact. 
So consider implementing data storytelling training or frameworks for how reports are going to be delivered, so there’s an expectation around that. To gain momentum and buy-in, it’s going to be really important that you get other people to understand the value and impact of the data.
So the common blockers, if you flip them around, are also the first steps to implementing an effective measurement strategy. These are the very first things that we need to do. The first is to define that vision and objectives. If you are a user of the tool or a product owner, it’s really going to be your responsibility to get some of that executive alignment: explain how important it is that as an organization you have a clear vision and strategic focus areas so that you can build out and map use cases and capabilities. That will inform capability mapping, prioritization, and key performance indicators, and it will also help you tell a story about how great you are at your job. You can show how you’re actively getting closer to that vision and impacting major strategic objectives for the business. The next is around assembling a team: taking a moment to assess whether there are skill gaps or resource gaps on your team and creating a plan to address them, making sure that people have ownership of tasks and that all tasks are covered. This is going to do a few things for you. One is it’s going to inspire teams. Having ownership and having something that you are really responsible for at an individual level can be really motivational and inspire that confidence that makes you want to get even better and more mature with analytics. It also ensures accuracy. When someone is responsible for owning it, they’re going to be more likely to make sure that that data is right. There are tons of enablement tools available through Adobe. We’ll talk about some of them, and I have a resource model to share with you as well in a few minutes. 
I think what’s really important here is also the assessment phase, where you’re really taking the time to talk to the team and see what those skill gaps or resource gaps are and making sure that there’s at least a plan to cover those, because, as we all know, a lot of people changed jobs in the last few years, and there are gaps in most organizations right now that need to be looked at. Last is around telling a story: creating a framework for sharing insights that prioritizes the customer and shares insights as a story. We are humans. We react best to understanding human impact, and the data should be giving you the tools to tell that story better. It is a tool to hone human intuition, to validate human intuition, to help us tell a better story and back up what we are thinking. This is really about going back to a hypothesis-and-testing approach to capabilities and to measurement. This is something that not everyone who’s super numbers-focused naturally has as a skill set. It’s something that takes a lot of time and commitment to get really good at, and giving them a framework or a training or something similar can be really helpful for getting that buy-in across the organization. I’m going to dive into a couple of these a little bit more, the first around building a team. Again, you’re going to assess the skill gaps, curate enablement plans, document skill requirements, define clear roles and responsibilities, define a check-in schedule for new features and capabilities, and encourage curiosity, risk-taking, and innovation. Two of these are really important and also overlap with the governance aspect: the documentation of skill requirements and clear roles. That allows people to see, hey, this is really what’s expected, here are the skill requirements, and it allows them a place of honesty where they can say, these two aren’t my strengths. What are the tools that I can use to get better at those? 
It’s also so that when you’re bringing in new talent, it becomes very clear what lies within your responsibility through clear roles and responsibilities. There are new features often, so make sure that there’s a reminder on the calendar for the product owner or the team to take a look and see if there are other features and capabilities that could impact those strategic business pillars. And then the last one I think is really important around organizational behavior and change management, which is encouraging curiosity and risk-taking, encouraging innovation, and making sure that individuals understand that if a KPI maybe isn’t where they wanted it to be, that isn’t necessarily a failure; it means they learned what doesn’t work, and then they can pivot and iterate on it to improve upon it, so that they are more willing to try new things to impact that customer experience. Next, I’m going to get a little bit more specific around assembling a team by talking about some of the key roles. These first two are on their own slide because they are that important. The executive sponsor: this role is so critical, and a lot of times the analytics owner and executive sponsor kind of get squished into one role, which is a really challenging place to be. The executive sponsor needs to define those long-term goals, allocate resources, and build partnership, and they’re really responsible for sharing and disseminating that objective and vision both internally within their team and externally across the organization. That guides implementation decisions, adoption and usage, and being able to put a finer point on how analytics strategy supports the overall business. They should also be responsible for getting feedback and building trust, and this is really a main stakeholder, a main person, for building that culture of risk-taking and innovation and trust. 
The other role that’s super important is the analytics owner; they’re fundamental to success. They’re the key contact for executives and business partners. They own analytics within the organization; it’s really the one person that is the owner of the tool itself, so they focus on corporate-level issues while maintaining visibility. They’re the driver of cultural change and product adoption. They would be responsible for creating that training plan, for documenting the skill requirements, and for making sure that individuals know about what enablement tools are available to them. They’re going to look for opportunities to use analytics in new and interesting and growing ways, and they’ll likely, though not in every organization, be the manager of the core team, really helping them support the organization’s KBOs, so they’ll be at that middle stage of defining objectives and coming up with new use cases and applications of the technology. I’m not going to talk about all of these in detail. This will all be provided to everyone at the end of the session today, but these are four very specific roles, and what they are responsible for is pretty crucial to the upkeep and success of the tools. So this is a good checklist for you as you’re building that team and as you’re doing the assessment of what might be missing in terms of responsibilities or in terms of skills. This is a good place for you to start and make sure that you can check the boxes on these and make sure that it’s covered. I’m going to talk about governance and documentation next. This is going to be a little more general than assembling a team because I think depending on the organization, depending on team structure, depending on size, this can look really different, but at minimum, here are some of the things that really need to be considered. One is establishing that meeting purpose and cadence. 
This is around creating that culture of data where people know when to expect that reports will be shared and how they will be shared. It shows up on their calendar. They understand exactly what it’s for. They know that they’ll be receiving insights and that that happens really consistently and dependably throughout the organization. You’re going to document those rules and requirements. You’re not going to let people argue over the rules and requirements and start handling things differently in different organizations or different business units; prevent that as much as possible by documenting the rules and making them very shareable and available. Don’t hide them on some internal SharePoint or something where they’re difficult to find and hard to reference. Making them very easy for people to find so that they can adhere to those rules as much as possible will really help. Clearly document metric definitions: even if you think people are on the same page about what click-through rate is, make sure it has a definition. Make a cheat sheet so that as you’re sharing reports, people know exactly what they’re looking at and what it means. The same goes for nomenclature across teams and channels. A lot of times analytics will be handled completely differently across different channels or across different business units. Make sure that you’re using the same language to describe metrics and to describe your strategic approach, or at least make sure that people know both terms and there’s alignment there. You’re also going to document processes and build in adherence standards: making sure that the processes for pulling reports or tagging, basically everything associated with it, are documented, and that you’re building in some sort of standards to make sure that they are followed, whether that’s in the tools themselves or in user roles and responsibilities, but making sure that there’s at least some sort of check that people are following those rules. 
Oh, there was one more. Developing a formal tagging intake process is a huge portion of this. That one is more specific than some of the others, but it's something we see missing quite often: as requests for tagging come in from different groups across the organization, make sure they understand what information needs to be provided in order for it to be successful. All right, now we're going to talk a little bit about telling a story. I think we all probably saw this in grade school in a very different way: a story arc with the rising action and the climax of a story. This is also how we need to think about sharing data and sharing what we have measured. The beginning of these reporting meetings that you're going to set up a great cadence for is setting clear expectations: understanding exactly what's coming right at the beginning, an overview of the methodology, and the data highlights. This is where you start to pull out some key pieces. You're starting to tell that story; you're starting to weave them together. What's really important here is that you also want to share meaningful customer insights or a meaningful customer story. Maybe this is through a user test, or a specific experience, or a statistic about what a group of customers is doing. It's going to be really important for you to pull that in and talk about the customer impact here, not just deliver a readout of data points. And then there's the big moment where you give the key takeaway and the impact it is creating. This is where you'll likely pause and allow people to ask questions and understand the impact, and then you'll share recommendations for addressing that key takeaway and the impact it will have: this is what we've learned, and here are some ideas about how we can either improve it even more or correct what we found as a challenge. And at the end of these, you're always going to collaborate on that work breakdown.
Make sure that those key takeaways and recommendations have set expectations for who is going to handle them and how, and remind people of when you will be sharing the next story. Some other key points for how to lean into data storytelling: keep it simple. Again, define metrics as needed; you can tell I think that's very important. You need to make sure people know what they're looking at. Make sure to have all the right people at the table; this might look different depending on what those key takeaways are. Lead with that customer perspective and the impact it has on your users, and make sure to include both customer impact and business impact: this is how it impacts our customers, and because it has that impact, this is the impact it has on our bottom line and our business. And, like I said, identify clear next steps and responsibilities. And document needs or blockers: if the next steps, the next report, or anything else identified in that work breakdown comes with additional needs or things that are blocking you from being able to complete it, make sure that's included here. Don't make people guess about what those needs are. And align back to the vision, pillars, and objectives. As you're sharing those takeaways, make sure it harkens back to how it impacts this objective and this strategic pillar and how it gets us closer to the vision. All right, I'm going to take you through just a couple quick resources and then we will have some time for Q&A. The first is a maturity model. This is good as a reference point for how you're doing in rolling out a measurement strategy; it covers documentation, data collection, compliance, data value, how you measure use cases, your architecture, and accountability. You can even have different users fill this out based on where they think you are and compare results.
Or it’s also easily used to set goals of where you want to get to and by when. This is just a reference that everyone will have access to. We’re also going to have a measurement strategy checklist that gives you just some basic questions of how you’re doing in creating that measurement strategy and in prioritizing upcoming initiatives. This is great for team meetings where you’re trying to look forward and trying to plan the actual execution of a use case. The last slide under resources is just indicators for a need for measurement strategy. These are things that we hear often from customers that really say to us, this customer might need to focus on some aspect of creating a measurement strategy. Something is missing and preventing them from really being able to move forward and get more value out of their solutions.
I’m going to stop sharing for just a moment and pull up one other resource I want you all to have that will be sent at the end, which is an executive success guide here. This has really great information in it around sharing your vision, around assembling teams, around governance pillars. It’s really a great tool if you are trying to get executive buy-in or trying to really have more impact across the organization and building the right talk track to get that for you. Okay. Going to stop sharing. I am going to also recommend, Katie, that we go ahead and stop the recording for Q&A and give it over to the team to see if they have any questions or would like to go back and focus on anything specific. Thanks for your time. Definitely thanks for what you walked us through today, SK. I think the depth of the materials there is a lot. I look like everyone is still processing that as we haven’t had any questions come through. But as you begin to digest what SK has shared with us today, feel free to reach out to us, your Adobe team, and we can follow up from there. Thanks again, everyone, for joining today. We look forward to connecting with you further during the Analytics Booster Pack series that’s taking place throughout the month of August. We’ll be in touch, and thanks again for your time.