Top 10 Digital Analytics Doctrines to Live By

In this session, Ash will share 10 Adobe/digital analytics management strategies, discussing the problem statement and common practices for each and providing an effective solution.

Transcript
Hello everyone, and thank you to the Adobe team for giving me this opportunity to address this curious group of digital enthusiasts today. I'm here to share the top 10 doctrines, or guiding principles, that have helped me as a digital analytics team lead. The topic of today's session really pushed me to think through the experience I've gathered over the past decade or so. I had to focus on what really mattered most and find the rules of conduct that could truly move the needle in determining the success of the team. So in a way, the next 15 minutes will really be about my approach to ensuring that digital analytics teams become, and continue to stay, relevant to an organization.

Now, I have spent almost two decades in the digital space as a professional, and I have been closely associated with the operations of a fairly wide range of companies. In my experience, while organizations are at varying stages of digital maturity, they almost always follow a similar path to it. Initially, there is a sense of denial about the magnitude of value that digital offerings can add to a business. Thanks to COVID, though, very few businesses can afford to be in that zone now. The beginning of acknowledgement of the need to become a digitally savvy organization is, to me, the second stage of digital maturity in the life cycle of an organization. As the interest in digital grows, organizations start to realize how their initial digital efforts need to be recalibrated and expanded to yield digital economies of scale. This is stage three in the chart here, and it is usually characterized by an organization-wide demand for digital data. Following closely on the heels of this stage is stage four: maturity and complexity. With high demand for digital data and changing management, resources, and tools, teams often find themselves sandwiched between the need to churn out high volumes of complicated data, deal with high expectations from stakeholders, and deal with process or data gaps, all at the same time. How teams manage themselves in stage four will then determine whether they stumble over archaic, less-than-optimal processes and legacy issues, or whether they reinvent themselves and realign with the higher expectations of their organization. Since this session is part of the Grow track, I hope most of you here today are somewhere between stages three and four, and that what I just described resonates with you.

Before proceeding further, let's take a moment to understand what the goal of a digital analytics team really is. To me, a digital analytics team has two main goals. Number one is to increase the adoption of the insights that it generates throughout the organization. This can only happen if stakeholders across the organization unanimously agree on the value generated by digital analytics in positively impacting business KPIs. Number two is to maintain the trust and confidence that the larger organization places in the digital analytics team by consistently providing high-quality reports at speed and at scale. The 10 doctrines that I will now jump into are categorized into two parts and speak to each of these two goals.

So, goal one is to increase adoption. Doctrine number one that speaks to this goal is: categorize your stakeholders. What do I mean by that? Digital analysts may get queries from product managers, marketing leads, developers, testers, customer care officers, and even finance team members.
Understandably, it can be a little overwhelming for an often small digital analytics team to manage multiple requests coming from people in different parts of the organizational hierarchy and to stay relevant to all of them. The way I approach this challenge is through categorization and differentiation. When an analytics request comes to my team, we try to find out more about the stakeholder. Why do they need the data? How does it impact their KPIs? What similar activities can the data be compared to? Based on the responses to these queries, my team is able to categorize stakeholders and decide important aspects of the delivery, like the granularity of reports, the frequency of reporting, and what kind of benchmarks will be meaningful to the stakeholder. Accordingly, my team is also able to categorize the outputs depending on the kind of stakeholder we are dealing with. For example, if the stakeholder is part of top management, detailed reporting is probably redundant for this person. Instead, it might be more useful to provide a feed of digital data into the company MIS system for regular automated updates on digital KPIs. However, if we are talking to a campaign manager, we might need to provide quite a granular level of reporting, complete with specific comparisons to previous campaigns or even to competitor campaigns. It might make sense to create dashboards for certain stakeholders who need similar data at a very regular interval. These dashboards can have built-in filter options that enable the stakeholder to pull historic data without depending on the analyst. So this slide provides one of the many ways to categorize stakeholders. There can be many, many ways to respond to the varied needs of stakeholders, and the solution that you eventually adopt will, of course, be very specific to your organization.

Moving on now to doctrine number two, which is to limit your metrics. What do I mean by this? We are all aware that Adobe Analytics alone allows the creation of 100 custom metrics and an unlimited number of calculated metrics. And this is just Adobe Analytics. If your organization is present on social media, you will know the range of metrics available on each of the social platforms. I wish I could meet you in person right now to ask if you have ever joined a digital team as a new recruit and come across a legacy of possibly redundant metrics that are never really applied to solve real business challenges. Because I know I have, in every single organization I have joined so far. It doesn't take a lot of imagination to gauge how quickly this could become an operational disaster as teams mature and the scale of deliveries grows within the team, because every custom or calculated metric will need maintenance, often across multiple platforms that don't necessarily have the same layouts or tech stacks. It is a good idea, therefore, to use a finite set of highly relevant metrics. Working with a wide range of metrics is hazardous not just from a governance point of view; it is also counterproductive from an adoption perspective. And that is simply because we are all actually quite forgetful. The slide here explains what I'm referring to. Hermann Ebbinghaus was a German psychologist who pioneered the experimental study of memory. According to his research findings, we generally forget about 60% of what we have just processed within the first 20 minutes.
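As a rough reference point, this forgetting curve is often summarized with a simple exponential decay model; the model below is a common textbook simplification rather than Ebbinghaus's original tabulation, so treat the exact shape as illustrative:

```latex
% Simplified exponential model of the Ebbinghaus forgetting curve
% R(t): fraction of material retained after time t
% S:    relative stability (strength) of the memory
R(t) = e^{-t/S}
```

A metric that a stakeholder sees once and never again has a small S, so its meaning decays quickly; repeated exposure to the same small set of metrics effectively increases S and flattens the curve.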
This effectively means that the more metrics we introduce, the more difficult our reports will be to comprehend, remember, and execute upon, and the less likely our organization will be to adopt our insights. Therefore, rather than increasing our range of metrics, it makes more sense to work off a limited set of core and support metrics. With regular use of these metrics in various digital reports over a period of time, the chances of these metrics being accepted into regular, day-to-day business language are high. And that in itself is a determinant of a high adoption rate for digital analytics in the organization.

Doctrine three is about having a finite set of dashboards. What do I mean by this? Well, just like a large list of metrics, I have often found digital analytics teams struggle with a high volume of high-maintenance dashboards created by legacy teams. Often these dashboards are rarely used for active decision making by leads, and over time analysts lose a sense of purpose or fulfillment in keeping up with this legacy load. Having access to a high volume of dashboards can also be confusing for the user, who might not be sure which dashboard to use for a particular query. It might also be frustrating and time consuming to wade through a lot of dashboards to dig around for relevant data. This defeats the very purpose of a dashboard. Therefore, it is critical that dashboards are relevant, limited in number, and easy to access, understand, and apply. So while, on the one hand, a fair amount of thought must be put into creating a limited set of dashboards that can be used for a wide set of business queries, an equal amount of effort must be applied to encouraging active adoption of these dashboards by stakeholders. Thought and time must also be set aside to retire dashboards that have become redundant because of shifting business focus or changing stakeholder requirements.

Doctrine number four: don't get siloed. Another death trap for us digital analysts is the tendency to dedicate all our work hours to creating these beautiful, high-quality reports but to lose touch with the real business in the process. No matter how close we are to the report generation process, it is critical to get feedback on how these insights are ultimately being used by the business. Now, trust me when I say that I know just how much goes into ensuring that all regular and ad hoc deliverables are generated on time and in an error-free manner. The analyst's life is straddled not just between report generation and stakeholder management; a fair amount of work also goes into simply governing, auditing, bug fixing, automation, and several other BAU tasks. But trust me also when I say this: the stakeholders who receive these reports are likewise straddled between the need to decipher the insights from the report and attending to business as usual. Information can be misunderstood, and worse still, the key message may never land with the stakeholder. Which is why, and precisely because the analyst has already spent a significant amount of effort in generating the report, it falls to the analyst to also try and ensure that the key insight from the report is understood in the correct perspective by the stakeholder. Shining the light could also mean actually spelling out the insights, including application areas in the business, in case a discussion on the report with the stakeholder is difficult to arrange.
And the analyst will only be able to add this kind of value to reports if the key information from doctrine number one, categorizing your stakeholders (remember slide six, a few slides ago), has been gathered beforehand. So you can see how each of these doctrines ties into and feeds off the others in some ways.

Doctrine number five: train your stakeholder. What do I mean by this? The organization is never static: team members, responsibilities, and business needs all change from time to time. Analysts, who are closest to the analytics software and the data, might feel that their reports and dashboards are quite intuitive, and they might well be right. But the stakeholder's focus is not just data. The stakeholder is focusing on regular operations and a host of other things. Often a report or a dashboard may not be used to its fullest potential simply because of a lack of understanding of the report or the dashboard and its applications. Regular training initiatives help analysts overcome this challenge with stakeholders. By regularly training stakeholders on multiple aspects of reporting and the analytics software, analysts not only increase adoption of their reports but also reduce stakeholders' over-dependence on analysts for simple tasks. And this frees up analysts' time for higher-value activities. Remember Ebbinghaus's forgetting curve shared earlier? This chart is a corollary to it: it shows how regular repetition helps improve memory retention. The five doctrines shared with you up till now are essentially ways in which we can use a finite set of metrics and dashboards repeatedly in regular business conversations to increase adoption of these reports in the organization. Stakeholders may also need some help with applying the insights to business challenges, but soon they will catch the drift. And before you know it, digital will be an integral part of the organizational culture.

Let us now look at the second goal of digital analytics teams: maintaining credibility. For businesses to take decisions based on analysts' reports, these reports need to be error-free and on time, every single time. Poor-quality reports reduce the credibility of analytics teams, and this credibility can take months to recover. So the next few doctrines I share are meant to help mature teams in the continuous delivery of high-quality reports, as well as in maintaining their credibility.

Doctrine 6. Govern your segments. What do I mean by this? As we all know, segments are ways to filter data in Adobe Analytics. These segments can often be composed of pretty complicated conditions; I have seen segments that have more than 50 conditions. In larger organizations, where the analyst has little control over the web structure, there might be very little visibility within the analytics team into changes to the digital properties. Add to this the fact that the segments used for some important, high-visibility dashboards and reports could have been created by team members who no longer manage them. The reasons behind using these conditions could simply have faded from the organization's collective memory. A newly appointed analyst might then be tasked with regenerating a report without a full understanding of the backend mechanics behind it, and hence validating the report may be problematic. To avoid such a scenario, it is important to govern at least the core set of segments that are used for generating the most high-profile reports. I have listed some of the many ways in which this can be done.
We can maintain a list of core segments. We can maintain a list of the most high-profile reports. We can do periodic checks of the conditions used in the filters. We can request product managers to proactively communicate timely information on digital property changes. It is also important to give this task its due by scheduling time for regular segment quality checks.

Doctrine 7. Govern the tag manager. Again, in mature digital analytics teams, the tag manager is likely to have been used by many people who are no longer part of the team or who no longer hold the same responsibility. Responsibility for managing the tag manager could also have shifted around between teams, or even between an agency and the brand, in the past. It is likely that when a new deployment specialist inherits the tag manager, the tool hosts a bunch of rules that are redundant. Redundant rules not only make the deployment and testing of new functionality a nightmare, they can also slow down website performance considerably. Not to mention the additional cost an organization may be racking up in redundant server calls. So, unpleasant as it may be, the housekeeping of the tag manager is an essential task that must be undertaken diligently.

Doctrine 8. Keep a log of data adjustments. The next doctrine is again for mature teams that have inherited high-visibility reports and dashboards with legacy data going back months and years. Since the structure of digital properties keeps changing due to changing navigation, features, and technology, comparisons between old and new versions of the same digital property may not be as easy as they sound. There might also be tagging issues or bugs that could cause temporary gaps in the data, or inflation or deflation of the data. Depending on the value of the dashboards and the kind of business decisions being made on the digital analytics, some data adjustments may become necessary from time to time to cover for these data gaps. It is prudent to keep a record of these data adjustments and anomalies. At the same time, it may also be wise to set expectations with stakeholders on how accurate the digital data is expected to be. Digital analytics data has many advantages, like speed, scale, and granularity; as a corollary of these very qualities, it may not be 100% accurate, and it is important that stakeholders realize this.

Doctrine 9. Plan for resource movements. Analysts are an eager bunch looking for accelerated growth, and resource movements are all too common. Unfortunately, project continuity is more dependent on human soft skills than we would like to acknowledge, and frequent resource movements impact delivery, communication, team comfort levels, behavior, and culture. So the best way to be prepared for this inevitable event is to have backups. Checklists, documented processes, email backups, and a bifurcated SPOC (single point of contact) responsibility all help ensure that human resource movements inside the digital analytics team are as seamless to the end stakeholder as possible.

In the same vein, the next and last doctrine for today is to emphasize automation. One of the ways to maintain the attractiveness of digital analytics roles is to ensure continuous learning and to reduce repetitive tasks. Repetitive tasks are best left to software, while intelligent human resources dedicate their time to higher value-adding tasks. Automation also reduces the possibility of errors in reports and helps ensure consistently high report quality.
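To make this concrete, here is a minimal sketch of the kind of repetitive pull that is worth automating: a small Python job that requests yesterday's visits-by-page breakdown from the Adobe Analytics 2.0 reports API and appends it to a CSV. The endpoint path, payload fields, environment variables, and the metric and dimension IDs shown here are illustrative assumptions to verify against your own report suite and the current API documentation.

```python
# Hypothetical sketch: automate a repetitive daily report pull from the
# Adobe Analytics 2.0 API (verify endpoint and payload against current docs).
import csv
import os
from datetime import date, timedelta

import requests

# Assumed environment variables holding credentials for this sketch.
COMPANY_ID = os.environ["AA_COMPANY_ID"]
ACCESS_TOKEN = os.environ["AA_ACCESS_TOKEN"]
API_KEY = os.environ["AA_API_KEY"]
RSID = os.environ["AA_RSID"]


def pull_visits_by_page(day: date) -> list[dict]:
    """Request one day's visits per page; returns one dict per row."""
    url = f"https://analytics.adobe.io/api/{COMPANY_ID}/reports"
    payload = {
        "rsid": RSID,
        "globalFilters": [{
            "type": "dateRange",
            "dateRange": f"{day}T00:00:00.000/{day + timedelta(days=1)}T00:00:00.000",
        }],
        "metricContainer": {"metrics": [{"id": "metrics/visits"}]},
        "dimension": "variables/page",
    }
    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "x-api-key": API_KEY,
        "x-proxy-global-company-id": COMPANY_ID,
    }
    response = requests.post(url, json=payload, headers=headers, timeout=60)
    response.raise_for_status()
    rows = response.json().get("rows", [])
    return [{"date": str(day), "page": r["value"], "visits": r["data"][0]} for r in rows]


def append_to_csv(rows: list[dict], path: str = "daily_visits_by_page.csv") -> None:
    """Append the day's rows so the file becomes a growing historical extract."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["date", "page", "visits"])
        if new_file:
            writer.writeheader()
        writer.writerows(rows)


if __name__ == "__main__":
    yesterday = date.today() - timedelta(days=1)
    append_to_csv(pull_visits_by_page(yesterday))
```

Scheduled with cron or any workflow tool, a job like this replaces a manual daily pull and leaves the analyst free to interpret the numbers rather than fetch them.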
The best place to start is usually with those tasks that consume the most time. It is also a good idea to consciously revisit existing processes to see whether things are being done in the most efficient way possible. This is especially true for mature teams with legacy processes. These last five doctrines are a good start to ensuring that digital analytics teams continue to remain credible in the middle of constant change.

So, to summarize, we talked of 10 doctrines in today's session, which you can see all together on this slide. Finally, let me conclude by acknowledging that life for a digital analyst is full of flux. This dynamism is also what makes our jobs so stimulating. In the middle of this giddy digital ride, the 10 doctrines I just shared with you help me and my team trudge on in the right direction. I hope they help you do the same too. Thank you.