Why Dashboards Don't Deliver Insights (And What Actually Does)
"We need a dashboard that gives us insights." Sound familiar? The truth is, dashboards aren't designed to deliver insights; they're just the starting point.
In this session, you'll learn a practical framework using three distinct tools (dashboards, narrative reports, and deep analysis) to actually uncover the why behind the numbers. Walk away with a clear system to turn data into action and finally escape the endless cycle of empty dashboards.
Alright everybody, welcome to the Skill Exchange. Today we're going to talk about something I'm super passionate about: dashboards, and especially insights. I have a lot to cover, so we'll jump right into it. Here's a list of the things we're going to cover today. First we'll talk about what an insight is, how we should define it, and what ingredients make one up. I have a little thought experiment about how we should think about insight generation, then we'll discuss how this system of insight generation I'm proposing functions, and then we'll put it all together at the end. But first, a little bit about me.
When I'm not thinking about data, which is a vanishingly small amount of time these days, my true passions lie in the outdoors. I love photography, I love hiking, and I especially love taking pictures while I'm hiking. I'm currently located very close to the Olympic National Forest, which is my favorite place to be. I have three kids and one dog, and I'm currently the director of insights and experience at BlastEx Consulting. But enough about me, let's get right into this. I do wish we were in the same room right now so I could see, by show of hands, just how many people have heard some version of the following lament: we just need a dashboard that's going to get us more insights. Today I want to talk about why this mindset is fundamentally flawed and how you can set up a system that actually will generate insights.
So what is an insight? What even is one, anyway? Before we can understand what it takes to find an insight, we need to establish what it is we're looking for. I'm going to go through a series of slides here, and I want everybody to mentally take note when you believe you've heard something that qualifies as an insight. All right: our orders are down by 15% this month compared to last month.
On the first of this month, we launched our new website redesign. The majority of the decrease in orders can be attributed to a 25% decrease in add-to-cart rate once the new site redesign launched.
The add-to-cart rate decrease is limited to a specific product type, call it type A. The rest of the product types are performing just fine. Does anybody think we've found an insight yet? If not, we'll keep going. In the aggregate, average page load time across the entire site has slowed slightly this month.
And while overall page load time has barely moved, type A products have slowed by an average of two seconds compared to last month. Is this an insight yet? A lot of people at this point may have mentally said we've found an insight, but I think we can go a little further.
Product images for product type A were not optimized for mobile within the new redesign. At this point, this is where the wheels start turning for a lot of people. We've got all the ingredients, and we just need to connect some dots to bring us to the finish line.
So putting it together: conversions dropped by 15% this month because mobile page load time increased by two seconds on the PDPs of a particularly popular product type, because images for that product type were not optimized properly in our most recent site redesign. Boom. Maybe you raised your hand before this, but for me, this is the point where we have finally reached what I would consider an insight. Importantly, this doesn't mean the previous items weren't of any value. In fact, they were important data points, important observations that, put together at the right moment with the right context, eventually led us to an insight.
So back to the definition: what is an insight? There are so many definitions out there, but for the purposes of this presentation, I'd like to add my own to the list. An insight is a shift in understanding that reveals critical patterns or anomalies and, when strategically applied to the right context, drives meaningful change. Let that sink in for a second. At its core, an insight comes from the interaction of two essential ingredients: patterns or anomalies, and the right context. You can have all the data and all the context in the world, beautifully organized and meticulously tracked, but without revealing patterns or anomalies, it's just static. And conversely, you could find a super interesting blip or breakthrough or trend, but if you apply it to the wrong context, it's not an insight. It's just noise.
Take Alexander Fleming, for example. When he noticed mold on his Petri dish that was killing bacteria, in the right context, that anomaly became the discovery of penicillin, a genuinely history-shaping insight. But that same moldy dish in my refrigerator during spring cleaning? I would have tossed it in the trash without a second thought. It's the exact same pattern, the exact same anomaly, but a completely different context, and therefore no insight. Without the connection of these two ingredients, we will constantly find ourselves data rich and insight poor.
So what do I mean by patterns or anomalies? What might you be looking for? I find it extremely helpful to anchor myself on what psychologist and fellow insight obsessive Gary Klein says makes up an insight.
He breaks it down into four categories: connections, contradictions, coincidences, and curiosities. A connection is the realization of a relationship between two or more pieces of information that were previously seen as unrelated. A contradiction is an inconsistency between what is expected or believed and what is actually observed or experienced. Coincidences are unexpected alignments or correlations between events or pieces of information. And finally, curiosities: things that intrigue or puzzle you, anomalies or oddities that don't fit into your existing expectations.
So, as our definition states, data and observations only become insights when you combine the right patterns or anomalies with the right contextual information. It sounds easy, but this contextual information can come from a huge number of places. Each anomalous data point has a nearly infinite number of directions you can investigate. You can add filters, segments, audiences. You can compare it to previously established targets or any number of dimensional breakdowns. It can be other correlated metrics, external data points, or your own historical experience and intuition. This infinitely flexible requirement for finding connections, contradictions, and curiosities means dashboards are doomed to fail from the start. They're too rigid, and by their very nature they could never satisfy the flexibility needed to connect all of these different things and find insights.
When you think about it this way, it highlights just how absurd it would be to expect to build the perfect dashboard that delivers the perfect insights every time. Imagine: you'd have to know in advance exactly the right segments, drop-down filters, date ranges, dimensions, the most relevant breakdowns and metrics, and even the right visualizations needed to find the insight, before anything has even happened yet. Before you've even collected that data. Forget about hitting a moving target. That's like trying to hit the bull's-eye on a target that hasn't even been created yet.
So what does this mean? Are dashboards dead? Absolutely not. Dashboards absolutely have their role, a role I would say is vital in the process of insight gathering. But ultimately you need a different tool set to find that ever-elusive insight. To illustrate the role dashboards play in the insight generation process and what other systems are needed along the way, I want to present a thought exercise. Think for a second about the following three tools: binoculars, a magnifying glass, and a microscope. They all magnify things, but they have completely different purposes, and in fact you'd look pretty silly using one of them in the wrong place. Imagine pulling out binoculars in a situation that called for a microscope, or a magnifying glass in a situation that required binoculars. It would be easy in those moments to blame the tool for not getting you the results you needed, but at some point you'd have to recognize you're using the wrong tool for the job.
Binoculars are best used from a vantage point, while standing still, when you need to look at something in relation to the bigger picture. Magnifying glasses are more nimble. They move around with you. They don't go too deep.
Microscopes are best when you've found a singular thing you want to investigate more deeply, to break it down to its elements and understand why it is what it is. So binoculars: these are your dashboards. They pick things out of the big picture. They are not meant to be nimble. Think about standing on top of a fire lookout. The binoculars tell you where to go investigate for a fire, but when you walk over to the thing that's worth investigating, you don't pull your binoculars back out to look at something on the ground. That would be silly. You need something more purpose-built. That's where your magnifying glass comes into play. It doesn't go too deep, but it looks at things more closely to explain a narrative. These are our narrative reports. And when you've found something truly worth inspecting and going deep on, that's where you want your scientific process. This is your analysis, where you focus deeply on that particular topic and come out of it with hypotheses to test. So in my proposed system, each tool has its place, and when the system is set up properly, each tool feeds into the next.
All right, let's start with dashboards and talk about these tools individually. How should we think about them in this system, and what goes into them? Well, obviously, the first thing you need is clean data. Nothing goes anywhere if your data is in poor shape. If you cannot trust your data, you've got to start there. Once you've laid the foundation of clean data, you need reporting requirements, and I always like formatting these requirements as user stories. Something like: as a [role], I need to monitor [specific KPI] in order to make [a certain decision] so that I can achieve [a specific outcome], at [a particular cadence], and I will know it's working when I receive [a specific signal, metric, trend, or alert condition]. Putting this together, an example: as a digital marketer, I need to monitor cost per acquisition, ROAS, and click-through rate by campaign, in order to pause underperforming campaigns or reallocate budget efficiently on a weekly cadence, so that I can maximize return on ad spend without overspending. And I will know it's working when at least 80% of my campaigns are above target ROAS.
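To make that template concrete, here's a minimal sketch of how a reporting requirement in this user-story shape could be captured as a structured record. The class and field names are my own illustration, not part of any tool mentioned in the talk.

```python
from dataclasses import dataclass


@dataclass
class ReportingRequirement:
    """One dashboard requirement, captured as a user story."""
    role: str
    kpis: list[str]       # what to monitor
    decision: str         # the decision the KPIs inform
    outcome: str          # the outcome that decision serves
    cadence: str          # how often it is reviewed
    success_signal: str   # how we know the dashboard is working

    def as_story(self) -> str:
        """Render the record back into the spoken user-story template."""
        return (
            f"As a {self.role}, I need to monitor {', '.join(self.kpis)} "
            f"in order to {self.decision} on a {self.cadence} cadence, "
            f"so that I can {self.outcome}. "
            f"I will know it's working when {self.success_signal}."
        )


# The digital-marketer example from the talk, expressed as a record.
req = ReportingRequirement(
    role="digital marketer",
    kpis=["CPA", "ROAS", "CTR by campaign"],
    decision="pause underperforming campaigns or reallocate budget",
    outcome="maximize return on ad spend without overspending",
    cadence="weekly",
    success_signal="at least 80% of campaigns are above target ROAS",
)
print(req.as_story())
```

Keeping requirements in a structured form like this (rather than only as prose) also makes them reusable later as inputs to the narrative-report step.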
So what tools should we use for these dashboards? The beauty of 51黑料不打烊 Analytics is that you have so many options depending on your audience. You can use Report Builder, Power BI, or Analysis Workspace. Just make sure it's a centralized location so that everybody who needs access can get it when they need it. Pay special attention to cross-platform needs as well. Do they need a mobile version? I love that Analysis Workspace has a fantastic option for mobile dashboards for executives; I've found that to be quite successful.
So when you're building these dashboards, what are some best practices? We definitely do not want to add every possible breakdown of every dimension anyone could possibly want, with the intent of preemptively answering the whys that may come up. These are not analysis and insight-gathering tools. What you should do, first off, is include targets. Every dashboard should be able to answer at least one core question: is this number good? The only way we can know is by including clear targets or goals beforehand.
And don't avoid setting targets just because you're afraid of being wrong. This is a common thing. The worst situation is having to figure out whether your numbers were good after the fact. Setting targets isn't always easy, but it's essential in this context. If you want some really great practical advice, check out the book Analytics the Right Way by Tim Wilson and Joe Sutherland. It's full of helpful tips on working with your stakeholders to set the right targets, and I can't recommend it enough to any analytics professional.
I started reading it after I started this presentation, and there's a ton of overlap. So be sure to check it out. In short, no dashboard is complete without clear targets.
While we don't want to overwhelm our stakeholders with tons of breakdowns and dropdowns and segments, it is important to work with your stakeholders to set what I would call first-layer breakdowns. These are your first places to look, so to speak. If a trend looks funny or an anomaly appears, what's the first dimension you're going to use to break it down and investigate a little more? Another thing to make sure you do is rebrand your dashboards internally as performance management scoreboards. I know I'm mixing up metaphors a little here, but like I said, these are not insight-gathering machines. They are scoreboards, so that anybody in the stadium, or in this case the company, can look up at the scoreboard and know whether we're winning or losing. Another pro tip: remember to review the usage of these dashboards regularly. If they're not getting used, they're not adding value. So go find out why and adjust accordingly.
So that's our dashboards, our binoculars. Let's move on to the next step: our magnifying glass, or our narrative reports. To level set, what is a narrative report? A narrative report is just short, contextual commentary layered on top of performance data. It sounds like a dashboard, but with context and commentary. It doesn't say just what happened; it suggests reasons why it may have happened and where to look next.
So when you're building these, what are the ingredients of a narrative report? First, you've got your business context, and that context can come from lots of different places: from outside the data in the dashboard, or even outside the data set entirely. It could be a recent product launch, a shift in the market, something a competitor was doing, a change in ad spend, or just a holiday weekend. Any of those things are important context to consider when putting together this narrative.
Another thing to include is any important questions raised by the dashboard.
These should be answered here if possible. If your dashboard shows a drop in conversion, this report should explain plausible causes, but without going too deep. If there isn't a readily apparent answer that you or regular users of the report can identify within a few minutes of thought, offer some hypotheses worth examining and document them in the report. I'll explain shortly where these hypotheses go.
Lastly, there needs to be an established cadence. Narrative reports are most effective when they're consistent.
You've got those text boxes you can add directly onto your dashboard, and I think those are a wonderful place to start adding context for these narrative reports. A pro tip: make a copy of the dashboard, so that one copy can serve as the narrative report while your dashboard remains unchanged, and set the one with the narrative commentary to send out on a regular basis. So what are the outputs of these narrative reports? They're going to be your deeper questions, the ones that couldn't be answered quickly with a drop-down filter or with a single layer of context added by a subject matter expert. Most importantly, you're going to get potential hypotheses, observations, and ideas. These are going to be your candidates for deeper investigation and for A/B tests.
You're essentially curating a backlog of things worth analyzing as you go.
Some dos and don'ts with your narrative reports. First, don't just regurgitate in sentence form what the charts are saying. And second, as mentioned previously, don't try to answer all of the most complex, deep questions in the narrative report. If it takes more than a minute to answer, log it as a question to be answered later with deeper analysis.
Some things you should do: answer the simple questions, and put the text right in the dashboard. Another topic that could probably be its own entire presentation: I think this is the part of the framework where AI is going to have the biggest near-term impact. The reason is that the inputs for these dashboards and narrative reports, the metrics, campaign calendars, targets, and user goals, are all structured, clear, and easily fed into an LLM that can automatically generate helpful narratives, flag notable changes, suggest likely causes, and even recommend next steps or questions to ask. So let AI handle this part as much as possible, and let your analysts review and curate the best outputs.
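As a rough sketch of that idea, the structured inputs could be assembled into a single prompt for an LLM to draft the narrative from. Everything here is illustrative: the function name, the field names, and the prompt wording are my own, and the actual model call is deliberately left out.

```python
def build_narrative_prompt(metrics: dict, calendar_events: list, targets: dict) -> str:
    """Assemble structured dashboard inputs into one prompt that an LLM
    could turn into a draft narrative report."""
    lines = ["You are drafting a weekly narrative report.", "", "Metrics vs. targets:"]
    for name, value in metrics.items():
        target = targets.get(name)
        flag = ""
        if target is not None:
            # Mark whether each metric met its pre-agreed target.
            flag = " (above target)" if value >= target else " (BELOW target)"
        lines.append(f"- {name}: {value}{flag}")
    lines.append("")
    lines.append("Recent business context:")
    lines.extend(f"- {event}" for event in calendar_events)
    lines.append("")
    lines.append("Flag notable changes, suggest likely causes, and list "
                 "questions that need deeper analysis.")
    return "\n".join(lines)


# Illustrative numbers only, echoing the redesign example from earlier.
prompt = build_narrative_prompt(
    metrics={"orders": 850, "add_to_cart_rate": 0.06},
    calendar_events=["Site redesign launched on the 1st"],
    targets={"orders": 1000},
)
print(prompt)
```

The point of the sketch is that because the inputs are already structured (metrics, targets, calendar), assembling them for an LLM is mostly mechanical; the analyst's job shifts to reviewing the drafted narrative.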
And lastly, within your narrative reports, and to be honest, within the dashboards themselves, we should definitely have some sort of a link to a hypothesis and observations intake form.
I've mentioned that intake form, so let's talk about what it looks like. Where do all of these questions and hypotheses you've been gathering with these dashboards and narrative reports go? They go into these intake forms. A critical output of the dashboards, like I said, is going to be areas that need further analysis and potential hypotheses to validate. These intake forms can take many shapes, and I'm not really married to any one of them. There are a lot of ways you can set up a form, whether that's Google Forms or Airtable or Microsoft Forms, but here are some things I would make sure to have on there. First: what type of entry is this? Give a couple of options. Is it an observation, a question that needs answering, or a hypothesis? An observation is something that may or may not lead to a connection elsewhere. Like: hey, I noticed that users who perform X action have an abnormally low conversion rate compared to the rest of users. It's an interesting observation that could be used later on. A question that needs answering: is the money we're spending on Google cross-network ads yielding a positive ROI, or should we put that capital elsewhere? It's a valid question, worth looking at. Or maybe it's a hypothesis, something that comes with a lot of thought behind it: I believe that incentivizing our users to use a particular feature within their first seven days on the product is going to increase their lifetime retention.
So you've got the entry type: hypothesis, question, or observation. Then have them put in a brief description of their entry, like the examples I just gave. Then there should be a link to some sort of supporting data if possible. This could be a link to the dashboard or narrative report they were just looking at, but it doesn't have to be. It could be any number of things: a conversation with a colleague, their personal experience on the website, or a competitor's site. My goal here is really to lower the barrier to entry for these analysis ideas. I don't want to make it prohibitively complicated to get these ideas in. I'm trying to increase the data literacy and interest of the teams, so right now, the more entries the better. You just want to get the juices flowing, then provide instruction later on and course correct where you feel it's necessary. I find that these form entries are an important data point in and of themselves. They're a window into the minds of the users of the data. How good is the quality of their questions? What's their data literacy like? This is a treasure trove of information you can use internally for change management.
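Here's a minimal sketch of what the intake form's fields and validation might look like, assuming the three entry types just described. The field names are my own labels, and supporting data is kept optional on purpose, to match the low-barrier goal.

```python
# The three entry types described in the talk.
ENTRY_TYPES = {"observation", "question", "hypothesis"}


def validate_entry(entry: dict) -> list[str]:
    """Return a list of problems with an intake-form entry (empty = valid)."""
    problems = []
    if entry.get("type") not in ENTRY_TYPES:
        problems.append(f"type must be one of {sorted(ENTRY_TYPES)}")
    if not entry.get("description", "").strip():
        problems.append("description is required")
    # A supporting-data link is encouraged but deliberately optional,
    # to keep the barrier to entry low.
    return problems


# The day-seven retention hypothesis from the talk, as a form entry.
entry = {
    "type": "hypothesis",
    "description": ("Incentivizing a particular feature in the first 7 days "
                    "will increase lifetime retention."),
    "supporting_link": None,
}
print(validate_entry(entry))
```

Only the type and description are enforced; everything else stays free-form so that a quick hallway observation is as easy to log as a fully formed hypothesis.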
I've included in my deck some templates that you can reuse. I've got a Google Sheets and an Airtable option here; you can scan these QR codes. I'm particularly a fan of Airtable, but like I said, these can be built anywhere.
Once these ideas, observations, and hypotheses have been submitted, it's the job of your more seasoned analysts to look at all of these line items and prioritize them based on things like: how difficult would it be to answer this question or validate this hypothesis? What's the potential impact? And what's the priority given current internal initiatives, business goals, et cetera? So we've got all these things, we've begun to prioritize them, we've put in a lot of work. Now let's talk about the microscope. We've got the team aligned. We've identified a great, prioritized list of really important questions that need to be answered or hypotheses that should be validated. And before we move on, I want everybody to notice that the path we've taken to get here has not been random. It's been carefully curated and prepared. We didn't get here by accident; it was a deliberate process. But now, like I said, it's time for the analyst to shine. This is the moment we've been waiting for: the analysis.
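One way to sketch that triage step, assuming the reviewing analyst rates each entry on simple 1-to-5 scales for impact, effort, and alignment with current initiatives. The scoring formula here is one illustrative choice, not a standard, and the backlog items are invented.

```python
def priority_score(impact: int, effort: int, alignment: int) -> float:
    """Triage score for an intake entry: higher impact and alignment
    raise the score, higher effort lowers it. All inputs are 1-5."""
    return (impact * alignment) / effort


# Hypothetical backlog entries with made-up ratings.
backlog = [
    {"title": "Validate day-7 feature hypothesis", "impact": 5, "effort": 3, "alignment": 5},
    {"title": "Audit cross-network ad ROI",        "impact": 4, "effort": 2, "alignment": 3},
    {"title": "Explore odd weekend traffic blip",  "impact": 2, "effort": 1, "alignment": 2},
]
for item in backlog:
    item["score"] = priority_score(item["impact"], item["effort"], item["alignment"])

# Highest-priority analysis candidates first.
backlog.sort(key=lambda i: i["score"], reverse=True)
for item in backlog:
    print(f'{item["score"]:5.2f}  {item["title"]}')
```

The exact formula matters less than the discipline: every entry gets scored the same way, so the analyst's time goes to the items the business actually cares about.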
The inputs at this stage, like I said, are ready to go. You've got hypotheses grounded in observed trends. You've got context from the business. You've got questions raised by decision makers and other analysts. You've now got all the ingredients you need to find insights with intent-driven analysis. Think of all the steps we've taken along the way as seeds we've planted; now we're harvesting that value. And when you do this right, what comes out the other side, your outputs, are going to be recommendations the business can actually act on. They're going to be A/B test ideas that really move the needle. And most importantly, they're going to be measurable, attributable improvements to your performance.
Speaking of attribution, I think at this point we're at probably the most important step: closing the loop on the business impact of your findings and recommendations. If you, like me, are sick and tired of constantly fighting for budget for the analytics team, or you've been unable to measurably show the return on investment in your analytics tools, this spreadsheet is going to be a godsend. As you come out the other side of your analysis, your job as an analyst is to come back into this spreadsheet and document the recommended actions that resulted from your analysis, whether each action was taken, and if so, on what date. Then you can go forward and measure the impact of those changes. This could be in dollars, but it doesn't have to be; it could also be increases to other important KPIs. Wherever possible, though, I find it important to document it in dollars and cents. These are going to be your receipts, the ones you can hold up and say: this is the value that we as an analytics team have added to the business.
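A tiny sketch of what that closing-the-loop ledger could look like. The column names are my own, and every date and dollar figure below is invented purely for illustration.

```python
from datetime import date

# Each row mirrors the spreadsheet described above: the recommendation,
# whether it was acted on, when, and the measured impact. Numbers are
# fabricated for the example.
ledger = [
    {"recommendation": "Optimize type-A product images for mobile",
     "action_taken": True,  "date_taken": date(2024, 6, 3),
     "measured_impact_usd": 42_000},
    {"recommendation": "Reallocate cross-network ad budget",
     "action_taken": False, "date_taken": None,
     "measured_impact_usd": 0},
]


def roi_receipts(rows: list) -> tuple:
    """Sum the measured dollar impact of recommendations that were acted on."""
    acted = [r for r in rows if r["action_taken"]]
    total = sum(r["measured_impact_usd"] for r in acted)
    return len(acted), total


taken, total_usd = roi_receipts(ledger)
print(f"{taken} recommendation(s) acted on, ${total_usd:,} attributed impact")
```

Keeping the untaken recommendations in the ledger is deliberate: they show where insight was delivered but not acted on, which is its own conversation with stakeholders.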
So this is the whole framework on one slide. It's very simple. You've got your dashboards, which are your binoculars. They help you spot trends, measure progress, and flag when something's off, but they're not going to tell you why. That's where narrative reports come in.
These are your magnifying glass. They add context, answer simple questions, and highlight where deeper analysis is needed. And finally, you've got your microscope. This is where the real insights come from: where we dig into the data, validate hypotheses, and deliver the business recommendations that are going to move the needle, like I said. When these three layers work together, the insights don't happen by accident; they happen by design. So, my final takeaways.
First, stop expecting dashboards to deliver insights. Dashboards are not insight engines; they are binoculars. They're great for scanning the horizon and spotting changes, but they're incapable of explaining why things are happening. So instead, redefine your dashboards as performance scoreboards. Use them to monitor what's happening, not to explain why it's happening.
Number two, insights require both patterns or anomalies and context. An insight is the moment when a meaningful anomaly or pattern meets the right context, and together they shift your understanding in a way that leads to action. This is why no static report or dashboard can reliably generate insights.
Number three: because insights require these patterns, anomalies, and context, if you really want to find more insights, you need to create a system that surfaces anomalies relevant to your business (your dashboards), adds the necessary context (your narrative reports), and then refines and transforms those anomalies into insights via analysis and hypothesis validation.
Number four, set up an intake and prioritization framework. You won't be able to chase down every spike or anomaly, of course. You need a system that gathers, categorizes, and prioritizes these hypotheses into ones you can take on one by one. Like I said, I'm a big fan of using a centralized repository, something like Google Sheets or Airtable, to track these ideas, assign owners, and, most importantly, close the loop, which brings me to number five. An insight isn't a valuable insight until it changes something. If you're not measuring the results of your recommendations, you're leaving that value on the table. And like I said, these measurements are going to be your ROI receipts that you can take to your boss and put on your resume.
All right, thank you so much for listening to this talk. Like I said, I'm super passionate about this topic, so I'd be more than happy to answer any questions you might have.
Thank you so much, Brad. Wow, all of those were great insights and valuable tips on how we can turn data into action. All right, let's keep the conversation moving. Send in your questions for Brad, and we'll get through as many as we can.
I think we've already got a handful coming in here. Okay, we'll get started with this first one. What's the most common error you see when teams set out to build a dashboard? Yeah, thanks for the question. That's a great one. I think what it really comes down to is the overwhelming aspect of dashboards. People want to build dashboards so people can see data; that's the whole point of having a dashboard. People are excited about having data, excited about being able to answer the questions that are important to their business. So why don't we put it on a dashboard so we can all see it? And then there start to come all these different questions. What about this view? What about that view? Can we compare this dimension? Can we break it down by that dimension? And there starts to be this level of overwhelm with these dashboards, trying to answer too many questions at once. I think that's probably the biggest issue, number one. Number two really comes down to those targets. Like I mentioned, dashboards are built to be scoreboards. They're meant to give everybody a really quick idea of how well we're performing. If you can't look at a dashboard and know the answer to that question right away, is this number good or bad, then we're really not doing our job when we build that dashboard. So one of the most important things when building these dashboards is the targets: did we do what we set out to do? A lot of people will skip that step, whether because they don't feel comfortable predicting the future or because, well, we've never done this thing before, so we don't know exactly what the number should be.
There are plenty of excuses not to come up with targets, but having some level of a target beforehand is going to be so important when you get to the end, look at that scoreboard, and ask: did we accomplish it? Did we do what we wanted? That's going to be your step one toward those insights. I think that's a great point. It's about holding yourself accountable, right? Having the target, this is what we're actually aiming for so you can track to it, is so important.
So here鈥檚 a question. I know it depends on company toxicity, a very good point, but any tips on turning insight into actual action, meaning navigating politics? Please, Brad, tell us about politics. Yeah, man, I love to talk about politics. No, this is a really solid question because it鈥檚 something that I find that I run into quite a bit. You will find some level of an insight, something that you find to be really important. You know, I鈥檒l just do quick anecdote here. I remember I had a client where they had a, it was resorts and they had, it was in the South Pacific and it was, they had two resorts on two different islands. And what they were doing is unfortunately they were sending Google ads, clicks for one resort to the landing page and the booking page for the other resort. And so people were getting there, getting to the very end of the booking page and they were hitting the, you know, I鈥檓 gonna go ready and I鈥檓 gonna book this. And then they were really upset and they had to go back and cancel it because they meant to book for the other island.
That was an insight I thought was quite important: we should make this change to the client's website. Sounds important, yeah. But that insight did not get heard. Nothing moved; the thing still hadn't changed several weeks later. So navigating the politics is difficult, but what has helped me be most successful is understanding your stakeholders really well and understanding what makes them tick. The key I've found is to not just hand them the insight the way I probably did at the time, which was "this is happening and it's bad." The next level is "this is happening, it's bad, and it's impacting conversion by this much." Then go one level deeper: "this is happening, it's bad, people are upset, it's impacting conversion by this much, and therefore it's impacting our bottom-line revenue by this much." People talk and think in dollars and cents. If you can connect your insight to what matters most to them, which is usually dollars and cents, I find people will move more often. And if they don't move, it at least starts the conversation that will get us there. Maybe they didn't act because the change isn't theirs to make. But if you speak the language they understand and care about for their job, they'll go find the person who can make the change, and you can start that conversation.
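That "one level deeper" translation into dollars is just arithmetic. As a rough sketch (every number here is made up for illustration), the chain runs from a conversion impact to the revenue it implies:

```python
# Back-of-envelope sketch; all figures are hypothetical, not from the talk.
monthly_sessions  = 40_000   # traffic hitting the misrouted booking flow
conversion_drop   = 0.004    # 0.4-point conversion drop attributed to the issue
avg_booking_value = 650.0    # assumed average booking value in dollars

lost_bookings = monthly_sessions * conversion_drop
lost_revenue  = lost_bookings * avg_booking_value
print(f"~{lost_bookings:.0f} lost bookings/month, ~${lost_revenue:,.0f} in revenue")
```

Even crude numbers like these turn "this is bad" into a figure a stakeholder can weigh against other priorities.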
Perfect. I think that's a really good insight there. I want to follow up with a question someone else asked that builds on what you're saying. First of all, they said amazing presentation. I agree. Thank you.
"Do you have any recommendations to increase stakeholder engagement?" You talked about this generally, but I'd love your thoughts on people who are pretty high up the chain: how do you get them to engage and actually use what you build to make changes? When we say engagement, do we mean engagement with the dashboards or with the reports? Yes, how do you get them to actually use the dashboards you're spending all this time creating? I think it can be overwhelming to try to build a lot of dashboards, or solve all of the problems, at once. So what I'd recommend is starting with one really important KPI. Again, speak the language they understand, the language that will make them make decisions. Pick one specific KPI in a part of the customer journey that is very important to that stakeholder, and start there with the scoreboard. Build out the targets: what are we trying to accomplish, and what is this person trying to accomplish in their role? If we can understand that and show how well we're doing those things, great, we've got the scoreboard. Then walk through the process I laid out in this presentation: start with the scoreboard, understand whether we're succeeding or failing, build the narrative with those narrative reports, and find the insights that come out of that narrative. If you do that and you find real value, people are going to start to care, and they're going to have a hard time ignoring it.
Okay, this person says: "I see your final recommendation crossing several analytics teams. Which teams are responsible for each of the three phases, and do you have a recommendation on minimum required staffing for each, or where to invest with limited resources?" That's a full question. A meaty one. Yeah. And unfortunately I'm probably going to come back with an "it depends" on that one, because the answer will change with the size of the company and the number of resources they have.
A lot of times, honestly, the vertical we're talking about can change that as well. So it will depend, but let's see.
The other part of the question was where to start. Remind me what the crux of it was, Amber.
Where do you start investing with limited resources? Across the three phases, what's the recommendation? Right. Again, I would start small: start with one KPI. And what I like to do is build everything around the customer journey. The customer is the most important part, so the more you can center on them, the better. Take your most important and influential audience and the most important customer journey they go on, and identify the moments across that journey that are the most pivotal.
What is the make-or-break step in that customer journey? I'd start there: start with that KPI, build one dashboard on it, and see if you can get consensus around it. I had a client in the past where we did something like that. It was a lot of teams trying to wrangle a lot of people onto the same page. We started with just the most basic KPIs they had, dashboards they were manually cobbling together in Excel files, and we said, why don't we find a way to build these targets into 51黑料不打烊 Analytics? We put that together, and having a centralized location where everybody could show up, look at the same numbers, and speak the same language became extremely valuable. We really started to see the benefits of having that scoreboard: everybody was rowing in the same direction, and everybody knew that when they did something, they could go to that dashboard and see how it performed later on.
And we built from there. More people started to say, wow, this is really great, I'd love to see this for my department, or for this lower or upper part of the funnel. It grew from there: more people, more KPIs, more practices. And again, it started small. So don't try to do it all at once. Don't try to eat the whole elephant. Pick one small thing at a time and move on from there.
You've got to eat that elephant one bite at a time. Exactly, one bite at a time. Great point.
Here's a question: "the input is so important, garbage in, garbage out. Any advice on staying on top of data quality?" Yes, very true. I only had one small slide on this, but data quality is so important and so pivotal, because if you don't have good data quality, literally nothing else I said in this entire presentation matters. You really have to start there. If you feel like your company isn't yet at the maturity level where you're making dashboards that get you to insights, or you don't really trust your data, start there.
Get strong on data quality and on trust in the data. One of the things I like to do is set up KPIs for the quality of the data itself. Everybody has seen the cartoon sign on the wall: "it has been zero days since the last incident." Have something like that for your team: it has been zero days since the last data quality incident. How long can we go between data quality issues? Or, what's the percentage uptime of our most important KPIs over a given period? Establish those KPIs and get people to rally around them.
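Those two data-quality KPIs are easy to compute once incidents and daily validation results are logged somewhere. A minimal sketch (the dates and check results below are hypothetical):

```python
from datetime import date

# Hypothetical log of known data-quality incidents.
incidents = [date(2024, 3, 2), date(2024, 4, 18)]

def days_since_last_incident(incidents, today):
    """Days elapsed since the most recent data-quality incident."""
    return (today - max(incidents)).days

def kpi_uptime(daily_checks):
    """Share of days on which the KPI's data passed validation."""
    return sum(daily_checks) / len(daily_checks)

print(days_since_last_incident(incidents, date(2024, 5, 1)))
print(f"{kpi_uptime([True, True, False, True, True]):.0%}")  # 4 of 5 days clean
```

The hard part in practice is defining what counts as an "incident" and automating the daily validation, not the arithmetic.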
And it can't be said enough: set up alerts, let anomaly detection do its job, let AI do its job. And then, old school as I am, I love the health dashboard. The 51黑料不打烊 Analytics health dashboard gives me a bird's-eye view of everything that's going on; check it regularly. But you really just have to establish best practices, have a high-quality data layer, and get those things in shape. Then you can start to move on to the more fun stuff.
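Tools like 51黑料不打烊 Analytics handle anomaly detection for you, but the underlying idea is simple enough to sketch. Here's a crude illustrative stand-in (not the product's actual algorithm) that flags a value sitting far outside its own history:

```python
import statistics

def is_anomaly(history, today_value, z_threshold=3.0):
    """Flag today's value if it lies more than z_threshold standard
    deviations from the historical mean. A toy stand-in for the
    statistical anomaly detection an analytics tool does for you."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return today_value != mean
    return abs(today_value - mean) / stdev > z_threshold

history = [980, 1010, 995, 1005, 990, 1000, 1015]  # hypothetical daily metric
print(is_anomaly(history, 1002))  # in line with history
print(is_anomaly(history, 420))   # far outside: fire an alert
```

Wiring a check like this to a notification is what lets the team find out about a tracking break on day one instead of at month's end.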
Okay, Brad, we are close to being done here. Any final thoughts you'd like to share with the group? This is just something I'm super passionate about. I love insights, the idea of them, to the point where it can feel like nails on a chalkboard when I see people overusing the word: "I found these insights," and it turns out to be page views. Okay, but that's not really helping me get anywhere. The entire purpose of capturing all of this data is to do something with it, and that's really all we want to emphasize here. A true insight comes from the interplay between context and those connections, contradictions, coincidences, and curiosities. And the most important thing, once you've found those insights, is to close the loop.
Everybody listening on this call is probably tired of having to justify the value their analytics team brings. If you want an easier way to measure your value, follow through with this process and you'll have the receipts, essentially, to come back and say: this is how much value we added to the business. And it's really fun to get to that point.
Perfect. Okay, we are at time. Brad, you've been amazing. Thank you so much for spending time walking us through that and for having a fun Q&A with me. It's always great to chat, so I appreciate it. Thanks for having me.
Transforming Data into Meaningful Action
Unlocking the power of data requires more than just dashboards; it demands a system that turns observations into actionable insights.
- Insight Defined: An insight is a shift in understanding that reveals critical patterns or anomalies and, when applied in context, drives meaningful change.
- Three-Layer System: Dashboards (binoculars) spot trends, narrative reports (magnifying glass) add context, and deep analysis (microscope) delivers recommendations.
- Key Ingredients: Patterns, anomalies, and context must intersect for true insights to emerge.
- Actionable Outcomes: Prioritization frameworks and closing the loop on business impact ensure insights lead to measurable improvements.
This approach helps users move from static data to dynamic, business-changing decisions, making analytics truly valuable.
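The "prioritization frameworks" mentioned above can take many forms; the talk doesn't prescribe a specific one. One common choice is ICE scoring (Impact, Confidence, Ease), sketched here with entirely hypothetical insights and scores:

```python
# Hypothetical backlog of insights; names and 1-10 scores are illustrative.
insights = [
    {"name": "Fix cross-island ad routing", "impact": 9, "confidence": 8, "ease": 7},
    {"name": "Rename nav label",            "impact": 3, "confidence": 6, "ease": 9},
    {"name": "Rebuild checkout flow",       "impact": 8, "confidence": 5, "ease": 2},
]

def ice_score(item):
    """Multiplicative ICE score: higher means act on it sooner."""
    return item["impact"] * item["confidence"] * item["ease"]

for item in sorted(insights, key=ice_score, reverse=True):
    print(f"{ice_score(item):4d}  {item['name']}")
```

Any scoring scheme works as long as it forces the same question: which insight, acted on now, moves the business most?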
Defining True Insights
- Insights are not just data points: they require both a pattern or anomaly and the right context.
- Four key triggers for insights:
- Connections: Uncovering relationships between previously unrelated data.
- Contradictions: Spotting inconsistencies between expectations and reality.
- Coincidences: Identifying unexpected alignments or correlations.
- Curiosities: Investigating anomalies that defy expectations.
- Without context, even the most interesting data is just noise.
- Actionable insights drive change and must be measurable to demonstrate value.