ProductiviTree: Cultivating Efficiency, Harvesting Joy

Navigating Data Overload: Insights That Actually Drive Results with Nick Graham

Santiago Tacoronte Season 2 Episode 37

Most companies have more dashboards than they know what to do with, so why are they still flying blind? In this episode, Nick Graham, former global insights leader at PepsiCo and Mondelēz, reveals how to stop drowning in data and start making decisions that actually move the needle. This is a masterclass on connecting insights to real productivity and business impact.

Takeaways

  • Many companies drown in data without actionable insights.
  • A structured framework for data analysis is crucial.
  • Data should inform both strategy creation and measurement.
  • Insights require interpretation to become actionable.
  • Executives often believe data alone equals insight.
  • The role of analysts extends beyond just providing data.
  • AI will transform the insights landscape, but quality data is essential.
  • Focus on decision-making rather than just reporting data.
  • Corporate culture often prioritizes risk management over opportunity.
  • Effective communication of insights is key to driving decisions.

Thanks for listening to ProductiviTree! If you enjoyed this episode, please subscribe and share.

🟢 Spotify

🟣 Apple Podcasts

🟡 Amazon Music

🔴 YouTube

Connect with me:

Have questions or suggestions? Email us at info@santiagotacoronte.com

Nick Graham, welcome to ProductiviTree. Thank you for having me, great to see you again, Santi. Nick, you led insights at two massive companies. Why do so many businesses with tons of data still make slow, gut-based decisions? Yeah, it's a good question. And from my experience, it's not just true of the companies I've worked with, it's true of lots of companies. I think it's because, fundamentally, many data-rich companies are just drowning in data, right? But they lack the mechanisms to really turn that into insightful decision-making. In short, the noise-to-signal ratio is still too high. They're collecting a lot of data, they're inspecting huge amounts of data from all the different sources, internal, syndicated and custom. But the challenge is that they often get paralyzed by it, either because there's just so much data, or because, when you've got so many different data sets, they will tell you different things, depending on where they come from and the timeframe. And so you create this endless debate: is this right? Should we use this data source or that one? What I will say is that the companies doing this right, regardless of how much data they have, put a framework and a structure in place for understanding which data sets, and which KPIs within those data sets, they use for which decision. Call it a decision-back, data-driven hierarchy framework, whatever you like: it helps them understand what to look at in order to make a given decision. As a result, instead of looking at everything, they can focus on, say, three KPIs for a particular decision, so they can more effectively drive to a clear action or decision.
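A decision-back hierarchy like the one Nick describes can be sketched as a simple lookup from each decision to a short, pre-agreed list of KPIs. This is a minimal illustration, not anything from the episode: the decision names and KPIs below are invented.

```python
# Illustrative sketch of a "decision-back" KPI map: for each decision the
# business takes regularly, agree up front the handful of KPIs that inform it.
# All decision names and KPI names here are hypothetical examples.
DECISION_KPIS = {
    "adjust_promo_depth":     ["volume_lift", "promo_roi", "baseline_erosion"],
    "reallocate_media_spend": ["reach", "effective_cpm", "incremental_sales"],
    "fix_base_decline":       ["distribution", "price_elasticity", "velocity"],
}

def kpis_for(decision: str) -> list[str]:
    """Return the agreed KPI short list for a decision.

    Raises KeyError for decisions with no agreed KPI set, which is the
    point: no pre-agreed KPIs means no ad-hoc dashboard safari.
    """
    return DECISION_KPIS[decision]

print(kpis_for("fix_base_decline"))
# → ['distribution', 'price_elasticity', 'velocity']
```

The design choice is the constraint itself: a decision either maps to a small agreed set of metrics, or the team stops and agrees one before pulling any data.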
And so I think this focus on decision-back use of data, and really wiring that into your planning cycle, makes the biggest difference between those who are really getting the most out of their data and those who are just drowning in it, and therefore knee-jerking into either not making a decision at all, or making a more intuitive, gut-based decision because all of the data seems to be telling them different things. How much should a business use data and insights to measure strategy versus inform the creation of strategy? Both, fundamentally. We should obviously be using data and insights to shape the future, to identify where our future growth opportunities are. There's no magic eight ball, no crystal ball for predicting the future, but we can use data to say where the biggest opportunities for growth are now, where the biggest opportunity gaps are, and what the most probable future scenarios are that will help us drive growth. That is where you need to be more explorative in the data. When you're creating strategy, you sometimes don't quite know what you're looking for. And because those exercises are typically part of a three-year planning cycle, they can and should be lengthier. There should be more debate and discussion about the data you're looking at, the metrics, I guess, that you're looking at.
Day-to-day, ongoing performance, the weekly and monthly performance assessment and optimization cycle, is where you need structure and hierarchy. You need to know: I'm not going to focus on those KPIs, I'm going to focus on these. I've seen it time and again: businesses in the heat of battle, trying to work out what's going on and what to do next, endlessly cycling and spiraling through all these different data sets and metrics. When actually, based on the historical data you have and the relationships you know exist between metrics, what's important, what's a leading versus a lagging metric, what drives the other metrics, you usually have plenty of data to create that structure. That's where you need to be focused: a few simple metrics, what do I need to look at in order to make a decision and move forward? So there are two very different sets of decisions on very different timeframes, and in my opinion they need totally different ways of managing the data and the metrics. Nick, what's the biggest myth executives believe about insights? I think there are a few. First of all, that data equals insight, when that's not the case. You can be sitting on a ton of data, but interpreting that data, understanding what it means, that's where you get to a real insight. A former executive I used to work with, when we were talking about being consumer-centric, said to me: well, it's fine, we're consumer-centric because we have a lot of data. Well, not necessarily. You actually need to turn that data into insight, and then you need to drive that insight into decision-making.
So it's not just about the data you have, it's the funnel you push it through to turn it into a revealing insight, something about the consumer, the category, the business that you can act on. And then, most importantly, when you take action on it, you drive it into your decision-making process. That funnel is really important. I can't tell you the number of times I've had marketers, salespeople, executives turn to me and say, we just need an insight on this. Well, insights don't just magically fall from the sky. I don't have an insight tree somewhere that I can shake for the latest insight. It requires work to turn data into a powerful insight. Can you share one insight-driven decision that had a massive impact, on cost or growth, in your corporate career? Yeah, the reality is there are lots of medium-sized ones, but there were some big strategic insights that unlocked growth opportunities when I was at PepsiCo. For example, when we were working on the long-range growth plan for PepsiCo beverages in the US, we did some analysis looking at the opportunity cost of not having a play in energy drinks, which at the time was, and continues to be, a very fast-growing category. It really revealed for the business the missed opportunity and the missed market-share cost of not being in that space, which ultimately led to a number of M&A and partnership deals that the team signed. There had been a lot of debate within the organization about whether to do it, whether there was real value in doing it, and whether we could get there by other means.
But what the insights team did a brilliant job of was visualizing, and putting into very commercial terms, the big opportunity cost of not being in that space, and how it would continue to accumulate and create a bigger and bigger market-share drag over time. So there are some great meaty examples from the long-term strategic planning cycle. But in the day-to-day cycle, I can think of a couple of really good examples too, again from when I was at PepsiCo. Here's a more tactical example, if you want to call it that, but it had real financial value. We were working on an advert for Lipton Iced Tea, which is a joint venture between PepsiCo and, at the time, Unilever. The advert performed pretty well in the testing we'd done, but what we could really see was a missed opportunity to bring the brand much more front and center in the advert. The advertising agency hated the idea, but I and the team were passionate that not only would it make the advert more effective, it would fundamentally deliver a better return. Because if no one remembers the brand, what's the point of advertising it? So we pushed the team to do another version. And once we got the results back, I asked the team not just to show result A versus result B. I said: knowing from our marketing mix models the connection between what we're measuring and how it performs in market, turn this into a dollar figure. And we showed that the difference between the two adverts was several million dollars. So what we were able to show was: look, you want to do this? You can continue down this path, but you're basically saying you don't want the extra millions of dollars of revenue.
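The "turn it into a dollar figure" step Nick describes reduces to simple arithmetic once a mix model has estimated how test scores relate to in-market sales. The sketch below assumes, purely hypothetically, a linear sales-per-test-point coefficient; every number in it is invented for illustration.

```python
# Hypothetical sketch: translating an ad-test score gap into dollars,
# assuming (invented, for illustration) that the marketing mix model yields
# a linear coefficient of incremental sales per test-score point per $1M
# of media spend.
def revenue_delta(score_a: float, score_b: float,
                  sales_per_point: float, media_spend: float) -> float:
    """Estimated dollar gap between two ad versions.

    sales_per_point: incremental $ sales per score point per $1M of spend
    (an assumed mix-model output, not a real PepsiCo figure).
    """
    return (score_b - score_a) * sales_per_point * (media_spend / 1_000_000)

# Invented numbers: version B scores 12 points higher, each point is worth
# $50k of sales per $1M of spend, and planned spend is $8M.
print(revenue_delta(60, 72, 50_000, 8_000_000))
# → 4800000.0, i.e. roughly a $4.8M gap between the two adverts
```

The point of the exercise is the framing, not the precision: a score difference is debatable, a multi-million-dollar difference forces a decision.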
So we were able to find a way to make it really compelling. Not only did we help generate that money for the business, we used commercial acumen to drive the business to the right decision. And then, across both PepsiCo and Mondelēz, one of the biggest innovation patterns I've seen is innovations that were over-specced, where the product or the concept was really over-delivering on what consumers wanted and needed. There are some great examples of insights helping to de-spec an innovation where the extra wasn't necessary, which obviously saves cost and time. But some of the most powerful examples I've seen come from working with our product and R&D insights counterparts to make sure the fit between the promise and the actual product proposition is strong. So many times I've seen something almost get to launch, and then you test the product and the idea together and it just doesn't deliver on the promise you're making through claims or advertising. I've seen time and again that that's when you waste all the money spent on an innovation, when some relatively small fixes to either the concept or the product could have given you something more sustainable that drives repeat purchase in market. An example I remember from my time at PepsiCo: we were working on a sparkling water and juice hybrid innovation, and it became clear that the concept massively overpromised on what the product could deliver. The product was good, but the concept would lead you to expect a very different product, and we saw that that would lead to poor repeat purchase. The product itself was good.
We went and fixed the story we were telling, the headline, the proposition, and how it would come through at point of sale and in advertising, so that we were better calibrating people's expectations. As a result we saw maybe slightly lower trial, because we weren't over-promising, but the repeat rate was much, much stronger. We were better able to deliver on people's expectations. You've said teams focus too much on the "what" and not enough on the "so what". Is this stalling business agility? How does it hold companies back? Yeah, it relates back to what we discussed about data overload. I've seen so many companies drowning in reporting what is happening, sometimes the why, but often it's just chart after chart, dashboard after dashboard: here are all the cuts of the information, by geography, by customer, by region, overwhelming teams with so much information. Again, without a structure on top to work out what's important and where to look, you end up with paralysis, or you end up in a situation where everyone picks the data point they want. We've all seen that in businesses: to use the classic example, sales looks at one set of numbers, marketing looks at another, and everyone diagnoses a different problem because they're looking at their own numbers in their own way, as opposed to having a common structure or hierarchy for looking at the data. Otherwise it's just noise and chaos. If you don't have that, it's very difficult to get beyond the what. And the other piece is that, as a result of all this, a lot of teams focus on inspecting the information rather than going decision-back: what's the decision I need to take?
And therefore, how do I look at the information and data we have in the context of that decision? What do I need in order to be able to make it? Very tactically, I often see monthly business performance review meetings where so much time is spent on status reporting: here's what's happening. That's all very interesting, but what a waste of people's time, when actually it should be: fundamentally, here's what's happening, here's where we're off track, and now let's spend our time working out what decision we need to make, and focus on the data that will help us make it, instead of interrogating everything that's happening right now. A couple of good examples, somewhat super-tactical, but really revealing. When I first joined the global team at PepsiCo, we would get the market share reports from all the markets every month, or however often they updated. And we were looking at every country equally, every single country from the big to the small, every single market share report, and teams were poring over why Belgium's share was declining, why Pakistan's share was declining, without giving any weight to the scale of the market. So a very simple thing we did was say: fundamentally, the job of this team is not to opine on every market, because we're not the market experts. Our job is to work out why global share is going up or down. So we introduced a simple way of weighting it. And once you did that, you could say: no disrespect to Belgium, great country, great people,
Great teams, but I don't need to worry about that, because it's a relatively small part. I need to worry about the US, the UK, India, China and so on, the big markets. It simply refocused the team: if this is the decision we're trying to make, or the thing we're trying to understand, let's make sure we're focusing on the right things. Similarly, at Mondelēz, I would often see teams looking at all the business performance data and treating it all as equal, despite the fact that we know certain metrics, say distribution and pricing, from all the analysis we've done, at Mondelēz, at PepsiCo, in every company I've worked in, are hugely disproportionate drivers of your base. So if you've got a base decline, you know you need to be looking at those, because we have the data proving they are disproportionate drivers. Rather than treating everything as equal, if you've got a base problem, we should say: I don't care about anything else, let's look first and foremost at whether we have a distribution problem or a pricing-elasticity problem, before we start worrying about the performance of our displays or our advertising. Honestly, those are important, but they're a relatively small part of the mix; we need to focus on the things that drive our base. This, to your question, this lack of focus, lack of hierarchy and structure, and lack of honesty with ourselves about what decision and action we're trying to take, is part of what really kills agility and creates this endless cycle of navel-gazing about what the data is telling us. Let's stop there for a second, Nick, for the sake of productivity, which is the ethos of this podcast.
Why do companies settle into spending hours and hours nodding at charts? Everybody has the feeling they're being very productive, spending three or four hours on a business review where, in many cases, zero decisions come out of it. Why do we continue doing this? I really don't know. Well, there are a couple of things. For a lot of corporate companies, we're still measuring how much time we spend on something, as opposed to what decisions and actions we're taking. When I was at PepsiCo, we did a big push, initially as an insights team and then as a broader initiative, to restructure meetings. Every meeting had to articulate at the start: here are the decisions we need to take as a result of this meeting. And the funniest thing was, while it did make meetings much more productive, people were very uncomfortable with that restructuring. It felt very alien to them, which just shows how embedded this behavior is in a lot of corporate teams. If you're running a small business, both because of the sheer number of things you've got to do and because of that entrepreneurial mindset, you think in terms of decisions to make and things to do. Maybe because we sit in big teams, and maybe because we think we have more time to make decisions, we allow ourselves to behave this way. But I did see real discomfort. The other thing, being very honest, is that a lot of corporate culture is about risk management rather than opportunity and value realisation. People are managing their personal risk, managing the company risk, but it's all about...
And so I think there's sometimes this myth that if I just stare at the data long enough, I'll have the perfect answer and we can move forward risk-free. And that's just not reality. We all have to make probability-based decisions. If we were honest, rather than wasting time interrogating every possible piece of data until we have the 100% answer, we'd say: here's the decision we need to make, we need to make it by the end of this meeting, and what's enough information for us to make it? I think that would free us up a little more. It was interesting, I was talking the other day to another CPG client who has a very different culture. It's much more the AB InBev type of model, where everything is decision-driven: short, sharp meetings to make decisions and move forward. But the person I was talking to, coming in from a more classic CPG culture, was saying how energizing but also how terrifying that experience is, because it's so alien to how a lot of corporate CPGs operate today. Someone told me recently, Nick, that the game of the Fortune 500 is not to innovate but to make sure that nothing breaks. Yeah, I think there's an element of that. In a lot of corporate teams, while there is a desire for growth, there's such a strong risk-management instinct, making sure nothing breaks, as you say, making sure we don't spend too much money, that there are these countervailing forces all the time. And the reality is...
In a smaller business, whether it's a consumer goods business, an agency, whatever, your risk levels are a little lower: your chance of losing things is smaller, your base is smaller, and your mindset is about acquisition and growth, so your scales balance differently. In a lot of corporates, that risk-management, do-no-harm instinct weighs much more heavily in the balance. When insights get ignored, and we know they are at times, who's more to blame, the analyst or the decision maker? Both, ultimately. But the dynamics differ hugely between teams. I have seen teams where insights, quite honestly, just isn't being commercial and impactful enough. They're bringing great data and great insights to the table, but they're not being commercial enough, not pushy enough, not activation-oriented enough. Maybe what they're saying is absolutely spot on, but they can't translate it into something that compels or inspires the business to actually take action. Having said that, I've also seen teams, and I've worked at both PepsiCo and Mondelēz with some outstanding insight storytellers who are commercially oriented, who understand the business and how to show up, who are faced with a decision maker, a marketer or a salesperson or whoever, who doesn't really want to listen because it isn't what they wanted to hear. How many times have I sat in a meeting, the PepsiCo advertising example, or innovation meetings are a classic, where you come with bad news: you know what, this innovation idea, this advertising idea, isn't going to work as it stands.
Now, where insights sometimes struggles is that they come with the bad news but not with a solution: as a result, here's what I recommend we do. That's the opportunity gap: insights needs to come with problem and solution, rather than just "here's your problem". But it also goes back to that data-driven decision-making culture. If that's what the data says, and that's the decision-making structure we've put in place, then we need to respect it and work out how to solve for it, rather than ignoring it because we don't like it. I've worked on many innovations, it's a good example, where we could see in the data from the beginning that the thing was going to struggle, and yet no one wanted to hear it because we'd already committed, because people liked it. Then it gets into market, it struggles, and you're left with a potentially big waste of money out in the market. Nick, a few years ago I wrote an analytics framework with the intention of making sure that analysis and insights ended up driving actual change, good or bad. In it, I stated that the role of the analyst is not finished at the point they deliver the insights. Do you think analysts should follow decisions through, and stick with the decision-maker until we see results? Oh, I agree. It's interesting, I was working with a team a couple of weeks ago, and one of the big "ahas" for them was recognizing that they thought their job was done when they delivered a report, an analyst report, a piece of research, whatever it may be. The big "aha" was: no, the job is done when the business takes a decision.
Takes, hopefully, a decision that's influenced by your research or your analysis, but that's when the job is done. And so the skill sets are going to shift, and we'll see this more and more with AI taking on the grunt work of analysis and research and some of the heavy lifting. The skills of analysts, researchers, anyone in insights, will shift towards, first, briefing and prompting, making sure we're asking the right questions and defining the right problem in the first place, and then towards the influencing, leadership and activation skills: how do I translate this into action in the business? And as you say, how do I follow it through with the decision maker to make sure it actually lands in action? That involves making creative leaps, coming with solutions and recommendations, not just with what the data says. For some insights and analytics people, that's quite an uncomfortable space, because they're comfortable with "this is what the data has told me". But what we're going to need them to do more and more is the "so what": here's what I recommend, here's how we translate it into the business, here's how we deal with the messy compromise of decision-making. Hmm. Let's stop there for a second. Insights analysis is one of the professions, or fields, that is definitely threatened by AI. It's very simple to get a detailed, very detailed analysis from any of the generative AI or LLM models we have today, and they have advanced very quickly. You hinted at some things, but could you give us a bit more insight into the future of humans in an insights department, triggering decisions and analyzing all this data, which they might not even need to analyze themselves anymore? Right, right.
I think that's the big tidal wave of transformation coming for insights and analytics teams, whether we like it or not. In the work I'm doing now, what I'm trying to help teams understand is that it's going to happen, so best to be on the surfboard, on top of the wave, driving the way forward, if you like, at least in control and somewhat in charge of your destiny, rather than just being a recipient of it. Because it is coming. As you say, LLMs, digital twins, these things are all coming, and they will transform, and are transforming, how we work, both for agency partners and for insights teams. Think about the three stages of any research or analytics piece of work. There's the work before: what's the business problem, what are we really trying to solve for, what do we need in order to solve it, what data do we already have? There's the work itself: the research and the analysis. And there's the post-work: landing it in the business, what does it mean, how do I turn it into recommendations, how do I drive it to decision-making? Right now, realistically, 80% or more of most analysts' and researchers' time is in the middle bit, the doing bit, whether they're doing it themselves or briefing an agency. In future, because AI can and will do more of that middle bit, the analyst may not even touch or be that involved in that work. More of their time will go on the briefing: getting the right brief, the right prompt, and, more systemically, making sure the LLM is trained to do the analysis in the right way. So briefing, set-up, training, data ingestion, making sure the data is the right quality.
That will be a critical role for insights and analytics teams. And just as important, on the back end, will be oversight: making sure there are no hallucinations or biases, making sure we're interpreting the output correctly, challenging the LLM. So there will be some work at the back of the middle of the funnel to make sure we're really understanding and interpreting the data correctly. But a huge part of it should be, and this is where insights and analytics teams should be excited rather than scared, that rather than all of our time being sucked into the middle, this frees us up to do more at the front, and to spend much more time at the back end landing it into action in the business: being the consultant, more of an advisor to the organization on what it all means. To me, that's the big productivity unlock. That's where we can spend much more of our time, and make sure all this data, research and analytics actually turns into action. Otherwise it's just work: volume of work, not volume of impact, which is where we need to be focused. What mistakes do you see insights teams, but also companies in general, making today with AI? A couple of big things stand out to me. One is thinking that AI is some magical solution to poor data management and poor processes. As you and I both know, AI won't solve those; if anything, it will exacerbate those problems, or get blocked by them. What teams particularly struggle with is the classic problem of garbage in, garbage out. One of the biggest challenges is making sure you have good-quality, well-structured data.
So that's often one of the biggest challenges: making sure that infrastructure is in place in the first place. I was working on a project a couple of months ago where we set off building the LLMs, and it very quickly became clear that we didn't really have the data we needed, and we didn't have it structured the way we needed. Imagining it was going to be a huge productivity solve without doing some of that data structuring first is a pipe dream, a myth. You need to do some of that work in order to get the most out of the LLM. The other part is recognizing that you can't just dump the tools into the system and hope they'll work. You need to skill people up to use them. We talked before about what we're going to need, on the insights and analytics side in particular, but really across large parts of the team: helping people understand how to prompt in order to get what they want, and how to look for biases and hallucinations, that sense-checking and oversight. Those are new skills we need to build in our teams. And then we need the right processes in place. If you just plug AI into an existing process, back to where we started, if you don't have a process built around data-driven decision-making, here's the decision we need to make, here's the data we need in order to make it, then AI is just going to exacerbate the same problems you already have. You won't get the productivity value unless you also use this as an opportunity to ask: how do we need to work differently to be more effective and more productive, and how can AI power that, rather than just making our current inefficient systems a bit faster?
Nick, let's do some rapid-fire questions with answers in 30 seconds or less. Number one: what is the most overrated analytics trend right now?

I would say the push for complete data integration and unification, because while I understand the dream, most of those working in big corporate CPGs also know that it can be a total pipe dream and take huge amounts of time and energy to deliver. And I will say, I think there are ways around that, more creative ways, particularly with generative AI: some ways that we can solve for it without having to do full data unification.

Number two: one dashboard you would delete forever?

Honestly, 80% of them, because I think 80% of them are just repositories of information that probably no one has looked at in six months, or certainly no one has taken a decision on in maybe ever.

Tell us the one question every insights leader should ask before starting a new project.

What decision are we going to take as a result of this work? And if the person doesn't know or can't answer it, we should not be starting work on anything where we or they do not know what decision or action we're taking.

What is one meeting that should always be replaced by an email?

Again, probably 80% of them: status updates. Like, the number of times, particularly in my old job, when people were like, we're just having this meeting so I can bring you up to speed. I'm like, well, why didn't you give me an hour to read your deck? Then I would have known everything, and I don't need you to walk me through each page of your status update. Tell me, what do you want me to do as a result of this?

And number five: what's your favorite AI tool you're using today, and how does it help your work?

It's not super sexy because everyone's using it, but ChatGPT is my go-to for work, life, and everything in between. So it does drafts of content and thought pieces for me.
I mean, I've really had to learn how to prompt effectively to get what I want, right? You can't just throw anything in. But now that I've moved to Portugal, I'm also using it to try and learn Portuguese. I have a tutor as well, a human as well, but I also use ChatGPT to teach me things to learn with.

Wrapping up this conversation, Nick, what's one immediate step that our listeners can take? I mean, professionals, people working in any department, any function: how can they connect their insights better to decision-making and faster execution?

Yeah, I would say the biggest thing that anyone can do right now, regardless of what team you're in, is to look at what you're doing today, dashboards, reports, whatever it is, and ask, how is this helping us drive a decision? And it always comes back to that. What decision is this driving? How is it helping drive a decision? What are the blocks or the issues standing in the way today? And I think if you can audit everything like that, you will actually start to be much more effective, and you'll start to identify where the time sinks are, where the waste of energy is.

How can people, companies, and professionals connect with you, Nick, and how can they avail themselves of your insights and fast decision-making consulting services?

Sure, well, come and find me on LinkedIn. You can find me as Nick Graham, or Vertimus. There's probably a link through from this podcast as well. And yeah, I'd be happy to hear from anybody who's either doing amazing things that we can all learn from, or anybody who needs some help in transforming their insights teams.

Thank you so much, Nick, for this insightful conversation. There are a couple of things I'm taking away today from your own words, and quoting you: insights don't fall from the sky. And even AI is not gonna help you, or help companies, just grab ideas and opportunities out of nowhere.
The second one is that it seems we have abandoned a little bit the culture of decision-making in favor of the culture of entertaining data for the sake of God knows what. And I do hope that companies take good note of it, because in the end, what companies need to do is take decisions to grow. That's it. Nick, thanks so much. We hope that you have a fantastic summer in Portugal, and we hope to talk to you again soon.

Awesome. Thanks, Santi. Good to see you.