You can’t get from your home to the grocery store simply by owning a car. You have to actually drive the vehicle to a place that delivers value. Sounds obvious, right? But we don’t instinctively think this way when it comes to data. We focus so much on tools, processes, and architectures, but we don’t talk enough about actually using the data.
The Catalog & Cocktails team was joined by Mike Ferguson, managing director of Intelligent Business Strategies Limited, to look at why we’re not using data nearly as effectively as we think and what can be done about it. Below are a few questions excerpted and lightly edited from the podcast.
Honest no BS here. With all this data that we are generating, are we really underutilizing it?
I think the last decade’s been about development in my opinion and it’s been really focused on all these new technologies. It’s been a complete frenzy around data and analytics quite honestly and the speed at which technologies have leapfrogged each other has been crazy. One minute it’s one thing, and next minute it’s the next thing, and the next thing, and the next thing.
So I think, for that reason, there’s some top-down pressure from executives now who are financing all of this to say, “Okay, enough’s enough. We want to not just experiment here. We really want to industrialize this.” And I don’t think they feel they’ve got enough use out of the data.
When new technologies emerge, it’s not surprising people leap on it, and they want innovation. But I think now that at least the top-down pressure is, “can we industrialize this,” because we really want to get maximum value out of this data and out of these models that are being developed.
What does it mean to be using your data, and why are people getting the feeling that they’re not using it?
I think it’s not just so much the data that they’re not using. It’s the fact that they’re not getting enough analytical models deployed. It’s the fact that they don’t even know what models are out there.
Companies can’t see that. The executives can’t see that. I mean, they know that there may be some real-time models being developed to stop fraudulent transactions. They might know there’s some kind of activity going on, maybe with a graph database project or something like that. But they can’t see the whole collection of things being developed: what data, what BI reports, what dashboards, what predictive models, what prescriptive models, where they’re being used, and how they all work together toward the goal of reducing fraud. That’s what they can’t see.
And because of that, they don’t have a strong enough feel for whether all of that data is collectively delivering maximum value.
I can imagine people saying, “BS. In my organization we definitely use our data. We don’t have those problems.” What is the litmus test? It’s not a binary thing, where we either use the data or we don’t. I think it’s a whole spectrum.
What is a test to understand how much we are using our data?
I think you need some kind of maturity model to help you work that out. Are you measuring contribution to business outcomes from anything that you’re producing? There is lots of good stuff happening with data analytics, without a doubt. It’s just that I think it’s not integrated enough. We still have a lot of silos, and I think there’s an opportunity to, as I said before, industrialize it. What people want to see is: can we produce reusable data for multiple analytical projects without having to reinvent it each time? Are there ready-made data products that we can use?
And then the second thing is, if I’ve got analytics, is there a catalog of what analytics, reports, and dashboards are available, classified according to business outcomes? So I can see which models are in play and where, and which processes they’re being used in. It’s the business deployment side that I don’t think people can see well enough. But I genuinely think the ultimate measure of success is how much you’re contributing to a business outcome.
I’m so excited about all of this. And like for me I’m like, “data literacy, data culture, data usage.” We’re not talking about data usage enough in our organizations. But who’s driving it? Is it the job of the Chief Data Officer to be like, “Everybody, pay attention. Here are the metrics… I have the buy-in from the executive team.” Is it their job?
Whose job is it to make sure we drive better data usage?
That’s a great point actually. I mean, I think it’s more than just the CDO. The CDO has become a more senior executive, but what I am seeing now is collective responsibility for this broadening across executive management. Rather than just the CEO [or other C-suite executives], it’s now multiple executives taking up responsibility for this. And that’s a good sign in my opinion, because it’s not just one person at the table that’s accountable for this anymore. It’s spreading, and people realize that taking up responsibility for making this happen is a good thing.
Are data literacy and data culture the biggest deterrents, obstacles to data usage?
I think data literacy is about confidence, isn’t it? I mean, not enough people have enough confidence to take advantage of data, and we need to get more of that out there. So data literacy and data culture are this hidden thing, and then the question again is, “what’s data culture about?” Data culture is about how you persuade your organization to have mass confidence and change. That’s why I believe measuring contribution is so important. Because if you put up a business strategy and [have] strategic goals (here’s what we’re trying to achieve, and here are our outcomes), what you’re doing is giving all these project teams something to aim at.
- Data complexity is a huge issue
- Connect data products to an actual use case
- Centralized vs. decentralized → the answer is a federated model
Visit Catalog & Cocktails to listen to the full episode with Mike. And check out other episodes you might have missed.