
4 Years of our Honest No-BS Podcast, Live from Gartner D&A Summit in London

41 minutes

About this episode

We can’t believe it’s been 4 years since the start of Catalog & Cocktails! In this episode, Tim and Juan will reminisce on 4 years of the podcast, reflect on developments in the data world, and share takeaways from Gartner Data & Analytics Summit in London. We'll also close with tips for those looking to start their own podcast.

Speaker 1 [00:00:05] Honest, No-BS and it's here in London.

Speaker 2 [00:00:08] It's only Wednesday, and already we're tired.

Speaker 1 [00:00:13] Yeah. We've been at the Gartner Data and Analytics Conference in London, UK. It is the end of the conference, day three. And we actually are going to get together with a bunch of other inaudible and celebrate on Friday, because this is a very special week.

Speaker 2 [00:00:31] Yes, it is. Four years. Four years-

Speaker 1 [00:00:38] Four years.

Speaker 2 [00:00:38] ... of doing this podcast. Remember that thing that started a little bit over four years ago, a little thing called Covid? That was the whole start of it.

Speaker 1 [00:00:42] I can't believe it's been four years. It was over four years ago, right? And a pandemic was happening and we're like, " Man, what are we going to do to connect with people in data and talk with you? We're not going to conferences anymore."

Speaker 2 [00:00:55] That's how it all started.

Speaker 1 [00:00:56] Because it all started with Zoom, right? And just hanging out with some friends. And we were like, " Let's call it Catalog and Cocktails. We'll drink and talk about data."

Speaker 2 [00:01:06] So I'm just looking here, the first episode was May 13th, which was this past Monday, exactly four years ago. And we did 32 episodes, actually, of just the two of us talking. I think we had a couple of guests originally.

Speaker 1 [00:01:21] We had a couple of guests. It was DJ Patil, one time.

Speaker 2 [00:01:25] And we also had-

Speaker 1 [00:01:25] But it was mostly just us talking.

Speaker 2 [00:01:26] It was just us talking like that. Then inaudible was our first official guest, and it just started from there. With this episode right now, we've had a total of, I think, 182. Some were takeaway episodes where we didn't have guests, and there are some ranting episodes and stuff like that.

Speaker 1 [00:01:45] But roughly speaking, 182 episodes. Yeah.

Speaker 2 [00:01:48] Yeah.

Speaker 1 [00:01:48] That's crazy. That's crazy. Yeah.

Speaker 2 [00:01:51] Okay, we really didn't spend time doing the takeaway, takeaway, takeaway. So off the top of your head, what are the things that have really marked you in the last four years of conversations we've had on the podcast?

Speaker 1 [00:02:03] So I think some of the biggest things that have struck me, four years, right? The last four years. The rise, fall, rise maybe, of the data mesh. We had Zhamak Dehghani on and we talked to a few different folks about data contracts and data mesh and things like that.

Speaker 2 [00:02:20] So we actually had Zhamak on inaudible. And since then it was a huge thing, and everybody there was like, "Oh, data mesh and data fabric," just because of the Gartner stuff here. And now it's not talked about that much; it's products now, all the time.

Speaker 1 [00:02:36] Data products up and down, right? One thing I'll note, and then I'm curious what you would say, is there's been an appreciation in data management for the inaudible people. Empathy, curiosity, managing by that. That I think has always been true, but at least now it's more explicit.

Speaker 2 [00:02:57] Yeah. People ask because it's always been something that has, I think... Starting four years ago, it was something that was kind of implicit and was coming up, but then so many times, over and over again, it would just become more and more of a big deal. And I don't want to start calling out guests and stuff because I know we're going to miss out folks, but I will say that there was... I'll never forget, we asked inaudible for advice, and the answer was, "Empathy, curiosity, empathetic." That's always stuck with me, and I think that's what we need more of. The people aspect has definitely been one of those threads throughout everything we've been talking about on the podcast. So that's one of them. And then what else do you have? I got some notes I'll look into.

Speaker 1 [00:03:40] Things that come to mind: knowledge graphs, obviously, have been a big theme throughout a bunch of episodes. I always remember Bob Muglia coming on, former CEO of Snowflake, talking about coming from the relational world and the importance of the modern resurgence of the cloud data warehouse, right? And the fact that knowledge graphs are going to be the future, that governance is actually the first application on knowledge graphs. I always remember that.

Speaker 2 [00:04:10] Yeah. And I will acknowledge that this is something that we have... I'm biased, this has been my entire career, what I've wanted... Honest, No-BS, we have this flag we push because I genuinely believe it's the right thing to go do, but it's also Catalog and Cocktails for a reason, right? I think that that has consistently been there and gotten stronger and stronger. Not just the data catalog as a technology, as a platform, but all the metadata, the semantics. And now with GenAI, all this context, and now knowledge graph is definitely all over the place. We'll talk later about what's going inaudible data consistently has been another one of those things getting stronger. And then, quick parentheses, we just jumped into this stuff and, man, cheers. We got some cocktails here.

Speaker 1 [00:04:55] Cheers. Wait, so we're doing everything out of order today. It's been a long week. What are you drinking?

Speaker 2 [00:05:00] I'm just having gin and tonic. inaudible

Speaker 1 [00:05:02] That's traditional London style, yeah. I decided to go with less of a London tradition and more of a Catalog and Cocktails tradition, which is an old-fashioned. I always say, try not to repeat cocktails.

Speaker 2 [00:05:13] But that's the one we probably will.

Speaker 1 [00:05:13] But if we do, it's going to be an old-fashioned.

Speaker 2 [00:05:16] Anyways, cheers man.

Speaker 1 [00:05:16] Cheers.

Speaker 2 [00:05:17] Cheers to four years of doing this. All right, so we said the rise and death of the modern data stack. Rise and... I think it's death, whatever. Mesh, fabric, whatever, foes and friends, where are they now? Part of that has been the data product: treat data as a product, data product managers. That has been another of the big themes.

Speaker 1 [00:05:38] And those things have really stuck in terms of data mesh.

Speaker 2 [00:05:40] Talked about now, knowledge graphs, inaudible layers, I think they just-

Speaker 1 [00:05:44] Metric layers.

Speaker 2 [00:05:45] That's interesting because I think that was a little blip in time.

Speaker 1 [00:05:49] And so metric layers were inaudible time at Gartner D&A. I don't know why, if people are reminiscing or what? But it seems like people are kind of connecting the dots between when the metrics layer was coming up and now the focus on semantics.

Speaker 2 [00:06:00] Especially because when people want to go have these chat-with-your-data type of systems, these question-answering systems, they're going to be asking questions about metrics. So the question is, where are these metrics going to be stored? All right, the social side of things... We talked about the people, teams, culture, literacy. Storytelling. Storytelling is something that... We actually had a couple of episodes on storytelling, and this is something I believe we're going to start seeing more of now. Then governance, tying that together with the data stuff. Governance has always been another theme throughout the entire podcast. And I think the big one that has been... It wasn't in the first two years, I would say, but has been more of the last two years, is business value.

Speaker 1 [00:06:40] Oh, yeah. Data people, right? We have to be thinking about the impact that we're having on the business and become less obsessed about data literacy, the business people getting data literacy, and get a little more obsessed about data people having business literacy.

Speaker 2 [00:06:56] Exactly. And then we need... The other topic is the GenAI, the large language model. That has been one of the big topics right now. That's it. That's a fair summary. I mean, I'm inaudible

Speaker 1 [00:07:08] I'm trying to think what else. One thing I'll note is, I think as the modern data stack has evolved, there's been, I'll say, a loss of traditional best practices around data warehousing, like Kimball and general dimensional modeling around your data, right? Secondly is good old inaudible it is at this point, but engineering. Just connect my systems together, I got to make the right things happen.

Speaker 2 [00:07:35] Yeah. And now with the whole AI and large language models, we're seeing the need for semantics. It's something that is now strong in people's minds: we need to go invest in the semantics and the modeling, and you want to be able to provide great, beautiful data so we can give that context inaudible where it's going to come from.

Speaker 1 [00:07:54] Those of you that are listening, what have been some of your favorite episodes, some of your favorite moments? And I'm curious, where do you think this is all going as we go forward? What's the next four years of Catalog and Cocktails going to be all about?

Speaker 2 [00:08:06] So yeah, if you think... Because look, the Honest, No-BS is that we don't even know what we're going to do next.

Speaker 1 [00:08:13] We didn't know we were going to do this for four years. Who knows if we're doing this for another four years?

Speaker 2 [00:08:18] We don't know. So just reach out to us, let us-

Speaker 1 [00:08:21] Talking about that, we should talk more about... What are we now talking about?

Speaker 2 [00:08:24] What should we start doing that we're not doing? What should we stop doing that we're doing? What should we continue doing? What should we kind of change a little bit? So yeah, just let us know. But on Friday, we'll be talking more about-

Speaker 1 [00:08:34] One last thing before we go on, we've got some different shirts going on. And I know that our listener inaudible. We've got new inaudible and actually we have an even newer logo, that maybe it will show up on a shirt at some point. Unfortunately for those who are watching the video, the owl blends in a little too much inaudible. The first shirts that we made are new and soon to be old again.

Speaker 2 [00:08:58] And for folks who will reach out, who are coming over on Friday to our inaudible.

Speaker 1 [00:09:04] Yes, yes.

Speaker 2 [00:09:04] All right.

Speaker 1 [00:09:05] We got T-shirts and... For those that are listening, if you're a fan, ping us. We'll find a way to get you a shirt.

Speaker 2 [00:09:12] Yeah.

Speaker 1 [00:09:12] It won't have special-

Speaker 2 [00:09:14] That's for our next business one day.

Speaker 1 [00:09:15] Yeah, we'll start that business someday.

Speaker 2 [00:09:17] All right, well, thank you. Now we're into the highlights.

Speaker 1 [00:09:21] Gartner, this has been a fun three days. A little bit biased, there's a little Honest, No-BS here, and that world's catalog inaudible. A big theme throughout the entire conference has been around the importance of metadata, to a lot of things of course, but especially AI. It is the context for GenAI as we go forward.

Speaker 2 [00:09:44] I think what has been very clear right now is that generative AI is actually reigniting the topic of metadata; it's like this forcing function. I think we acknowledge that we live in a chaotic world and we need to start organizing it for AI, and the way to go manage that complexity is through managing that metadata. So metadata was like that second-class citizen, governance, ugh. And now it's useful with AI inaudible we need to be able to clean up the data. That's why quality is going to be so important, and to be able to have all the lineage and observability to talk about trust and explainability. Right now I think anybody who's thinking about AI, all these topics inaudible to metadata and inaudible becoming necessary, and people are going to try to figure out how to take this right now. If you're not focusing on metadata, your inaudible

Speaker 1 [00:10:41] And it was very late in the... I'll fast-forward because we're talking about metadata. Robert Thuner gave a talk on the space and where it's going. He had a few different themes about inaudible. During the modern data stack era we saw everything kind of in parts; now we see the convergence, where you've got the more unified data platforms coming together and things congealing around certain spots. He talked about GenAI, obviously, as being a really key thing. But the third thing inaudible was that metadata was thought of as exhaust, but going forward we're going to think of metadata as a first-class citizen. You're going to be thinking, it is in a standard format and I put it in my knowledge graph, feed it to my GenAI. So this is going to become a first-class concept in everything you do, with every tool. Can I get my ETL metadata? Can I get my data warehouse metadata? Can I get my BI metadata? Can I get my semantic layer metadata?

Speaker 2 [00:11:37] And I think what's really important about the metadata aspect is that it helps you understand where you should put your focus. That's how you know what people are using, how complex things are. And then, again, tying it back to business value, it's going to say, well, I got all this data, here's what I'm trying to go do from my business perspective. Making that connection is going to happen through metadata. I think in that same talk there was something about wanting to do decision intelligence. And this actually came up in the keynotes too: you need to understand how the business is working, and I want to understand how we do operational excellence, how we understand our customers better. All of this is to be able to go connect all your data. And this brings up another big topic which I inaudible a key topic. I wrote this on LinkedIn and it blew up a bit. Mark Beyer, one of the very distinguished analysts here at Gartner, gave this talk on the top 10 trends of data engineering and integration inaudible graphs is key. And what is it they actually called the inaudible

Speaker 1 [00:12:45] They had a bullet that said that we could inaudible

Speaker 2 [00:12:49] So he does acknowledge that it is a state of mind, right? So again, it's the Wittgenstein quote: once you start thinking about that, it's how you're going to start integrating your data, your metadata, your inaudible. One of those trends for data engineering is knowledge graphs; knowledge graphs are going to be part of all your generative AI algorithms. This is the big trend. So I'm really, really excited and I'm so thrilled, because this is what we've been doing inaudible.

Speaker 1 [00:13:21] This is the inflection point of knowledge graphs, which is exciting and it's becoming a central inaudible

Speaker 2 [00:13:28] That's one thing. Very clear that it means that inaudible is coming from the knowledge graphs. That was another big theme. So metadata, knowledge graphs will put that together. What else?

Speaker 1 [00:13:37] So this might be a segue. There's something I want to come back to from the opening keynote, and I know we're getting things out of order here. But I think there's an interesting segue here, because we talked about metadata, we talked about knowledge graphs, we talked about GenAI. There was a weird gap in the area of GenAI at this conference, which, if we want to be critical, right? Honest, No-BS, it felt... So you and I, we know a little bit more about the GenAI space. I know there are probably a lot of attendees at Gartner that are a little newer, still trying to wrap their heads around it. For the content, Gartner focused on the importance of governance and a strong data foundation to create AI-ready data to then service GenAI. But there's a bit of a disconnect: the applications that inaudible are really focused on unstructured textual data, either fed in through prompts or being fed as inaudible being used as inaudible general text with an AI agent. That data is treated very differently than what traditional governance and data quality and everything cover. We're talking all about this while these GenAI applications are using unstructured data, this will eventually-

Speaker 2 [00:14:51] And this is not just an issue here at this conference, inaudible many different places.

Speaker 1 [00:14:56] It's inaudible.

Speaker 2 [00:14:56] And I think there are just these assumptions, right? Generative AI. Well, first of all, people are using the words AI and generative AI interchangeably, when they're inaudible. And then we're like, "Oh yes, you use the word AI, you have AI-ready data." But look, when it's generative AI, it's usually text. So what does data quality over unstructured text mean? What does data governance over unstructured data mean? You know what? Nobody knows. Actually, the people who probably know are people who have been working on these content inaudible. It's not professionals who actually go work for it.

Speaker 1 [00:15:27] Feet in the structured data world. There are things out there, inaudible management systems, CMS. There are things like OpenText and others out there. There's a whole space around unstructured data that we should kind of think about... That is actually probably more closely related to this than the structured data stuff at the moment.

Speaker 2 [00:15:47] So this is the big disconnect, right? And actually it's super frustrating when people are giving talks here, and even in hallway conversations: "But yeah, we need to use the principles of data management to go fine-tune your LLM." It's like, you fine-tune your LLM with unstructured stuff. You're not going to use your traditional inaudible. That is a confusing thing. So that's something that I think everybody needs to wake up to: okay, what are you talking about? Unstructured data, structured data. This tool can really be very critical in this. This is inaudible

Speaker 1 [00:16:21] inaudible because it can apply to machine learning or more traditional AI threads. But GenAI specifically, there's inaudible

Speaker 2 [00:16:27] Not just from the Gartner but from inaudible

Speaker 1 [00:16:29] Yeah, bigger than just Gartner. It's the whole industry trying to wrap their heads around this. Yeah.

Speaker 2 [00:16:33] Talking about the AI-ready data, what's interesting here is that-

Speaker 1 [00:16:36] That was the theme in the conference, yeah.

Speaker 2 [00:16:38] The AI-ready theme was all over the place, but it's actually... They said AI-ready data means data for a particular use case.

Speaker 1 [00:16:45] Yes.

Speaker 2 [00:16:46] But they left it at that. It needs unpacking a little bit more. But no, no, I think people are still trying to figure out what it is. Especially because I think there's this confusion of unstructured and structured: what does AI-ready mean for your unstructured stuff? So I think that's one of the... It's data for your use case, but we need more details.

Speaker 1 [00:17:07] I was actually happy with that as a kind of recommendation. It could use a little more detail, but I do think people might be grappling with, am I AI-ready or am I not? I think it is, they're ready inaudible in the context of it.

Speaker 2 [00:17:23] inaudible you should think about, am I inaudible. Think about what things are going to change for the use cases. Again, they're going to be unstructured, which is going to be different from the structured, and so forth. Same thing: there was a talk on data mesh and data fabric type of stuff, but at the end of the day, the data mesh and all the work, the credit goes to Zhamak inaudible podcast, I won't forget. Data products really changed the data world.

Speaker 1 [00:17:51] Yeah. So just to remind our listeners here, data mesh has four main tenets. There's domain-driven architecture, or domain-driven ownership, right? There's the self-service platform. There's data products, or data as a product. And then federated computational governance. So those are the four main tenets of-

Speaker 2 [00:18:13] And out of that, I think now we're talking a lot more about governance. So I think that's one thing.

Speaker 1 [00:18:17] Governance, yeah.

Speaker 2 [00:18:18] The data products, right? And actually, very specifically, I heard the definition they put out. Three things. First, the-

Speaker 1 [00:18:25] This is the Gartner definition.

Speaker 2 [00:18:26] Yeah, consumption-ready. Second, it's kept up to date. And third, governed for appropriate usage. Look, I think that's very, very spot on.

Speaker 1 [00:18:34] That's actually a pretty clean definition.

Speaker 2 [00:18:36] Yeah. And what was interesting, they showed in one of the keynotes a word cloud from all Gartner customers on what they were using data products for. And they're business-oriented, with business value impact. The words were benchmarking, price negotiation, soft market cycle, customer experience, portfolio insights. So it's really great to see that people are being able to tie business value impact to their data products.

Speaker 1 [00:19:01] Yeah, these are business- impacting use cases.

Speaker 2 [00:19:06] inaudible the data products. I think data products can go... the devil is in the details, now with the notion inaudible in a particular use case.

Speaker 1 [00:19:12] And closely related to data products, in a few different sessions I heard mention of the inaudible role. inaudible talked about roles a lot, with both positivity and skepticism about new roles. But one thing, and Gartner obviously is pushing this as well, is the data product manager. Somebody who's really going to take product management principles, ownership, accountability, and cross-business inaudible road map and vision around these inaudible products. And it's not just about data: they can be data products, they can be AI products, they can be analytics products. But for these products, more and more product managers, whether officially they have data or not, it's a practice we're going to see really, at inaudible

Speaker 2 [00:20:00] inaudible here and I got my notes. Business value: they talked about these four principles, the way CDOs are going to be doing this. So change culture, create strategy, manage the function, and then governance. And what was really interesting: a lot of the focus today is around change culture and create strategy, but those are the things that are moving the needle. And to move the needle we really need to better manage the function, right? And then governance. What's really interesting is that on the governance side, it's how we're going to be able to empower and enable people to use the data for the stuff that's going to move the needle. So that focus came up over and over again. They also talked about fit; that is definitely going to be all over the place, especially now with more money on leaders. You can't be centralized; you can go franchise it out. So I think a very interesting point.

Speaker 1 [00:20:54] We've done a few networking events and stuff after the conference days, and I've heard quite a few people really resonate with this. And I find that honestly a little surprising, that some people get so excited about it. But I guess it's distinctive, because you've got this hub-and-spoke model for your decentralized aspects, but the spokes can't really exist without the hub. So that's one model. You inaudible franchise model, which Gartner inaudible. And I've heard multiple independent people say that the Japanese McDonald's is going to be a little bit McDonald's, right? But there are some things that are them, and so you inaudible and you incorporate, which is cool.

Speaker 2 [00:21:38] I think what we will... When the CDO goes out, you have to understand the business. I think the "oh, we only last two years or 18 months or whatever," because you don't understand the business, is probably over now, I would think. And the thing they said is: if an organization values operational efficiency, or if their value is intimacy, then you have to talk that language. If your focus is on being obsessed about customers, talk about-

Speaker 1 [00:22:07] Each company has a DNA.

Speaker 2 [00:22:09] Yeah. What else over here? I got more things.

Speaker 1 [00:22:17] inaudible intelligence. And I thought that was a little bit wordy and high-level. I think about that choice of words and the way that they brought it, like the inaudible of animals. One of those images that kept showing up was of a bee. I think we're all trying to grapple with GenAI and advances in AI, versus the knowledge that lives in our heads and the things people are doing. And collective intelligence, they gave some examples. They talked about, for example, a pharmaceutical company trying to develop new drugs: it would take 10 years, and now it comes down to a few months. They gave another example about a job posting automation company, where normally it would take 90 minutes to write a job advertisement, and they were able to get it down to five minutes. But what was interesting when they talked about all these examples, it wasn't just pure GenAI, like "Oh, the GenAI did it." It was the collaboration between humans and machines working together. Knowledge out of people's heads, the things that only humans can do and the things that only AI can do, and how those work in a symbiosis together. So despite this word feeling a little icky and a little buzz-wordy, collective intelligence, I'm excited about that as a theme: how do we use the combination of humans and robots to do the best thing?

Speaker 2 [00:23:47] An emphasis on solving problems and creating value collaboratively. And it goes back to one of the things we've always had: the people aspect. Right now it's not just automating things and making sure the machines do things inaudible stuff.

Speaker 1 [00:24:00] And working more effectively, and inaudible

Speaker 2 [00:24:05] Peter inaudible talked on his concept of the never normal. Phenomenal talk. You've got to look Peter up, Peter Hinssen. The never normal is something... We don't live in a new normal, it's a never normal. We are in a completely changing world all the time. And from something tiny comes another change; it goes back to... I bring up the example, you're in the Swiss inaudible a little bit and then the economy of the world can go kaput for a while because of that, right? Really changing. One of the examples I saw is how they were comparing how communications are changing across generations. There's this whole group of... You have email, but young folks today, they don't have emails, or they don't communicate on email. But if they have an email... What he said is there are vintage websites that need an email to access; that is why they have emails, because of this concept of the vintage website. So this stuff is now changing constantly; we don't even know what's going to happen next. We live in this world of unknown unknowns, so we need to have this fluidity. A bunch of great quotes got me reminded: "The danger in times of turbulence is acting with yesterday's logic." That's a famous quote of Drucker's.

Speaker 1 [00:25:19] That's a Peter Drucker quote, huh?

Speaker 2 [00:25:20] That's a Peter Drucker quote.

Speaker 1 [00:25:21] Interesting.

Speaker 2 [00:25:21] Oh, he had great quotes. I'm going to just name a couple here. There is more fiction, actually... You know when the moment is that most fake news comes? During budget season. Because we come up with all these numbers and we end up writing that fake news. I wonder what our folks are thinking.

Speaker 1 [00:25:39] It's a bit of a business burn.

Speaker 2 [00:25:42] Another one is that we suffer from what he called the "we're going to need a bigger boat" phenomenon, because we actually say, oh, just more compute, this and that. I think that's something that's going to be changing around.

Speaker 1 [00:25:51] We're going to need a bigger boat.

Speaker 2 [00:25:54] Anyway, that was a great theme over there to talk to.

Speaker 1 [00:25:57] Yeah, I love that.

Speaker 2 [00:25:58] And at the end, one of the things I took away: the combination of man and machine. What is hard for humans can be easy for machines, and what is easy for humans can be hard for machines. So in the work that you're doing, try to employ that; that's how we find that balance, right there.

Speaker 1 [00:26:13] Yeah.

Speaker 2 [00:26:14] What else do we have? Any closing? I got a couple more.

Speaker 1 [00:26:18] I'm thinking about AI stuff, thinking about the keynote around psychology.

Speaker 2 [00:26:24] If we go through again, how do you drive change? There's a talk that I went to about behavioral science when it comes to stewardship, for stewardship. And I think these two things combine. You want to be able to... inaudible said, establish a common ground. So instead of "I'm right and you're wrong, and here's the data supporting that I'm right," let's focus on what we agree on, not only what we disagree on. When we think about psychology, it's how we should start thinking about the changes we want to make in governance. I mean, we know governance is important right now, but it's not cool. How do we make it cool? I think that's one of the things with behavioral science: how to change human behavior around things.

Speaker 1 [00:27:06] Yeah, I saw the talk from Tally, and I think what's cool is she tied in all of these psychology research aspects, giving you an idea of how to actually effect change in a way that's going to be lasting and stay in place. One example that stuck with me: she talked about when they were trying to get people to do inaudible with very high compliance, right? When they first started, compliance was at around 30%; did they wash their hands right after, when they needed to? What they found was if they just said "you must do it," people wouldn't change. But if they rewarded them: you washed your hands? They put a little sign right at the sink, great inaudible right there. And there's an inaudible of "Yes, I'm doing a good job," and the social proof of watching the number go up as the entire team worked together. In our own businesses, it's things like your boss saying, "Good job." That's so corny, right? But then you think: actually, these are positive psychology things that help us do some of the things that maybe we don't necessarily want to do, like eating our vegetables, right?

Speaker 2 [00:28:33] Exactly the point. I think you're bringing this up again: metadata, doing all this stuff, is like eating your vegetables. It's like, "Okay, how do we make eating our vegetables fun?" And I think these are examples from psychology. Specifically: highlight opportunities for progress rather than sugar-coating it. You want to highlight what needs to be done to get to that outcome. Identify the immediate, small things that can be done. And also these nudges, right? What was interesting on the behavioral science side: nudges are there in front of you, so instead of you having to go prove it, you want to highlight the positive reactions of others. So some people inaudible.

Speaker 1 [00:29:13] The example for that was the taxes, I think. They were trying to get people to submit their taxes on time. And even though they would say, "Hey, if you don't turn your taxes in on time, it's going to be a problem," behavior wasn't changing. Then they added one line on the notice that said nine out of 10 people in Britain submit their taxes on time. And you say, "Oh, well, I don't want to be the one out of 10."

Speaker 2 [00:29:42] That changed it. Immediately, you could see the change, just by adding that one extra sentence.

Speaker 1 [00:29:46] Yep. And it added another 5 billion or whatever in revenue for the country. Yeah.

Speaker 2 [00:29:51] So a couple of other things she said: expand the sense of control, especially during times of change. So instead of telling folks what to do, give them a choice. When they make that choice, they rationalize what they chose and they're confident about it. And I think this is one of the ways we can combine man and machine. With technology it's, "Hey, we need to go do this." "Well, okay, does that mean I should go and do everything?" No, I think there's a balance. We can say, "Hey, the machine did all it can; now, here's a choice, tell us which one is best." So now they have to make a choice, and we can do that combination of what needs stewardship and what can be automated. And then another one is: consider emotional state. When are people the happiest? For example, with a vacation, the happiest moment is before the vacation, because you have that anticipation. The immediate reward is the anticipation of something fun. And she actually asked people in the crowd, what is your favorite day of the week? Monday? Well, nobody. Tuesday? No. Wednesday? No. Friday was when most people raised their hands.

Speaker 1 [00:30:56] Thursday and Friday.

Speaker 2 [00:30:57] Thursday and Friday. Sunday wasn't, even though Sunday is a day you're off and you should be enjoying yourself, because by then you just have the anticipation of the week ahead. So Friday, which is actually a full work day, the day before your days off, is when the anticipation makes you happiest.

Speaker 1 [00:31:10] Yeah. And when you actually get to Saturday, it isn't as great as you expected, and you're like, "Oh, bummer." Friday was the day you were excited: yes, it's Friday. Another funny example she gave, related to this, that she called a bonus item, was creating anticipatory events. She talked about research where they said to the people participating in the study, "Imagine your favorite celebrity. Now imagine that your favorite celebrity is going to give you a kiss." So hopefully you picked a good celebrity. If I picked Tom Hanks or something, I don't know if I want a kiss from Tom Hanks; I didn't pick well. And you would expect people to say, "Oh, I want it now. I want it five minutes from now," right? But no, the peak was actually three days out. People wanted to relish the fact that they were going to get a kiss. The anticipation of the kiss is actually more powerful than the kiss itself. So: create anticipatory events.

Speaker 2 [00:32:09] So we talked about AI, we talked a lot about AI-ready data within data products. We talked about business value, we talked about the psychology, the behavioral science. And obviously, metadata and knowledge graphs. I think inaudible where it still stands out?

Speaker 1 [00:32:29] I'm looking through our notes to see if there's anything else, and I'm not sure there is. I mean, one last thing I'll note, on the topic of AI-ready data: I do think it's interesting to consider how AI-ready might mean something different depending on the AI technique. If you're doing machine learning, if you're doing generative AI, if you're doing more traditional analysis, being AI-ready is going to mean something different in each case. So I thought that was a good insight.

Speaker 2 [00:32:57] Oh, one more that came up a lot was unexplainability and trust. And what's really interesting is that we live in this chaotic world, and we need to deal with it to have all this data ready for AI. We've come from wanting a single version of the truth, but now, even though we've tried and worked so hard at having a single version of the truth, we've gotten to the point where we don't trust it; we don't trust anything, because there's so much misinformation. So this is one of the big issues we're seeing, and I think the issue is not that we don't trust the data, it's that we don't even trust ourselves. That was an interesting point. And we always talk about garbage in, garbage out. The problem is inaudible out: whatever goes in, we all expect it to be the right thing. So this is something we all need to work on: how do we generate trust? And trust means we need to really talk to the end users and tie this back to the use cases, right? What are the expectations? What are the users expecting? We need to have these conversations about what you expect, about how we explain things, which we need a lot more of, inaudible, really interesting. To explain what's happening, we need to be very transparent about things. And again, with all the inaudible metadata and those explanations, we need the storytelling to share what that trust is and make sure we're meeting the expectations.

Speaker 1 [00:34:33] Yeah, exactly. I think there was a stat around 22% for data storytelling, which means a lot of companies are starting to value the importance of data storytelling, but it should probably be way higher for those really trying to make a difference. Just one last thing I can think of, and I think it sums up a lot of what we were talking about today: the importance of business value around data management and data analysis. There's research at Gartner that basically showed that organizations that inaudible maturity really, they inaudible 30% financial performance. And I think as inaudible leaders, we get wrapped up in the technology, or we let ourselves become, implicitly or explicitly, a cost center. We have to remember inaudible help, right? If you're not inaudible money. And we have to think like that. We've got to think in terms of ROI.

Speaker 2 [00:35:33] And it's not just ROI, it's about the business outcomes. I think that's what is important: outcomes and inaudible value, make sure you inaudible. Very quickly, what didn't come up in the inaudible?

Speaker 1 [00:35:48] Things that inaudible.

Speaker 2 [00:35:50] Yeah, agents, but not here. And I'll argue, I mean, Gartner, the honest no-BS, is that Gartner usually covers things that inaudible. This is what's starting to happen now. So inaudible surprised if next year, or at least in Orlando next year, we're hearing more about agents. I think that's one.

Speaker 1 [00:36:09] Yeah. That is a bit of a miss, because agent-based architectures are all the talk in GenAI and inaudible. That was very weird. Another thing that was missed, and we hit on it a little bit when we were talking about positioning: there wasn't a lot of talk about unstructured data at all. Most of the conversation was around structured data. Text data, vector databases, unstructured data, it came up a little bit, but more like something that would show up in the corner of a slide. It wasn't a major topic, and I think in the context of GenAI and data management, how should data leaders be thinking about unstructured data? I don't think anybody came away with an answer to that question.

Speaker 2 [00:36:50] That is true. For years it's been the modern data stack, alive and dead. Data inaudible for years. Metadata and knowledge graphs continue to be a thing, right? Governance continues to be a thing.

Speaker 1 [00:37:02] Juan, where are we going to be four years from now?

Speaker 2 [00:37:03] Oh, I have no idea. Okay, I will inaudible. We're going to be heading toward this world where we'll have true assistants, ones that not only know what they know, but know what they don't know and are very explicit about it. And then they'll help us learn more about what's missing, structured and all the combinations of these things. I think that's where we're going, and that's what I'm going to be working on, making that vision a reality.

Speaker 1 [00:37:37] That's interesting.

Speaker 2 [00:37:38] Where are we going to be in four years?

Speaker 1 [00:37:40] I'm going to pick one specific thing, and I'm going to leave it a little bit of an unknown, so it's a little bit of a cop-out. But I've been talking to Juan about this lately, and to a few others. I call it inaudible... the reason is, the power is shifting to whichever tool or platform owns the context of the business. And I'm not talking about the data, I'm talking about the knowledge.

Speaker 2 [00:38:06] The metadata.

Speaker 1 [00:38:06] Yeah, the metadata. The metadata, and the ontology that holds the metadata together. But it needs to live in a system. That context layer, that context engine, that's where the power is going to be. But the open-ended question I just don't know the answer to is, where does it inaudible live? Does it live in your BI tool? In your unified data platform or integration tool? Is it going to live in your transformation tool, your dbt tool? In your catalog or your knowledge graph, or your knowledge-graph-based catalog, right? Is that going to be the place where the context lives, and is that the semantic layer? I don't know.

Speaker 2 [00:38:40] Well, I know. I think it needs to be in the knowledge graph independently.

Speaker 1 [00:38:43] I think it needs to be in the knowledge graph.

Speaker 2 [00:38:44] That's my position. And I'm not saying this from a salesy perspective; my evidence is that I've dedicated my entire career to this shit. This is an independent layer outside of your inaudible. It needs to be able to connect stuff. I think it needs to be governed, it needs to be independent, and it needs to be based on open standards, because this is the critical knowledge about your business, the context. Think about it this way, and this is actually an analogy I take from inaudible: you should be able to describe how your business works, put it on a thumb drive, and 100 years from now somebody should be able to find it and, assuming they can still read the thumb drive, get it back. And because it's machine-readable and because it's an open standard, they should be able to go in and say, "Oh, this is how this business worked. This is how this organization worked." So that's what you need inaudible. And so yeah, I think that should be independent. And I like this context war thing, I can see-

Speaker 1 [00:39:39] I think there's going-

Speaker 2 [00:39:39] ... turning things around.
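[Editor's note] Juan's thumb-drive analogy, capturing how a business works as machine-readable context in an open standard, can be sketched in a few lines. This is a hypothetical illustration (the acme.example names and properties are invented), using the W3C N-Triples format, a plain-text serialization that any future RDF parser can read:

```python
# Hypothetical sketch: business context as subject-predicate-object triples,
# serialized in N-Triples (an open W3C standard). The acme.example vocabulary
# is made up for illustration only.

EX = "http://acme.example/"
RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
RDFS = "http://www.w3.org/2000/01/rdf-schema#"

# "How the business works": Customers place Orders.
triples = [
    (EX + "Customer", RDF + "type", RDFS + "Class"),
    (EX + "Order", RDF + "type", RDFS + "Class"),
    (EX + "placedBy", RDFS + "domain", EX + "Order"),
    (EX + "placedBy", RDFS + "range", EX + "Customer"),
    (EX + "placedBy", RDFS + "label", "placed by"),
]

def to_ntriples(triples):
    """Serialize (s, p, o) triples: IRIs go in <...>, literals in quotes."""
    lines = []
    for s, p, o in triples:
        obj = f"<{o}>" if o.startswith("http") else f'"{o}"'
        lines.append(f"<{s}> <{p}> {obj} .")
    return "\n".join(lines)

print(to_ntriples(triples))
```

Because the output is a plain-text open standard rather than a vendor format, the same file loads into any RDF tool, which is exactly the argument for keeping the context layer independent.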

Speaker 1 [00:39:40] I think we're already starting to enter this context war period, and it's going to last maybe a good part of the next four years. And if we make the right choices... When we answer the question, "Hey, what is the heart of your data stack?", today I think people would probably say, "Oh, my data warehouse is probably the heart of my data stack." I hope that four years from now people will say, "My knowledge graph is the heart of my data stack."

Speaker 2 [00:40:02] Let's leave it like that.

Speaker 1 [00:40:03] Yeah, I'll see if we get there. Yeah, and I hope we will.

Speaker 2 [00:40:05] Cheers.

Speaker 1 [00:40:05] Cheers.

Speaker 2 [00:40:06] All right, to another four years.

Speaker 1 [00:40:08] To the next four years.

Speaker 2 [00:40:09] Well, I don't know if we're going to podcast for another four years, but we'll still be hanging out inaudible

Speaker 1 [00:40:13] The Honest, No-BS will continue in one shape or form or another. Cheers to data.world, which lets us drink cocktails and wax philosophical about data, go to awesome conferences, and meet amazing, amazing people like you. And thank you to all of you for being amazing listeners. We love you.

Speaker 2 [00:40:29] And here's a shout-out to all the listeners we met this week here at Gartner. You're the reason we can do this. There are just so many cool people we get to meet. Thank you so much, everybody.

Speaker 1 [00:40:36] Appreciate the love, and cheers to you all and-

Speaker 2 [00:40:40] inaudible you Friday at four PM.

Speaker 1 [00:40:41] See you Friday. Cheers.

Speaker 2 [00:40:42] Bye.
