About this episode

When we think about good, responsible data stewardship, we often think in terms of compliance and governance. We talk about access, trust, and consumption. But there’s another dimension we must consider if we’re to be truly responsible with data. We simply can’t ignore the role of ethics in data use. 

Join Tim, Juan and special guest Partha Srinivasa, award-winning CIO/CTO/CDO and current CDO of Verisk, for a conversation about data responsibility and ethics.

Special Guests:

Partha Srinivasa

CDO, Verisk

This episode features
  • Common misperceptions about data and ethics
  • How to build ethics into your data practice
  • Can you be both ethical and a fan of the morally dubious Seinfeld?
Key takeaways
  • Be responsible; use the data for what you say you’re going to use it for.
  • The industry as a whole needs to influence the change around the importance of data ethics.
  • SPIRIT: Security, Privacy, Inclusion, Responsible/Reliable, Impartial, Transparency

Episode Transcript

Tim Gasper:
Welcome to Catalog and Cocktails. It’s an honest, live conversation about enterprise data management. It’s honest, it’s no-BS, it’s non-salesy, and we’ve got beverages in our hands. Hi, I’m Tim Gasper, longtime data nerd and product guy, joined by Juan.

Juan Sequeda:
Hey Tim, I’m Juan Sequeda, and it’s always a pleasure to be here, take a break in the middle of the week, at the end of the day, and chat about data in an honest, no-BS, non-salesy way. Today, we have a fantastic guest with so much experience in enterprise data. It’s Partha Srinivasa, the current chief data officer of Verisk, one of the world’s leading data analytics firms. If I’m not wrong, Partha, you’ll correct me here, I think almost every insurance company uses Verisk data analytics. Partha is an award-winning global CIO and global CTO/CDO. You’ve been at the C-level in financial and insurance companies for the last 20-plus years. If there’s somebody who knows a lot about data in the enterprise from an executive perspective, that’s Partha, and Partha, it is a pleasure to spend this time talking to you today. How are you doing, Partha?

Partha :
Thank you, and thank you, Tim, for having me. Again, this is going to be a no-BS conversation too. First of all, I’m certainly loving this conversation, and I thank you for inviting me. I couldn’t have done a better job of explaining my background myself, but you guys gave me too much credit. I’m learning every day, so I’m still learning. Data and technology are so close to me that there’s so much to learn, and I’m still learning. So all those titles and all that don’t mean much. I’m still a student in the world of data and technology, I’m learning every single day, and this is an ongoing change.

Juan Sequeda:
I appreciate that perspective and we’re going to dive into that today, but hey, before that, tell and toast. Tell us what you’re drinking and what are you toasting for? Partha, how about you start?

Partha :
So I’m a no-alcohol guy, so I have orange juice in my hand, and I’m toasting to my wife, actually. If I’m somebody in the industry today, if I’ve done something really good, and if I’m learning and having a good life, it’s probably because of the support I get from my family and my wife. So I want to do that, that’s what I’m doing. But the other thing I want to toast to is data. I do want to talk about data, and I love data. Data is a space I’ve started focusing on these days. For a long time I was a CIO type of guy, and I’ve moved into the world of data now. I want to toast to data primarily because data is the new oil, as people have always been saying, and the amount of data we are creating these days and the impact we can make in the industry is amazing. So I want to toast to it, because without data, I don’t exist. So thank you. Thanks to that data.

Tim Gasper:
I love it.

Juan Sequeda:
All right. That’s a good one. A toast to your partner, your spouse, your wife and to data. How about you, Tim?

Partha :
Exactly.

Tim Gasper:
I feel like those are pretty good things to toast to right there, to my wife and to data.

Partha :
Exactly, that’s my life, man. That’s my life.

Tim Gasper:
That covers a lot of it right there. I’m actually drinking a beer margarita. My first time I ever made it.

Partha :
Beer margarita.

Tim Gasper:
And I know Juan, you and I are doing beer themed cocktails today because of Oktoberfest and that kind of thing. You know what? I feel like there’s a reason I haven’t been drinking beer cocktails.

Partha :
I love margaritas too, by the way, just so you know, Tim.

Tim Gasper:
Awesome.

Partha :
[crosstalk 00:03:25].

Tim Gasper:
Margaritas are always good. Adding the beer to it, a little weird, but I’ll try anything once.

Juan Sequeda:
One of my favorite beers is a good Stella Artois, and I was looking up a cocktail: it’s a raspberry cucumber spritzer, so there’s gin in here and some Stella. But let’s go. I also toast to my wife and to data, because if it weren’t for that and my family’s support... that’s my livelihood right now, so cheers to that.

Partha :
Cheers.

Juan Sequeda:
[crosstalk 00:03:55]. So we’ve got our warm-up question today, which is a very specific one, and I think it’s tailored to me: can you be both ethical and a fan of the morally dubious Seinfeld? You realize I’m a huge Seinfeld fan, and in every single episode of Catalog and Cocktails I always have some Seinfeld t-shirt, as you can see right here.

Partha :
Yeah, I see that.

Juan Sequeda:
So I’ll take this and I say, yes, you can because there’s a historical context around Seinfeld and I think there’s a comedy behind that and we don’t have to take things so seriously. That’s my answer to that. I don’t know if you guys are-

Partha :
I couldn’t agree more. But I think the key message on this one, and I know that’s the topic of the day, is being responsible. Being responsible with the data you have, because you have a lot of responsibility in how you use it. With that said, like you said, can you have fun with it? Yes. Can we innovate with it? Yes, and that’s the whole idea of bringing these two together. There’s always a balancing act. So yeah, I completely agree with that.

Juan Sequeda:
How about you, Tim? What’s your answer?

Tim Gasper:
When I saw this question, my thought was to be ethical is to be honest. That’s the theme of the show. Honest, no BS.

Partha :
There you go.

Tim Gasper:
What could be more honest than a show about nothing?

Juan Sequeda:
Awesome, man. Love it. So to Tishu, our RVP of marketing, who gave us this question: thank you. I love it.

Partha :
I love it, Tim. By the way, did you make this t-shirt with the honest no BS thing? That’s nice.

Tim Gasper:
I did. I did. It’s one of a kind. At some point we’re going to open our merch shop, so stay tuned.

Juan Sequeda:
Stay tuned for that. Well, all right, talking about being honest and ethical and everything, let’s go dive into our discussion. So Partha, honest, no BS. What does it mean to be responsible with data? What does it mean to be ethical with data?

Partha :
So I think the key message here, coming from a company that primarily works with data, and from the multiple companies I’ve worked with, is that when we talk about data, we’re talking about petabytes of it: information about people, about things, about companies, et cetera. The reason we have so much data is that we want to make sense of it and produce valuable insights for our customers, partners, and others who can use the data. But when we take that, the most important piece, what I always talk about, is being responsible with that data. So what does that mean? I give you something, and it was given for a particular purpose.

Partha :
Are you using it for that purpose? Or are you taking the data and not being responsible? Because the data you received was never yours. Somebody gave it to you for a purpose, and that fact comes with a responsibility to protect it, not only for yourself, but also for the person who gave it to you, because you’re acting on their behalf. That responsibility is very critical when you play the role of a data steward.

Partha :
This is pretty standard. At the end of the day, if one of your neighbors lends you something and you’re using it for some purpose at home because he said, “You know what? Use it,” you can’t just take it, assume it’s yours, and start giving it away to your friends just because you have it with you. That’s the way I see being responsible with data: it’s extremely important, because of what’s happening at every single company today, and I’m talking about all the big names you hear in the news every single day. Today Facebook got fined something like $38 million [inaudible 00:07:34], et cetera. Every day there’s some fine going on with Facebook or Google, et cetera, primarily because of privacy and how responsibly the data is handled.

Partha :
I’m not saying they’re not responsible; they’re extremely responsible. But with that said, there’s always that catch of how we’re managing it, and I think that’s where I see it as an item we need to keep an eye on from an industry perspective. We’ll talk more about my principles of responsibility and how we can be ethical with data in a few minutes, but I just wanted to give you the high-level view of that.

Juan Sequeda:
I love what you just said here. If I give you something for a purpose, are you using it for that purpose? Responsibility is making sure that you’re using it for that purpose. If you got the data just for action A, well, that’s it, you use it for action A, and if you want to use it for action B, Z, whatever, well, I didn’t give you permission for that. Ask for permission, and maybe I’ll give it to you or not. That’s a really good framework to think about it. I like that.

Partha :
I break it into three major buckets. When somebody gives you data, the first part is that when they give it to you, you have a responsibility to protect it, so that’s the security and protection and all that. You’ve got to protect it. They give it to you for a purpose, which is where the usage rights come in. What did they give it to you for? They gave you something and you can use it for a particular purpose, so that’s the purpose. The third part is that sometimes it comes with some level of respect and confidentiality. They give it to you, but they don’t want you to tell everyone that that’s what you’re using it for, so that’s the confidentiality. Sometimes Tim may have given me something and said, “Partha, use it for this purpose,” but just because he gave it to me and I’m using it for that purpose, I don’t have the right to go tell everyone that I’m using Tim’s data for this purpose.

Partha :
Now I’ve crossed the line; that’s where you get into privacy and confidentiality and all that. What’s very important for the industry to understand is that, if you step back four or five years, or even ten, much of the focus was all about security, IT security. The world has now evolved from IT security to data security, because the world of data is so broad, and I think we’ve evolved from data security to more about governance and ethics, et cetera, and that’s where we are right now. So it’s a journey and we’re still learning every single day. There’s always something new coming up.

Tim Gasper:
I like that you’re characterizing these three buckets as a way to think about different areas of governance and responsibility more generally. Maybe it’d be useful to actually double-click into each of these a little bit. As you mentioned around security, that was some of the original intent behind governance, even though it’s obviously not complete. When you think about security and protection, what are some of the biggest concerns you think about there? Do you think about compliance? Are you thinking about internal policy? Are you thinking about a lot of these things all coming together?

Partha :
It is. So even when we talk about security, I use a word that covers more than security: internally, our CEO and our current company basically call it total data safety. It’s all about protecting the data, and that encompasses everything; it’s an umbrella over everything I just talked about. When we take that, and I’ll double-click specifically on the security aspect since you asked about it, there are so many factors from a data security perspective. The simplest things are what you do from a perimeter and network perspective to protect your perimeter, and then, within the perimeter, what you’re doing with the data itself, which is the encryption and tokenization and things like that, how you’re keeping the data.

Partha :
All these things, and then the other factor is how you’re monitoring it. That’s where we get into the DLPs of the world, data loss protection. So how are you monitoring it, managing it, governing it? All of those aspects go into it. So to me, when we talk about safety of the data, I put all of these things within the eight factors of data protection. In that category, when people talk about IT security, which is where perimeter security comes in: what are we doing with your desktops? What are we doing with your networks? What are we doing with your servers? Are we patching them properly, et cetera? Those are all part of the umbrella of security, because it’s all for the same purpose. What is it? We are all protecting data.

Partha :
When the bad guys come to a company, what are they trying to break in for? Only the data. That is your asset; that is the asset most of these companies have. You go to the banks, the insurance companies, many of these companies, and that is the one real asset they have. It’s not physical stuff, it’s the data. So how do you protect it? I think that’s what all of these aspects are about.

Tim Gasper:
So how does purpose then become a separate thing from security? You point that out as something that’s unique. Why do you pull that out? I think that’s a unique perspective and I think I agree with it in terms of really leveling that up to a first class citizen.

Partha :
Like I said, security is one aspect of managing something somebody has given to you. Data has been given to you and you have to protect it. So you’re ring-fencing the data and protecting it so that it doesn’t get lost, and you can give it back to the owner of the data when needed, or you’re keeping it on their behalf. That’s security. Usage is about taking data which belongs to somebody else, whether it’s yours, a third party’s, another vendor’s, or a client’s who is feeding us information, and using it for a purpose. When we take the example of purpose: Facebook, Google, LinkedIn, many of these companies are collecting a ton of information based on the rules they have stated.

Partha :
They have said, “This is what I’m using this data for.” They said, “I’m going to use this data to give you a better customer experience. I’m going to use it to cross-sell something. I’m going to use it for marketing.” Some may say, “I’m going to use it for risk.” Some may say, “I’m going to use it for underwriting insurance.” But if I took the data that I said I was going to use for underwriting risk and gave it away to some company for marketing, that was not the purpose. I never asked for that permission, and people will say, “No way. I don’t want you to give it out to somebody who will start marketing this stuff.”

Partha :
That is why purpose is an important factor: you have to segregate the two. One is protect it; the other is make sure you understand the usage rights and respect somebody’s rights to their data, because the data belongs to somebody and you have to respect that. I think that’s very important as part of this. Everyone has their own rights. That’s the way I look at it. Makes sense?

Tim Gasper:
Yeah, it makes a lot of sense and I like that framework. I think some companies that are a little less mature in their thinking around purpose may think of it just in terms of GDPR, for example, where they say, “Oh, consent,” like there needs to be some sense of purpose. They’re thinking within that narrow context, but you’re definitely thinking of it in a broader context. I’m curious, when you think about data responsibility and data ethics, our topics today, is purpose something you should elevate early in the process when you’re defining your principles and your policies as an organization? As part of what our data policy is, is purpose part of that conversation?

Partha :
Very much, very much. I think you said it really well. One of the things, even when I took on this responsibility in my company, and I recommend this to all the participants on the call, is that the first thing is we all need a clear policy and a set of principles for the company, saying how we’re going to respect the data. You go to most data companies and you will see what their policies are. They post them publicly and say, “Here’s our policy. This is what we stick to. This is the rule. This is my constitution, which says that I want to be a very good steward of your data, I want to be a trusted custodian, I will protect your data, I will govern your data, I will ensure that your rights are protected, I will follow all the regulatory rules, compliance requirements, and policies, GDPR, CCPA, et cetera.” All of that is what we would put under the umbrella of your policy or principles.

Partha :
That’s one. In addition to that, culturally, the company can also extend it, which is what I recommend to everyone in the current world: be more responsible with your data. That’s where I was talking about responsible usage of data. We’ve used a lot of terms, responsible AI, responsible data, ethical AI, they’ve been used all over the place, but at the end of the day the purpose is the same: being responsible with somebody’s data and its usage, and making sure that we’re transparent about it, we’re impartial about it, there’s no discrimination there. Those are the kinds of things we think about.

Partha :
If you want to remember it easily, I use the word SPIRIT, S-P-I-R-I-T. Effectively, this is my theme for ethical AI, or ethical usage of data. When I take SPIRIT: S is for security, P is for privacy, the first I is for inclusion, R is for being responsible and reliable, the second I is for being impartial, and the T is for transparency. Because we have so many words to remember, this makes it easy. That’s the SPIRIT of a company. The SPIRIT of your company is basically to protect the data, and if you want to use this, feel free to use it. You can easily remember some of the key principles of ethical AI.

Juan Sequeda:
Well, let me get this right, SPIRIT, security, privacy, inclusion, responsibility, reliability, impartial, then transparency?

Partha :
That’s correct.

Juan Sequeda:
Then I get the SPIRIT?

Partha :
That’s correct.

Juan Sequeda:
Just-

Partha :
If you’ve got that, you’re basically making sure that you’re very ethical and responsible with the data. So that’s SPIRIT. What is the SPIRIT of the company? Being responsible. That’s the word. It’s easy to remember. So I just said, you know what, let me come up with a word, a tool, and SPIRIT sort of resonates with the culture of an organization, so you can link them together.

Juan Sequeda:
That’s brilliant. I have to say this: this is the experience talking. I really, really like this. This is a very key takeaway.

Tim Gasper:
I know my next t-shirt idea.

Partha :
Good, I know that. I’m getting that t-shirt, by the way. When you make it, send to me.

Juan Sequeda:
Let’s dive into the other one, confidentiality.

Partha :
So let me get…

Juan Sequeda:
And I think-

Partha :
Into confidentiality. For businesses with B2B types of activities, somebody is giving you data and they don’t want it known that they’re giving you data, because there’s a lot of competition, there are many competitors. So if I got it from somebody... an example would be that you have a neighbor, you went and borrowed a hammer from them, and they gave it to you; you have a responsibility. You said, “I’m going to use the hammer in my house to nail something,” so you’re going to protect their hammer because it belongs to them. You told them, “I’m going to take the hammer and nail something on the wall,” so that’s the usage. You can’t take the hammer and go bang on something else, like a stone, which may break the hammer, because a brass hammer has a certain value and can only do certain things.

Partha :
It can nail things, but you can’t use it for some other purpose. That’s the usage. Now, the third aspect is confidentiality. You don’t have to go tell everyone in the world that you got it from that person, because the moment you say it, all the others will go to that person and say, “Can I get your hammer?” They don’t want that, because the confidentiality is between two people. Now you’re breaking that trust, and that’s why confidentiality comes in. Saying, “Yeah, I know the guy, he’s lending it to me,” doesn’t mean that just because he has it, everyone else should go after it. That’s essentially what marketing is, if you really apply the concept: it’s between two parties. I like this, you’ve got it.

Partha :
But when you take the data and give it to somebody else, and somebody starts sending you all these marketing materials you never wanted, pounding on your door saying, “Can you buy this? Can you do this?” soliciting all that, now you’re not being responsible, unless you got the right by saying, “I got the hammer from you, buddy, but I’m going to tell the entire world that you lent it to me.” That’s fine if you told them. But if you never did, then you broke the confidentiality. That’s sort of an easy way to understand it.

Juan Sequeda:
No, this is great. I was going through these three things, security, purpose, and confidentiality, and kind of came up with an example. Mine was the car. Actually, I don’t have a car, so I ask my neighbor.

Partha :
Great.

Juan Sequeda:
I ask my neighbor, “Can you lend me the car?” And he’s like, “Yes, I will lend you the car.” Obviously, so from a security perspective, I’m going to protect that car. I’m not going to-

Partha :
Correct.

Juan Sequeda:
From the purpose perspective, I asked for the car so I could go get groceries or whatever. So I’m going to use it for that, which means I’m going to use it for the day and do a 10-mile drive. I’m not going to keep it until next week and drive thousands of miles.

Partha :
Exactly.

Juan Sequeda:
But confidentiality is if my neighbor says, “I’ll lend it to you, but don’t tell people I’m lending you my car because otherwise, so many people are going to come to me and I don’t want to go through this. I’ll lend it to you because you’re my neighbor, you’re my friend.”

Partha :
Correct.

Juan Sequeda:
Then I will keep it confidential. I like these three things.

Partha :
That’s it. That’s it.

Juan Sequeda:
Security, purpose and confidentiality, the three buckets.

Tim Gasper:
I like that example a lot too. That helps illustrate it in a way.

Partha :
Actually, forget the hammer; I’m going to steal your use case.

Juan Sequeda:
Partha-

Partha :
Your car example is much better than the hammer one.

Juan Sequeda:
Honest, no BS: my car example was better than your hammer example.

Tim Gasper:
Hey, I like both. I like both.

Juan Sequeda:
No, if I had to choose one, I’ll choose a car one, not the hammer one.

Partha :
No, I’ll choose a car one too, man. I like it. I said it.

Juan Sequeda:
Okay, good, good. This is great, but as you know, Tim and I are taking notes, and we’ve got so many notes here. Tim, you’ve got some stuff you want to follow up on here.

Tim Gasper:
Yeah, well, I’m thinking about this idea of data responsibility. One of the things I’m thinking about is like, how are you measuring data responsibility? How do you know that you’re doing it, approaching it the right way from a measurement standpoint? Well, let’s start with that. Like how do you think about that? Is it even measurable or is that the wrong way to think about it?

Partha :
It is, it is. Actually, you’re bringing up a good point, and parts of the industry are coming up with that too. There are different aspects of measuring that responsibility, and you’ve got to break it down into different categories. Let me take responsible AI as an example, and I’ll connect it with the rest, ethical AI; we can mix the words here. The intent is to run through some sort of framework. The framework basically asks: what is the purpose? What are we doing with the data? What is the scope of the work? What types of activities are you going to use the data for? Et cetera, and you test against that. So it’s a typical framework, and usually companies come up with their own.

Partha :
As a matter of fact, what we’re hearing now from the regulatory bodies, and Colorado recently came out with this, is that they actually want you to create a framework for this, run through all of it, and show it to them for approval, demonstrating that you’ve gone through that framework. It’s like a control. All of us have internal controls in our companies for financials. Very similarly, you want to apply a similar type of control, which is a framework for risk management and the ethical use of data, and they want to see how you’re testing it. What are the criteria and how are you testing against them? Show me the result and compare it, and they want to make sure it’s auditable someday. So there are frameworks being talked about in the industry.

Partha :
Now, if you ask me, is there an industry-standard framework? I don’t know. I have not seen an industry-standard framework that would work for everyone in the industry. I would love to be part of building one, whoever is thinking about it, so that we can all contribute to it. If somebody listening to this conversation has an idea, I would love to hear from them, and maybe they can share some ideas. But at this time I have not seen one, though we do have frameworks; internally, companies have been creating them, so that’s one way to measure it. That’s more on the usage side. The other aspect is protection. For what I talk about as data protection, data security, usage, et cetera, there are very strict guidelines.

Partha :
As a matter of fact, my own company maintains a list of scorecards and KPIs on how we protect the data, how we manage it, and how it is secured. We have an extreme level of tracking, and it’s monitored pretty closely. So I do recommend that everyone create something like that, because the very good point you brought up, Tim, is that we can talk all about this, but in my world, if you cannot measure it and you don’t measure it, it’ll never happen. You’ve got to measure it. You have to have a KPI; that’s the purpose of a KPI, to execute and see the result. So my recommendation to everyone listening on this call, and I’d love to hear other ideas too, is certainly to create some KPIs, start measuring, and create a framework for it.
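To make the “if you don’t measure it, it’ll never happen” point concrete, here is a minimal, hypothetical sketch of a data-responsibility scorecard in Python. The KPI names, targets, and values are illustrative assumptions, not metrics described by the guest or used at any particular company.

```python
# Hypothetical data-responsibility scorecard; KPI names and numbers are made up.
from dataclasses import dataclass

@dataclass
class Kpi:
    name: str
    target: float  # target value, e.g. 1.0 means 100%
    actual: float  # measured value for the current period

def scorecard(kpis: list[Kpi]) -> dict[str, float]:
    """Return each KPI's attainment against target, plus an overall average."""
    attainment = {k.name: min(k.actual / k.target, 1.0) for k in kpis}
    attainment["overall"] = sum(attainment.values()) / len(kpis)
    return attainment

if __name__ == "__main__":
    period_kpis = [
        Kpi("datasets_with_documented_purpose", target=1.0, actual=0.82),
        Kpi("access_requests_reviewed_within_sla", target=1.0, actual=0.95),
        Kpi("erasure_requests_completed_on_time", target=1.0, actual=0.70),
    ]
    for name, score in scorecard(period_kpis).items():
        print(f"{name}: {score:.0%}")
```

A scorecard like this could be reviewed on whatever cadence an organization’s framework calls for; the point is simply that each responsibility theme gets a number that can be tracked over time.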

Juan Sequeda:
So where are we on that? So there is none or are we-

Partha :
Well, there are many; I wouldn’t say none. There are many. The problem is that everyone has their own framework for how to do it, and everyone has their own KPIs. Instead, could we have a common way of doing things? Can the industry have a common way of doing this? If you look at, for example, NAIC, it has a particular framework around security and things like that; you may have heard about it. You’ve seen all of these companies adopting the ISO standards for security, we’ve always heard about those, and the NIST framework for security. These are standard industry frameworks. I expect those types of frameworks to emerge for responsible AI and all that, where somebody brings the industry together and starts putting them in place. There are industry-standard principles being laid out.

Partha :
As a matter of fact, I was recently in a meeting of the MIT CDO group, effectively the MIT CDO symposium, and we talked about this. It was one of the subjects, and most of the CDOs in the industry, people like me, we all talk about this, and there is some framework for it. It’s pretty consistent. Everything I talk about, except the word SPIRIT, you won’t find that, but if you look at the six themes, every single company uses the same themes. It’s just easier to remember when I say SPIRIT. That’s about it. But given that everyone has to do the same type of measurement, if we could come up with some mechanism to measure it and provide a maturity chart, like we do for the ISO and NAIC frameworks, et cetera, I wish we had one like that for the industry too.

Juan Sequeda:
So this is reminding me of one of the conversations we had in the first episode of the season with DJ Patil, the US’s first chief data scientist. We were talking about ethics too, and he made the point that we have checklists for so many things: what is the checklist for ethics? So it seems to me that one of the things we should think about right now is this whole security, purpose, confidentiality piece. Think about SPIRIT. Anything else you would add? I can imagine we’re producing data: I’m a data producer, or we’re generating a data product, a data set. What is the checklist that a producer of that data should go through, like, “Okay, what I’m doing to generate this data is ethical”? And as a consumer of this data, what is the checklist I need to go through to make sure that I received ethical, responsible data and I’m using it that way? What do we have on the checklist?

Partha :
Very good question. Very good question. One of the things we have always said, and I think it goes back to the six themes I talked about in SPIRIT, is that there’s one item which is all about transparency. So let’s start with that. When I’m collecting the data, as part of the transparency, we want to say, “I’m taking this data from you.” Effectively, if I’m going to be the recipient of the data, I want to be extremely transparent: “I want all of this data. I want data about Tim, I want Tim’s personal information, I want to know his habits, I want to know his hobbies, I want to know this.” That’s the information I’m collecting, and I’m going to be extremely transparent about it. That’s what’s happening with all of these websites.

Partha :
You see all these websites with cookies and other information; they don’t tell you exactly what they’re collecting unless you read every single piece of fine print, but it is there somewhere. All the disclosures are there; it’s just not easy. If you start reading those disclosures, you can actually start seeing what type of data they’re collecting and how they’re using it. Sometimes they write it in a very vague way: “We collect personal information for this purpose.” Okay, tell me, I don’t understand what you mean by that, because a layperson doesn’t understand exactly what that line item means. “I collect personal information,” so what type of personal information? “For marketing and customer experience purposes.” That’s a very broad statement about how you’re going to use it, and because you made the statement so broad, it’s very hard to pin down from a legal perspective, because everything can be thrown into it.

Partha :
It’s like, you know what? [inaudible 00:28:26] into that. But being transparent is to say, “I am taking this data and I’m going to use it exactly for this purpose,” which is basically like your car example: I’m going to take your car and drive it in town, no more than 500 miles. That’s about the max I’m going to do. You made it very clear, rather than saying, “I’m going to take your car and keep it.” Great.
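As an illustration of the checklist idea raised just above, here is a minimal, hypothetical sketch of a producer-side gate built around the SPIRIT themes. The question wording, the PRODUCER_CHECKLIST list, and the ready_to_publish function are assumptions for illustration only, not an industry-standard checklist.

```python
# Hypothetical producer-side "ethics checklist" gate; the questions paraphrase
# the SPIRIT themes discussed in this episode and are illustrative only.

PRODUCER_CHECKLIST = [
    "Is the collection disclosed transparently (what data, from whom)?",
    "Is the stated purpose specific rather than a broad catch-all?",
    "Is the data secured (encryption, tokenization, access controls)?",
    "Are confidentiality obligations to the data source recorded?",
    "Has the dataset been reviewed for bias and impartiality?",
]

def ready_to_publish(answers: dict[str, bool]) -> bool:
    """Allow a dataset to ship only if every checklist question is answered yes."""
    unmet = [q for q in PRODUCER_CHECKLIST if not answers.get(q, False)]
    for question in unmet:
        print("BLOCKED:", question)
    return not unmet

if __name__ == "__main__":
    answers = {q: True for q in PRODUCER_CHECKLIST}
    answers[PRODUCER_CHECKLIST[1]] = False  # purpose statement still too vague
    print("ready:", ready_to_publish(answers))
```

A consumer-side checklist could mirror this with questions about whether the data arrived with a documented purpose, usage rights, and confidentiality terms.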

Juan Sequeda:
Let me-

Partha :
So how do we measure it based on that criteria?

Juan Sequeda:
Let me dive in here. Are you advocating that when we talk about purpose, we should strive to be as specific as possible about the purpose?

Partha :
Yes, yes, yes. I would respect that. Again, this is an industry theme and it’s a tough one, because, you know what, I’m making this statement, but from an industry perspective it is very difficult. Many times these companies don’t have a view of what the purpose is going to be in the future; use cases are unknown. They collect the data. They’re sitting on an abundant amount of data having made a broad statement like, “I’m taking this data and applying it to customer experience.” The big companies like the Googles and the Facebooks have collected the data, kept it with some broad statement, and keep creating newer products and innovating, but those products never [crosstalk 00:29:38].

Tim Gasper:
So the more vague they can be the better, right? Because then-

Partha :
So how can you now say they must be specific? It’s a very tough one to crack, but I just want to say: if you can be extremely specific, that’s what people are asking for. Right now, many of us are saying, “You know what? You’re collecting so much data.” So what all these companies are saying is, “Okay, fine. If you don’t want to give it, we’re giving you all these privacy settings in our product; you decide what you want to give. If you don’t want to give it, shut it off.” But the problem is that when you turn it off, you don’t get the type of experience you want. If I go to my teenage daughter today, she doesn’t care what she gives up. I care what I give, but she says, “You know what? Fine. I don’t care.”

Partha :
She checks yes on everything, everything in the yes box, and just gives it away. That’s the human mentality, not realizing the impact, because nothing in the world is free. Anytime I see the word free, it basically means you are the product. The person using it is the product; that’s what they’re buying, because you gave it away for free. Nothing is free, and that’s the most important thing. So going back to your question: being extremely transparent, saying exactly what it is, and measuring that that’s actually what you’re using it for is metric number one. That’s what I’m talking about. Can we get that? Yes. Does it involve a lot of work from the industry and many of the participants? Yes. It’s a tough one to crack.

Juan Sequeda:
This is really interesting, because when we go into defining what a data product is and who the owner of a data product is, we need to understand that this data product can be used by these people and this marketing department for these specific purposes. That’s the stuff we need to be very transparent about. So the owner of a data product needs to make these things explicit, and we need to have controls around that. If we don’t, then we’re not being responsible. So when we start thinking about treating data as a product, this is a fundamental aspect we need to associate with it. It’s not just, here’s the data, here’s its quality, and there are no nulls or zeros or whatever. It’s: okay, where does this come from? What can you use it for? Who can use it? For what amount of time? And so forth. This is very, very important when it comes to treating data as a product.
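Here is a minimal, hypothetical sketch of the idea Juan describes: attaching an owner, allowed purposes, allowed consumers, and a time window to a data product, and checking requests against them. The DataProduct class and every field and example name are assumptions made for illustration, not a real product, policy, or API.

```python
# Hypothetical purpose-aware metadata for a data product; all names are made up.
from dataclasses import dataclass
from datetime import date

@dataclass
class DataProduct:
    name: str
    owner: str
    allowed_purposes: set[str]
    allowed_consumers: set[str]
    usable_until: date

    def may_access(self, consumer: str, purpose: str, on: date) -> bool:
        """Grant access only to a declared consumer, for a declared purpose, in the agreed window."""
        return (
            consumer in self.allowed_consumers
            and purpose in self.allowed_purposes
            and on <= self.usable_until
        )

claims = DataProduct(
    name="claims_2023",
    owner="underwriting_team",
    allowed_purposes={"underwriting_risk"},
    allowed_consumers={"pricing_service"},
    usable_until=date(2024, 12, 31),
)

# Allowed: the purpose the data was shared for.
print(claims.may_access("pricing_service", "underwriting_risk", date(2024, 6, 1)))  # True
# Not allowed: same data, different purpose ("marketing" was never agreed to).
print(claims.may_access("pricing_service", "marketing", date(2024, 6, 1)))          # False
```

The design choice mirrors the episode’s framing: the check fails closed, so a purpose that was never declared is denied rather than silently allowed.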

Tim Gasper:
I think that there’s a little bit of tension that can happen in use cases like that, marketing use case, between data responsibility and data ethics.

Partha :
Correct.

Tim Gasper:
You can be responsible with the data and still be very poor on the ethics around the data, or the fairness around the data, which I know is another topic, this ethical-versus-fair distinction. Two questions: one, how do you foster data ethics in addition to data responsibility? And two, do you have any thoughts on fairness versus ethics?

Partha :
So yes, very much. Let me put it this way: I consider fairness to be part of being ethical. Being fair is being ethical, so it is a part of it. If you remember the SPIRIT list, one of the things I talked about is non-discrimination, being impartial. You don’t want to discriminate against people, which is where fairness comes in. These days we talk about fairness based on race, gender, religion, many factors; you don’t want to discriminate against people based on any of these things. That’s the first one. Now, here’s where it becomes a broader statement, and that’s why I use the word total. When I talk about responsible, being responsible is about how you got the data, what your usage rights are, and using it for a certain purpose.

Partha :
Being ethical is a step above that, because in addition to all of that, you have your own ethics, which say you want to be fair, you want to be inclusive, you want to be transparent. Transparency is exactly the point I was getting at earlier: you can be responsible by saying, “I’m going to take your data and use it for a lot of purposes.” You’ve done your part; you said you’re going to take it and told them you’re going to use it for many purposes. But being transparent is being specific, and not only being specific, but also coming back and proving that what you said is what you’re doing. So if you have a right and you said, “I’m going to use your data for this purpose,” you have to go back and say, “These are exactly the five use cases I’m using your data for.”

Partha :
That is why you have all of these privacy rules that have come up, CCPA et cetera, which include rules like “forget me from your system.” In fact, an end user has the right to go back to a company X and say, “Take me out of your system. Completely erase me from your system.” How many of you on this call have done this? How many of you have tried that process of taking yourself out of a system and saying, “Forget me”? It is not easy. It takes time. Just look at trying to unsubscribe from email, the junk emails we keep getting: it is such a painful process, because you unsubscribe and there are so many different sub-business units.

Partha :
So the unsubscribe only covers one portion, not all of them. It keeps coming. They tell you it’ll be taken care of after 48 hours, or sometimes two weeks. How do you track all that? Who has the time to do it? So what do we do? We finally give up and say, “Let it go to the junk folder, I don’t care. Let me change my email address.” That’s what happens. Is that not painful? And that’s the very simple world of email subscriptions. Imagine if your data has been used in so many different places. That’s pretty crazy, pretty shady [crosstalk 00:35:28].

Tim Gasper:
Well, and many of us have become quite numb to this, so maybe that makes data ethics even harder because we’re like, “Well this is the standard. This is normal.”

Partha :
And what happens with all of these things is the amount of power these technologies have because of the data they hold, the power to influence you toward a particular decision. The human brain works that way. The more you see certain things, those become the only brands you recognize. When you buy something, they know exactly what it is; those are the only things you know. What are you going to do? That’s human behavior. How many of us today watch a movie on Netflix or Prime based on a movie we already knew about and looked for in the search bar? Nobody. We just look at a movie that says, “People like you watched this.”

Partha :
How do I know? Did I go check with all those people? But we just click the button that says, “People like you like this movie.” We click on that movie, we start watching it, and we keep adding to that. I’m not saying Netflix does anything wrong, but if Netflix tells you that’s the movie everyone watches, you believe it. There is a responsibility that they’re telling you the truth, but how do you know that people like me are the ones watching the same movie?

Juan Sequeda:
Yeah, but that may not affect me personally, like whatever.

Partha :
It does.

Juan Sequeda:
Being very practical about it: okay, so what? I just got a movie recommendation. In other areas, I can see, well, they’re using it for things that are going to make a real impact on my life, but-

Partha :
Well, I think it does, both ways. So let’s talk about that too. It does have an impact. Even on Netflix it does, in a way; you just don’t see it directly. You see it indirectly, because they’re putting you into a loop. Then you go to the other side of it, which is basically the amazon.com example. You go there and you find that people like you also bought this product. Most of us buy these days based on the reviews. You know that many reviews are false, et cetera, and they have a responsibility. I know these companies are really working hard, Facebook, Twitter, everyone, with all the fact-checking and et cetera.

Partha :
Now, they’re trying to say that they’re doing it. That’s part of the ethics. They’re doing it, they’re working on it. It’s not easy; it’s not an easy problem to stop. But it influences what you’re buying, it influences how you bundle things, it influences you to do certain things, it influences whom you vote for. They can guide you where they want you to go. It’s not based on what you think; you just don’t even know that they’re taking you there. Effectively, you’re following that influence.

Juan Sequeda:
That’s a very fair point. There are, I think, direct influences and indirect influences.

Partha :
Exactly.

Juan Sequeda:
Those are the things you want to go balance.

Partha :
Exactly.

Juan Sequeda:
So another thing I was thinking about, also going back to our conversation with DJ Patil, was the aspect of teaching. He said, “If you haven’t been trained in ethics, you will not be prepared for the workforce.” Are we preparing the workforce of tomorrow with ethics? What’s missing? What’s your call to action for the students listening in, for professors, for the workforce?

Partha :
I think it’s a very good question, and I will tell you that we as a company, and in fact not only us but the industry, are working to be very good educators in this field, because these things can only be influenced when you know and learn about them. The reason I say anything is because my parents taught me something, my teachers told me something, and over a period of time you learn from that. We have a responsibility to train and educate people. So the call to action I’m going to give is that many companies, including mine, are trying to innovate and educate: educate our partners, educate our employees. And when I say employees, I mean every single employee of the company; we actually make it part of our annual compliance process, like an employee acknowledgement you need to sign.

Partha :
When you join the company, we make it part of that; we make it a commitment. Let’s try to drive the change across the industry, and it cannot be done by one company or one person, it has to be done by the industry. Everyone in the industry has to influence this change, because only then will it work. There’s no point doing it on an island in one place, because we are part of a large supply chain. I get data from somebody else, somebody else gives me data, I give it to somebody. But there’s a supply chain problem: if the person I’m getting the data from is not ethical and I don’t know it, that’s a problem. So how do I make sure the entire value chain is ethical? That’s why it’s an industry-wide change.

Partha :
So what’s happening right now, number one, is that companies are taking it very seriously. A lot of the companies at the forefront from a data perspective are taking really strong action on it. If you go to the websites of many of these companies, you will see they’re talking about it. That’s the first one. I checked this morning before coming on to see whether these things are shown, and you will see: just go to Facebook and you’ll see something related to ethical AI and things like that. So that’s number one. Number two, we’re seeing a lot of push from the regulatory bodies, and that’s an important factor. The regulatory bodies are forcing certain changes, because sometimes, you know what? It’s always good to influence, but sometimes you have to force things. You have to drive the change.

Partha :
Change sometimes happens through governance, like a constitution, so you have to force it. What’s happening right now is that many of the states, and the federal government, are adding these types of requirements. You’re seeing that many of the states, Colorado recently came out with one, are basically addressing some of these things. Many of the states are requiring you to be ethical, fair, and responsible in how you use the data. So that’s the second thing you’re seeing. The third one, which is an interesting one, is the universities you mentioned. There’s a bit of a call to action here. I don’t know if there are people listening from the universities, et cetera, but I do want to encourage them to start thinking about this space.

Partha :
I want them to create a course on the subject. Invite people like us to come and talk about why it matters. The most important thing is that you have to cultivate it at the start, because, like I said a few minutes back with my teenage daughter as an example, she doesn’t care about it, but they need to understand the impact of it, the ethics of it, the usage of it, et cetera. So I think we have to cultivate that knowledge in every single person, from school, from their education onward: this is how it should be, this is what ethics means, and this is how you need to manage it.

Partha :
We all learn in school how to be a good student, how to be disciplined, but we don’t teach students how to be disciplined with data. We’ve started seeing in industry now, it used to be that a lot of people went into engineering, medicine, and IT, and now you’re seeing a lot of people going into specialized courses on data and analytics, data science, and so on; many have come up recently. I recommend adding a data governance component to those courses, because that brings some level of discipline to the journey. So I do want to recommend that too. Those are the calls to action, and the call to action has to be grassroots. It has to come from all of us listening. We all need to ask questions of the people we get our data and services from.

Partha :
So we should raise those questions, number one. Two, the companies that are getting our data should be responsible. Three, the regulatory bodies, the government entities, need to step up and do this. Maybe, you know what, one of the ways they can do that is to incentivize people. If you are ethical, then I’m going to give you some tax benefits. It’s a simple approach; money speaks. If you do this, we will give you a break. The fact that you said it means companies will say, “I’m going to focus on it,” because much of the reason many companies find this difficult is not that they don’t want to do it; they have other priorities, they have to make their revenue, et cetera.

Partha :
The fact that you gave a tax benefit and put some governance around it means you’ve basically given them a break on the cost of earning that money. At the end of the day, it does good for everyone, so go ahead and do that as another means. So that’s roughly my thought process, and obviously the universities are something I strongly recommend. I’m happy to be part of that community, to come talk and give speeches at these universities and explain why it matters.

Juan Sequeda:
Yeah, just a quick shout out.

Tim Gasper:
I love that.

Juan Sequeda:
One of my colleagues and friends, Professor Julia Stoyanovich at NYU, has actually been pushing a lot on data responsibility, and she has courses on data responsibility at NYU. She’s actually giving the keynote at the International Semantic Web Conference in a couple of weeks about this. So this is definitely a topic we’re seeing from the academic point of view, in research and education. I’m glad you’re making that call to action. Shout out to Professor Julia Stoyanovich; we need more of that. So I’m really glad you’re making that call.

Tim Gasper:
Yeah, and I think you’re-

Juan Sequeda:
I think there was one more thing you want to touch, Tim, right?

Tim Gasper:
Yeah, I think there’s one more thing. I really like your roadmap there, Partha, for the education institutions and how they can do a better job. Last question here before we do our lightning round is, I’m curious if you can paint a bit of a roadmap for organizations that maybe are early on their journey, how do they start to set the right foundation for data ethics and data responsibility? Do you have some recommendations?

Partha :
Yes, certainly. The first and most important thing is people; the call to action is to take this as a serious topic in your company and act on it. This should be a topic for the CEO of the company; it has to be a very important agenda item for the CEO. I’ll tell you personally, I work in a company where I’m blessed to have one of the best CEOs in the world, and our CEO is deeply involved in this subject, because his key focus is all about protecting the data, ensuring that we’re ethically right, doing the right thing for our customers, and being their trusted custodian. When it comes from there, it automatically percolates through the organization.

Partha :
Many times, some of these conversations happen at a level where they may not be able to make an impact across the organization. So this should be a topic at the management level. Make sure you have the right level of sponsorship; that’s the important one. And if somebody needs help educating their CEO, I’m happy to give some tidbits on that. In my case, my CEO was the one who educated me, so I got my education from somebody else, but the fact is it’s important. It’s important to get the sponsorship, knowing that this matters because you care.

Partha :
This is all about caring about the community and the people you serve. If you take it in that context, you will try to do this; everyone will. That’s the first thing. The second thing I would say is to make sure you create a policy or principles related to this. You don’t need to start from scratch, just Google it. There are so many different policies and principles available that are pretty standard. Even today I talked about SPIRIT as an example; you can use that. You can also go to the MIT CDO group, part of the MIT CDO Symposium, and you’ll find material there too. So you’ll get those materials. That’s the second thing I would recommend: create that.

Partha :
The third one is to create a core group under the sponsorship of the CEO, with the CDO obviously being the person who drives it. But more importantly, create that core group to drive this change across the organization. And then after that, like I talked about, educate, put an execution roadmap in place, and drive the change. These things are not easy; they take time. So don’t expect things to happen overnight, but the fact that you’ve put a roadmap in place, that you’re going to take action, that you’re going to have KPIs to measure it, and that you’re going to monitor the progress will help.

Partha :
It’s not only going to help you; the regulatory bodies are coming up behind this, and they’re going to make you do it regardless. If you wait, you may be too late and you won’t have enough time. So jump on it right now. This is the prime time. Get on it. That’s my view.

Tim Gasper:
That’s awesome.

Juan Sequeda:
I appreciate that, Partha, and this is a quality I’ve seen in leaders I admire: they’re very specific, like one, two, three, and you’ve been like that today in this conversation. Thank you very much. I want to go listen to this again because you were very explicit about the things to go do. I admire that. I learned so much from you. Thank you. This was great.

Partha :
Thank you, and thank you, Tim. Thank you both for having me, because it’s an opportunity for me to share and also to learn from people, if ideas came up as part of this conversation. I’ve been reading some of the messages coming in as listeners have been giving feedback, and if there are other ideas people want to share, I’ll be reading the message boards and learning from you, so feel free to reach out to me.

Juan Sequeda:
Awesome.

Partha :
I would always love to hear from everyone.

Juan Sequeda:
So let’s go into our lightning round. We got a couple of questions here, so I’ll kick it off here. You mentioned you can measure data responsibility. Can you measure data ethics?

Partha :
Yes.

Juan Sequeda:
All right. Tim.

Tim Gasper:
You talked a little bit about security and protection. Some people think of that more as an IT concern, some people think of it as more of a data concern. Is data security more part of data governance than IT?

Partha :
I believe this is more of a data governance issue than an IT one. Specifically, I frame it as a business problem, which I think is very important. I know it’s a lightning question, but I believe it shouldn’t be seen as an IT issue. It’s a business issue and we should treat it that way, because of the impact it can have. Yes.

Juan Sequeda:
Should the CDO ultimately be responsible for the data? In other words, does the buck stop with them?

Partha :
No, the buck always stops with the CEO, but the CDO has the responsibility. The ownership of data stewardship, the responsibility for data, being a responsible data owner, should always sit with the CDO. You should always have an accountable person in an organization to take that on. So the answer is yes, the CDO, but it’s a close collaboration between the CDO and the CEOs of the business units, the product owners, et cetera. I don’t want people to think that somebody in an ivory tower makes up all these policies and rules and nobody follows them. It’s a collaboration, but with the CDO, certainly, the buck stops there.

Tim Gasper:
Okay. Well, I like your explanation there. Last lightning round question, and I think you've already answered it as part of our last official question: is starting with the data policy and principles, the constitution, a key starting point for a CDO trying to get the program right?

Partha :
Very much, very much. It helps because you put a framework, a culture, a goal, and a mission statement into that policy and those principles. If you didn't have the Ten Commandments, if you didn't have the constitution, what would we follow?

Tim Gasper:
Yeah, that provides the foundation and framework for everything that comes after.

Juan Sequeda:
Well, this has been an awesome conversation. With that, it's time for our TTT: Tim takes it away with takeaways. Tim, start.

Tim Gasper:
Awesome. I took so many notes, and learning is definitely a theme here; I appreciate, Partha, that we could learn from you during this conversation. Two things I thought I would mention in my takeaways are, first of all, SPIRIT. I think that's a great acronym and a way to really focus, and I hope it becomes more popular. I am going to make that t-shirt. You said it's security, privacy, inclusion, responsible or reliable, impartial, and then transparency. So I think that's a really great way to think about things.
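(Editor's note: a hypothetical sketch of how the SPIRIT acronym could be turned into a concrete review checklist for a proposed data use case; the class and method names below are illustrative assumptions, not an official framework.)

```python
# A hypothetical sketch of SPIRIT as a review checklist: a proposed data
# use case is marked against each of the six principles and any gaps are
# surfaced. Names and structure are illustrative, not a standard.
from dataclasses import dataclass, field

SPIRIT_PRINCIPLES = [
    "Security",
    "Privacy",
    "Inclusion",
    "Responsible/Reliable",
    "Impartial",
    "Transparency",
]


@dataclass
class UseCaseReview:
    use_case: str
    checks: dict = field(
        default_factory=lambda: {p: False for p in SPIRIT_PRINCIPLES}
    )

    def mark(self, principle: str, satisfied: bool) -> None:
        # Record whether the use case addresses a given SPIRIT principle.
        if principle not in self.checks:
            raise ValueError(f"Unknown principle: {principle}")
        self.checks[principle] = satisfied

    def gaps(self) -> list:
        # Principles the use case has not yet addressed.
        return [p for p, ok in self.checks.items() if not ok]


review = UseCaseReview("Churn model on policyholder data")
review.mark("Security", True)
review.mark("Privacy", True)
print("Open SPIRIT gaps:", review.gaps())
```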

Tim Gasper:
I really liked the roadmap that you laid out. To Juan's point, it was very specific and provides a really great prescription: take this seriously, get the sponsorship, make sure that you're creating your policies and your principles and use those as the foundation, create your core group under the sponsorship of the CEO, execute. Do your rollout, your education, your evangelism, and then measure. Measure against those KPIs. So that is a great framework, and it will perhaps save some time for folks who are trying to read a bunch of governance books and wondering, "Ah, how do I tackle this piece right here?" Here's the roadmap for you. So I appreciate that. Juan, what were your key takeaways?

Juan Sequeda:
Yeah, so talking about frameworks, these three buckets that you have: security and protection, purpose, and confidentiality. When you get data, protect it. You've got to monitor it, manage it, and understand what the access is; if my neighbor lends me his or her car, I'm going to make sure I take care of that car. Then purpose: what is the purpose of your data? Use it for that purpose, and create a culture of being responsible around that purpose. So if my neighbor lends me the car to go to the grocery store, well, I'm using it for that, not to go on a road trip. Then third, confidentiality. You don't have to tell everyone, if that's what you were told: "This is just for you." My neighbor doesn't want me to tell everybody that they lent me the car, so don't do that, because otherwise everybody will ask for it. Don't break that trust.

Juan Sequeda:
Then you have these checklists for being responsible: again, transparency about what you are collecting and what you have. And you had a very bold position that we should be specific about the purpose, but this is a balance we're going to have to find, because you acknowledged that being vague enables us to use the data for unknown use cases, and that can be a good thing. It's a balance we have to strike, and I think that's still open to discussion.

Partha :
Exactly.
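(Editor's note: a rough, illustrative sketch of the purpose and confidentiality "borrowed car" idea as a simple data structure. The idea comes from the conversation above, but the field names and example values are assumptions made for the illustration.)

```python
# A rough sketch of the "borrowed car" analogy: record the purpose and
# confidentiality terms a dataset was shared under, and check a proposed
# use against them before proceeding. All field names are illustrative.
from dataclasses import dataclass


@dataclass(frozen=True)
class DataGrant:
    dataset: str
    granted_by: str
    allowed_purposes: frozenset   # what the data may be used for
    confidential: bool            # whether the sharing itself must stay private

    def permits(self, proposed_purpose: str) -> bool:
        # The "grocery store vs. road trip" check.
        return proposed_purpose in self.allowed_purposes


grant = DataGrant(
    dataset="claims_history",
    granted_by="partner_insurer",
    allowed_purposes=frozenset({"fraud_detection"}),
    confidential=True,
)

print(grant.permits("fraud_detection"))   # True: the stated purpose
print(grant.permits("marketing_model"))   # False: the road trip
```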

Juan Sequeda:
But it's something for us to think about. Partha, again, this was a fantastic discussion. Thank you so much. I'm going to throw it back to you for two final questions. One, what's your advice, broadly, about life, about data, whatever? Second, who should we invite next?

Partha :
Thank you. So first of all, thank you for inviting me. I appreciate it, and thank you to everyone who's listening too. I'll tell you that I learned a lot as part of this conversation, to be very honest, because you know what? When you talk, when you share, you learn, because the questions you were asking actually reminded me to think about a few things I may not have thought through if I didn't sit like this for an hour with somebody and talk about it. So I appreciate this opportunity to talk to you and also learn as part of this conversation. I was sharing what I've learned from many of my peer groups and people who have given me a lot of advice on what to do, and I'm still learning. So if you ask me, my first piece of advice is: keep learning.

Partha :
The only thing that's constant in the world, as everyone says, is change. It continues to change and evolve, and there's so much to learn. So I would say keep learning and keep your eyes and ears open, because you know what? You don't know who's going to teach you next. That's point number one, and that's the advice I'm going to give you. What was the second question? What did you have for the second one?

Juan Sequeda:
Who should we invite next?

Partha :
All right. Okay. So I'll tell you, my network of CIOs and CDOs in the industry who are experts in this field includes so many people right now. I'm going to let them down if I don't name somebody, but I'll give you two names just for the record; there are so many great people who have been my partners, peers, advisors, et cetera. Number one, I don't know if you've talked to Cindy. She's the ThoughtSpot CDO. She would be an amazing one. She likes to-

Juan Sequeda:
I am a huge Cindy fan. Actually, she was a huge inspiration to go do Catalog and Cocktails, and I'm so excited that she'll be a guest. [crosstalk 00:55:50].

Partha :
She is amazing. She would come with a lot of energy. I know we focused today on a somewhat dry subject in the industry, which is more about being responsible, about security and all of that, but the fun subject is more about innovation: once I have this and I'm being responsible, I can use it, the magic you can do with it. I'd love to talk about that subject more, but I wanted to remind everyone: don't get carried away by innovation and forget that you're responsible. With that said, I think she could be a great asset to bring some of that thinking on how to use AI and new technologies in the data world. That's number one.

Partha :
The other person I would also recommend, if you have not talked to him, is a person by the name of Mano at Travelers, the Travelers Insurance CDO. He's a great asset. He brings a wealth of knowledge in financial services and insurance, and he'll give you a different perspective on how corporations are using data, how they're leveraging it, and the value they get out of it. He'll give you a different flavor of it, from a product company to that type of company. So I'll share those two for now. Obviously we can talk offline if you need more names.

Juan Sequeda:
That’s fantastic. Just a reminder, Cindy is actually going to be a guest on November 10th. So we’re excited and we will definitely reach out to Rumi?

Partha :
His name is Mano, so M-A-N-O. Mano is the chief data officer at Travelers.

Juan Sequeda:
Excellent. Well, Partha, again, thank you so much. This was a fantastic discussion, and cheers, looking forward to-

Partha :
Cheers.

Juan Sequeda:
… to listening to this episode again, and just a reminder, next week it's a special edition of Catalog and Cocktails because it's the Tim versus Juan edition: data mesh. Who's going to win? There's so much to cover. I think this is going to be our third episode about data mesh. Data mesh is such a hot topic everywhere; we're talking about it and we just need to go talk about it again and have a Juan versus Tim fight. Who's going to win?

Tim Gasper:
I’m going to be anti data mesh and Juan’s going to be pro data mesh and we’ll get to the bottom of it.

Partha :
Juan, I’m in your camp, man. All right, let’s do it.

Juan Sequeda:
All right. Well-

Tim Gasper:
I must have a lot of haters out there, I think.

Juan Sequeda:
No, no. I know Zhamak is probably listening, so we're here working together on this.

Partha :
All right, guys. Thank you.

Juan Sequeda:
Partha, thank you so much. We appreciate it, and have a great rest of your day. We'll keep chatting.

Partha :
Thank you. Bye-bye.
