
Episode 12: Relating to the Human-Factor of Data Through Storytelling

On this episode of In:Confidence, Theresa Kushner, AI/Analytics Consultant at NTT DATA, joins us to share her perspective on data analytics and how it can be applied to solve real life business challenges. Theresa is passionate about helping companies derive value from data and her unique background in journalism helps her do just that.

Listen now


Tina Tang

VP of Product Marketing at Privitar

Theresa Kushner

Consultant, AI/Analytics at NTT DATA



Intro: Welcome to In:Confidence, the podcast for data ops leaders. In each episode, we ask thought leaders and futurists to break down the topics and trends concerning IT and data professionals today, and to give us their take on what the data landscape will look like tomorrow. Let’s join the data conversation.


Tina: Welcome to In:Confidence, the podcast for data ops leaders. My name is Tina Tang, and I’ll be your host. Today I am welcoming Theresa Kushner. Theresa is a passionate advocate for data analytics and how it gets applied to real-life business challenges. For more than 25 years, she’s been leading companies like IBM, Cisco, VMware, and Dell EMC in recognizing, managing, and leveraging those companies’ data. Today, she’s the Data and Analytics Practice Lead for NTT DATA. She’s consulting with clients on how to gain value from data and information, specifically on how those companies apply AI and ML ops and overall governance to their everyday challenges. And for her efforts in leading analytics, Theresa was inaugurated as an analytics practitioner into the Analytics Hall of Fame at Pace University in New York in March 2019. Welcome, Theresa.

Theresa: Oh, thank you very much, Tina. Appreciate the intro. 

Tina: Yeah. And, you know, let’s spend a couple seconds here and just get a little warmed up. Okay. These are very deep philosophical questions. Red or white wine?

Theresa: Red. 

Tina: Okay. And just for the record, what kind?

Theresa: Cabernet. 

Tina: Good choice. All right, Niners or Cowboys?

Theresa: So hard for me. Cowboys, I have to say Cowboys. I know, I was a Niners fan for a long time.

Tina: I know, I know. But you know, you’re like me. We have dual citizenship in California and Texas, right?

Theresa: That’s exactly right. Yeah, dual citizenship. I like that.

Tina: Okay, here’s another one: steak or salad?

Theresa: Steak.

Tina: Cats or dogs?

Theresa: Dogs. I’m a dog lover.

Tina: Okay, this one is more fun: true crime or sci-fi?

Theresa: True Crime. 

Tina: I listen to true crime podcasts.

Theresa: I love that darker side of humanity.

Tina: Yeah, yeah. Though I am a huge sci-fi fan as well. I’m really geeky about it, and I’m not gonna say sorry. So.

Tina: So, you know, let’s get started. Let’s get into this. What does it mean to be a data evangelist?


Theresa: Ah, such a good question. Somebody, I think it was you, asked me that. And as a result, I wrote a blog on what a data evangelist is. I did a lot of research about what an evangelist, or a disciple of some sort, really is. And it’s someone who is so passionate, they have to tell others. And so that’s why I’m a data evangelist: because I’m so passionate about how you use data in business that I have to tell everybody.

Tina: That’s totally accurate. From what I know of you, that fits you perfectly. So this is a quote that was written about you: “Data scientists and analysts are denied access to the data that they need to build the right algorithms. Making sure that your analysts have up-to-date hardware, current software, and access to data are basics to the success of data analysts.” What are some reasons that data scientists and data analysts are denied or delayed access to data?

Theresa: I think people are afraid that the data will contain information that they’re trying to protect. Data privacy is a big deal, especially when you deal with things like health care or health plans, or anything financial. So they’re not so quick to decide that a data scientist needs to get in there and play around, and a lot of times access is denied because of that. I’ve worked in companies where HR had that opinion, where you couldn’t go into HR at certain levels and see data. And I understand, but if you’re going to do something with artificial intelligence to make sure that HR is recruiting the right people, promoting the right people, doing all those things that AI can help you do, then you have to give them access.

Tina: Are there safe ways of doing that, ways to do it but still retain the analytical value of that data?

Theresa: Yes. Well, you can mask out salaries and names and just keep positions and locations and all the other stuff.

Tina: So would you say, is this a true statement: there are no technical barriers to data access, only organizational or behavioral ones?

Theresa: Yeah, that’s it. That’s about it.


Tina: I remember one conversation we had recently, where you remarked that there are a lot of data owners out there, people who are responsible for some kind of system of record, who have this sort of protective stance around their data: it’s my data. What are your thoughts about that? How are they justified? Is there a greater-good component, or is that too simplified? It’s not a black-or-white situation. So Theresa, for the data owners that are very protective about their data, is this a matter of them standing their ground for political reasons? Or is it not such a clear case of “no, I don’t want to play ball”? Is there more that needs to be realized in this situation?

Theresa: It’s human nature. We’ve spent years saying that information is power, and data can be turned into information. So if I want to hold on to something, if I’m the finance guy, the CFO, and I want to make sure that finance is the cornerstone of the company, the first thing I can do is harness all the finance information. And I’ve seen that before. Yeah, they want to do it for the greater good, there’s this thing they’ve got to do for the greater good. But sometimes it’s just too difficult. You know, if you’re gonna make me change my data in order to connect to your data, then what’s in it for me? And that “what’s in it for me” never gets articulated well enough, or doesn’t often get articulated well enough. That’s one of the things that the data people I’ve worked with don’t do a lot: we don’t spend enough time building the case for why data is important to everyone we’re talking to. I was actually at a conference last week, where one of the guys that used to be on my team is now the Chief Data Officer at a credit union. They were having a conference where he was talking about the value of data, and everybody in the room understood that they played a part in the value of that data. And he has spent years trying to get them to that point. So kudos to him for doing so. But it is a very long battle. And what we as technologists seem to think is that we can just apply technology to this in some way, and it will happen. And it won’t; it’s human nature. You know, one of the very first questions I got when I spoke at his conference was: how do I get judged in this program? How are you going to look at KPIs for me? Because that’s exactly what everybody’s thinking. Even though they’re nodding and saying how wonderful it would be to have all of this together, they’re thinking, what’s in it for me?

Tina: I’ve heard that before. I heard that from a Gartner analyst, actually, just the other week. He said that is a key component of any successful initiative: what’s in it for me? How are you going to make me look good? How have the organizations you’ve worked with been able to overcome these types of hurdles? Or have they? I mean, you gave that example, and it took him many years.

Theresa: Yeah, I think they do it, it’s just water torture, you know, a drip at a time, a drip at a time. I think that’s what you have to be: persistent. That is like one of the key things I always look for when I look for data management people. Are you persistent? Did you just do one project and back off? Or did you keep at it? I think that’s one of the key things you have to learn: you have to be persistent. And again, at every turn, tell someone why that’s important for them.


Tina: So there are a lot of one-on-one connections even, right? Like on a human level, right?

Theresa: On a human level, yeah. And you have to have a lot of support to do that. My friend, where I spoke at this conference, his CEO was behind everything he did, and stood up and made comments after his presentation about how great it was. You didn’t have any doubt when you left the room what was important to the CEO.

Tina: Excellent. Okay. And are you able to share, when this person would evangelize, did they use a lot of data to evangelize?

Theresa: Interestingly enough, no, because it doesn’t take a lot of data, it takes a story. And one of the stories, I love this story, he’s probably going to kill me for telling you this, but one of his stories was about answering a service call. The person said that her dog ate her ATM card. Okay, well, the whole thing gets mixed up, because the chatbot doesn’t know about a dog eating; it only knows about an ATM card. So the chatbot gets all confused about it, which is something that happens in the day-to-day world. So it was kind of interesting, because that story sort of led through everything. And if you can find a story that you can equate some of your problems and data to, it goes a long way. I just finished doing some AI governance work for NTT. And one of the things, when it came to what kind of team you create for AI: I put a storyteller on every team. Because they have to be able to tell what happens. Okay, we’ve got this great algorithm, but when I put it in place, what’s going to happen? And what’s the story I’m going to tell you about that happening? And that means a certain sort of communication. A lot of people don’t understand artificial intelligence. I mean, when I say AI, I am quite sure that the image of the Spielberg movie and the little boy comes to everybody’s mind. That’s what everybody thinks AI is. But the fact of the matter is that AI is around you every day. That’s Siri and Alexa and, you know, all of the recommendations you get from Amazon. It’s already there. Now, are there some things that can be a little scary about that? Absolutely. In fact, that’s the creepiness factor; we all have to kind of watch that. But you’ve got it already in spades. Now, what’s it going to do for your life going forward? And how are you going to put it into your business and incorporate it in the right way? And you have to have people tell you stories to do that.


Tina: And, because I already know this about you but our audience doesn’t: you actually have a background in journalism.

Theresa: I’m not a technologist. But you know what journalism taught me? To ask a lot of questions. And I’m just an avid learner. I mean, I love to learn. And so asking those questions always gave me fodder to go deeper. And I tend to be able to say, okay, if you can tell me what it is, then I can probably tell anybody. But one of the things I’ve discovered is that data scientists often don’t know how to tell people what they do. They’ll take you through random forest capability, and they’ll talk about SHAP, and they’ll do everything except tell you what it is they’re trying to do and what it is they’re trying to accomplish. And that needs to be pretty simple, because the people they’re talking to don’t have that kind of background.

Tina: Right. So it comes back to the human factor. And humans communicate best through stories.

Theresa: Exactly. More importantly than communicating best, they remember best through stories. Your retention of what you heard is a lot higher if there’s a story associated with it.

Tina: So interesting. Yeah, that’s using our human faculties to influence.

Theresa: We’re tribes, we’re tribesmen, you know. We might as well still be sitting around the fire, talking about what we did during the day. But it’s that storytelling capability that we’ve learned from the very beginning. I mean, I grew up in the South. I couldn’t talk to my grandmother about what she did at the grocery store without getting a story. That’s just the way it is. You’ve got to tell stories. That’s how it happens.

Tina: Well, I mean, in a lot of ways, that’s what social media is for the new generations, right? To tell stories. Short stories, very short,

Theresa: very short stories, 

Tina: but they are stories. 

Theresa: That is true. I just heard something today about the Gen Z generation, about their use of memes, that the use of memes is becoming a really hot thing. I don’t know what that means. I thought it was bad enough that we had to put everything in 140 characters. Now a meme that lasts two seconds? That’s even better, right?

Tina: But you know, when you just said memes: I love memes. I love a good meme. I’m down any day for a good meme, because it’s like a shorthand for a story that we can relate to. And going back to your comment about sitting around the fire: we are still tribal, right? And so the meme is sort of the modern facilitator of these shared stories. But, speaking of generational issues, and social media for that matter, do you think that privacy is a generational issue? Any detail you want to provide about that?


Theresa: I think that the boomers and all of the folks that came before 2000 are probably pretty aware that there’s this thing, identity theft, and all the things that happen on the web. The Gen Zs don’t really care about that. They do, to a certain degree, think it’s important, but they also expect that all the apps that are out there, and everything else they engage with, will protect them. They have a different view of that technology than other people do. So they are a little bit more understanding. And what’s happening is that they’re pushing. For example, in finance, the Gen Zs are pushing banks to do things they never thought they’d do before. Like taking a picture of your check and depositing it without sending it to the bank. Or being proactive: if I’m trying to save to go on my next trip to Aruba, how does the bank help me do that? So those are the kinds of things they’re pushing. They’re pushing people, right?

Tina: Pushing, pushing the expectations of experience. 

Theresa: Exactly. And customer experience has been set not by the guys that have the technology, like finance, but by the guys like Amazon and Netflix, people we have a certain expectation of customer engagement with through our technology. Amazon has just set the bar so high that all of us that are in the b2b world, or in the b2c world in a different aspect, have to sort of abide by that. We have to figure out how our service is going to measure up to Amazon’s, or how our recommendations are going to be as good as theirs.

Tina: Though I don’t think that the privacy regulations are going away. In fact, just the opposite, right? They’re growing. How do we rectify these two almost opposing energies here?

Theresa: Yeah, it’s gonna be really hard. Well, one of the ways, and I know this is really hard for people to think about, but one of the ways is to give it back. It’s like medical records. The United States is behind on this, in a way. When I lived in France, you owned your medical records; you were responsible for them. If the doctor asked you what your last X-ray looked like, you had to bring it to the doctor. They didn’t go to a system somewhere and pull out all of your X-ray records and send them around the way they do in the States. You were responsible. Now, if you take that concept and look at data itself, all the data about you should be yours. And you get to decide who can use it, and which portions of it they can use. Now, that creates huge problems for marketers that are depending, for example, on intention data, because people like Amazon and Netflix and all the guys that are making recommendations to you are looking at data from what you click on, and they’re making assumptions, based on neuroscience, about what you might have wanted to select or what your intention was when you went there. Okay, that’s fine. But that kind of eliminates some of your privacy, if you’re looking at intentions. Even if you don’t know who I am, you know what I might be intending to do. Now, that’s probably pretty good if I’m the Tops supermarket shooter and you can figure out what my intention is before I go. But that’s not the way this is working. Instead, they’re figuring out what your intention is to buy the pink shoes or the yellow shoes, or which ones they should present to you. That’s where the intentions are coming in. But I do think if we gave back to people the ability to manage their own data, we might have an answer to this. Now, that’s not going to be easy.

Tina: Can we be trusted with our own data, though?

Theresa: Be trusted with your own data? Exactly. Can you? I don’t know. There are some companies out there, for example, I know one company that keeps all the information about intentions and actions on your phone and never allows anyone to see that information. But they also allow the ability for someone to send a message to me about a coffee shop around the corner when I’m nearing that coffee shop. Now, they don’t necessarily know who I am, but they would be the beneficiaries of having someone visit the coffee shop. So there are some technologies where people are actually looking at that, to protect the privacy that people so value. I don’t know if you’ve tried this or not, Tina, but have you tried taking yourself off of the net? Have you tried just saying, I don’t want to be on the internet anymore at all?

Tina: I’ve never had that thought. Not only have I not tried it, I’ve never had that thought.

Theresa: But you know, go try and take yourself off LinkedIn. You can’t. Or Facebook. You can’t.

Tina: Well, they follow you anyway. I mean, even if you’re not on their platforms, they still follow you through their affiliation links, etc.

Theresa: Yeah, it’s kind of scary, in a way.


Tina: But I mean, many companies, in Europe, for instance, and now in Singapore and the Kingdom of Saudi Arabia, they have pretty well developed privacy regulations. And so companies that are based there already have that mindset. They don’t necessarily consider it an obstacle; it’s just the way they do business, the way that they respect personal data, or not even just personal data but sensitive data, like financial information, right? Geolocation information. So for companies that are not based or started in those geographic regions: say maybe I’m a company that was started in the United States, and maybe I have an electric scooter that I’m developing, and it’s become really successful in major metropolitan areas of the United States, and I want to take my business to Singapore. Now, my intention is to grow my business. I’m entering a new market, which has data sovereignty rules, right? And jurisdiction rules that don’t exist in my home market. So it’s not a compliance-driven reason to look at data privacy and data ethics; it’s a growth-based initiative that is driving the need to understand how that translates into my data strategy, including where I store my data, how it’s accessed, in raw form or de-identified form, etc. I mean, how many companies are going through this right now? Have you seen a lot of this?

Theresa: So many of them. And it perplexes all of these companies, because, you know, even in the United States, all of the privacy laws are now being done state by state.

Tina: in true American fashion.

Theresa: Exactly. So how do you decide? I actually had a guy I work with, who owns a business in Houston, and he told me, when the law came out in California, for example, I said to him, do you market in California? Oh, sure. Well, are you going to do anything about this? Oh, no, I’m in Texas; they’ll never figure that out. And I’m going, how is that helpful? So I think that we have this concept of what our community is. And when we go to other communities, and they have different laws, we don’t necessarily always understand that. I just finished an analysis of how many states have enacted AI legislation. In other words, how many states are even talking about artificial intelligence and controlling it in some way. Over the last three years, only 17 states have done something about it. None of them have said, you know, you can’t do it this way. California comes the closest; they requested the federal government to take it on. But nobody’s thinking about that. So, and this is a great story, you’re going to have driverless cars in advance of the systems required to manage them, and that’s going to be a problem. What happens if a driverless car hits someone? Whose fault is it? Who does the policeman call when he arrives at the scene? We don’t have systems to handle that.

Tina: That has a lot to do with how those systems are trained. When you’re doing the computer vision model, what data are you using to train that model? There is a person behind the training, and if they don’t select the right data sets for the training, then that model has a bias, potentially a lethal bias.


Theresa: Well, and a lethal bias. I mean, the Amazon people spent four years, I believe, trying to determine an algorithm for selecting resumes. After four years, they discovered that they weren’t making any progress, because they weren’t getting any diversity in the resumes that were being selected, because, guess what, they were using their last 10 years of data, and they had not been very selective. And they actually killed the project; they don’t do it any longer. Because it is a matter of what it is that you’re trying to collect. And that’s one of the reasons why diversity has come up. In every AI organization, you need to have diversity. And I’m not talking about men or women, or black or white, or anything like that. I’m talking about diversity of thought primarily, and mostly diversity in the way you approach problems. Because the more you get people that are really different in the way they solve a problem, the better off your solution is going to be.

Tina: Yeah, I totally agree. And in my previous lives, and in my current profession at my current company, Privitar, I’ve noticed that diversity, like you said, is about thought. Your thought process, your approach to solving problems, differs because of your particular experience of life, not necessarily just your gender, or just your skin color, or sexual orientation, or even where you live. A lot of different variables come together to form your life experience, which forms your thought process. So it’s such an interesting development that a lot of companies, and people who are responsible for building high-performance teams, are coming to realize that it’s not just about what school you graduated from, or how many years you have been doing this. It’s really much more subtle than that.

Theresa: It’s making finding the right people for all the teams that we need very difficult, because it’s not just finding the right person; it’s finding the right person that mixes with the other three people you have on the team. It’s that combination that’s really, really critical.

Tina: And I find that’s true in a pickleball game too, you know?

Theresa: Absolutely. Absolutely. That is right. It makes a difference, always.

Tina: that one person who’s you know, super serious..

Theresa: That’s a person I know. Oh, that’s funny. Yes, it’s really important.

Tina: So Theresa, what’s a commonly held belief about data and analytics that you passionately disagree with?


Theresa: The commonly held view that data is a project, that you can get it all done, you can fix it in one project, and no matter how much it costs, $10 million, it doesn’t matter, you can fix it. No, you can’t. That’s my absolute big one: data is not a project, it’s a process. And if you don’t want to continue it, don’t start it. That’s my number one. About analytics: the one thing about analytics, and I saw it just yesterday, is that we get much more involved in the way it looks than in what it says. And we ought to start first with what we’re trying to say, and then dress it up, instead of creating the dress-up and putting the data in it. I saw something yesterday where one chart had three bars, and the third bar was a total. And it was right next to another chart that had the same three bars, but the third bar wasn’t a total. And I thought, this is really not the way we’re supposed to be doing this. So we don’t think about, again, that’s a story. All charts are a story. So what are you trying to say? And we never just back off and go, okay, what is this really telling me?

Tina: So next question is, there are many challenges and opportunities around data and its use, how would you describe the ideal approach an organization should take to ensure safe and secure use of their data?

Theresa: You know, I know this is a hackneyed response, but collaboration is really, really important in all of these efforts. And to ensure that you have success with data, you have to have everybody understanding: oh, this is what we’re talking about. This is my portion of what we’re talking about, this is your portion, and we’re going to make all of this work together. That arrangement upfront is the most important thing you can do.

Tina: Great, cross-organizational stakeholders.

Theresa: Exactly. People who understand: oh, I’m in order entry, and if I commandeer a field to put in something I want to track, that’s gonna hurt the guy who has to ship the product, it’s going to hurt the guy who has to look at sales for that product. Now, there’s this new concept in data, and I’m sort of anxious to see how it plays out; Gartner talks about it a lot. It’s called data mesh. In the data mesh world, data itself is owned by someone, so that data becomes a product, just like a product marketing person has a product, except that product is data. And then I start to look at: oh, is my product of a quality nature? Is my product available to the right kinds of constituents? How do I have to change it to make it so? So with that product concept in mind, I think that data could move a lot farther toward being something that we use internally. And I think then we would stop fighting over who’s going to use my data, who’s going to access it. If I can have full access, and you’re responsible for making sure that my access is clean and that it’s protected, because you own the data product, that’s probably a good thing. Yeah.


Tina: Yeah. So of course, data mesh is the data architecture that was founded by Zhamak Dehghani of ThoughtWorks. And her understanding of the data landscape, and the downstream uses of data as well as the upstream sources, is just phenomenal, and it’s gaining so much traction. I’m really glad you brought that up. Do you think that her concept will bring people to the table? The people we mentioned earlier, who were holding their data close?

Theresa: I think ultimately they will. The problem is, the people it will bring to the table are those guys who have data lakes, and data warehouses, and SQL databases sitting under their desks. It will bring them to the table, but they’re still going to have to understand that it’ll take special people who can manage the product of data. That’s where I think we’re not necessarily training the right kinds of people yet. I’ve read her paper twice, and I just think there’s some naivete in what can be expected. But I think it’s a great first step to think about what you would do if you actually managed data as a product. Very cool.

Tina: Let’s talk about trends. Well, I guess data mesh is, is one trend. Are there any other trends that you’re seeing in the market in this space?

Theresa: You know, the biggest one, the biggest one and the one that just drives me up a wall, but I think it’s probably right, is that the Gartners and the Forresters of the world are labeling everything artificial intelligence. It’s sort of like they’ve created an umbrella, where everything that has to do with analytics, or data, or intelligence of any kind is artificial, so therefore it’s all artificial intelligence. And you’ll even see in some of the Gartner stuff where they’ll go: artificial intelligence, AI, ML, NLP, business analytics; they’ll just list everything. And you go, okay, fine, I give up. I think it’s because AI is such a hot topic everywhere that they’re just sort of using that to talk. But you get into Forrester, where they talk about automation, and how automation and data intelligence are coming together as well. So you don’t have these strict disciplines in everything anymore. It’s becoming much more a continuum of: what am I doing, really, with this data? And how am I managing it?

Tina: I mean, speaking of this increased interest and maybe overly broad use of the term AI, one of my favorite quotes attributed to you is this. I’m just going to read it here: “Our problem is that AI, just like a newborn baby, is highly dependent on the data it is fed from transactions and interactions that the model might have. If you teach a young child prejudice, the actions taken by the child are prejudicial. AI is no different. It does unto others what it is taught to do.” So given that statement, how would you describe what ethical AI is?


Theresa: Such a good question, because I just finished some work on what we call trustworthy AI, the assumption being that if it’s ethical, it will be trustworthy enough. Trustworthy AI, again, depends upon something very basic, which is transparency. Where’s that data coming from? What is that data? What’s the quality of that data? How is it being used in the algorithm? Because there are two ways that an artificial intelligence algorithm can get off track. One is in the data that’s collected: just as you said, if it becomes prejudicial, then that’s what happens with the algorithm. But the other is in the algorithm itself; it can get out of whack. It may be handling the data appropriately when you first start, but over time it might not. And one of the things I constantly tell people is that when we’re looking at AI governance, we have to look at the continuum from where AI starts, and how the data gets managed, all the way to where I use it and how it’s being applied. Because that’s where the ethics comes in. You can have the greatest algorithm in the world, but if you’re applying it in a situation that is not ethical, then who’s to blame? Is it the AI? Or is it your application of it? I was questioning somebody the other day. I sit on the data board at SMU, Southern Methodist University, and I was asking one of the guys: so, do you teach a class on ethics? Just an ordinary class? Do you require that the people who are in your program go to an ethics class? They only recommend it; they don’t necessarily make it a big deal. That’s probably something that we need to have included in our curriculums for data science.

Tina: Totally agree

Theresa: The ethics, yeah. Because it is just too easy for a data scientist to skip over that part about the data. That’s not their purview. The data engineer is becoming much, much more important, because the data engineer is getting that data and managing it. But somewhere along the line, someone’s got to be there to ask: well, when was that data collected? Was it at a time when, for example, we collected data on all the men in this category, but we didn’t collect any women’s data? And now we’re going to be making products that are supposed to be for both. How do you know? Who is the person that stops that? And that’s why, with the teams you put together, having diversity on those teams is so important, because somebody will recognize that that’s not there. And my favorite story about that is that the team that created the first Fitbit did not have a woman on it. And you know how you could tell? Because there is nothing on the first Fitbit that manages anything about a woman’s cycle or a woman’s physiology.

Tina: Even the look of the first Fitbit. All right. Yeah, and we all know how that story ended. So hey, Theresa, it was a real honor and pleasure to talk to you. It’s so fun. I love our conversations, and I really appreciate you taking the time to come on to this podcast.

Theresa: Well, I am so thankful that you invited me and it was a lovely conversation. I love talking to you, Tina, thank you so much.

Tina: Likewise, thank you. Thank you for joining In:Confidence.


Outro: No matter where you are in your data journey, Privitar is here to help. Privitar empowers organizations to leverage their data to innovate faster, while protecting the privacy of individuals at massive scale. Privitar is unique in combining technology, thought leadership, and expert services to help your data operations thrive. Want to learn more? Our team of experts is ready to answer your questions and discuss how data privacy can fuel your business. Thanks for listening to In:Confidence, brought to you by Privitar. To hear more insights and advice on how to effectively use, manage, and protect your data, subscribe to the show in your favorite podcast player. If you liked the show, leave us a rating. Join us for the next data conversation.

Ready to learn more about Privitar?

Our experts are ready to answer your questions and discuss how Privitar’s security and privacy solutions can fuel your efficiency, innovation, and business growth.