Episode 119: You Need a Data Strategy with Immuta's Dan Wu.

What we can learn from the botched launch of the Apple Credit Card 🍎💳
01:38 MIN
How customers perceive your product is just as important as the truth behind what your product actually is!
01:23 MIN
The key components to developing trust with your customers 🔑
00:36 MIN
When thinking through the ethics of data use you need to think about what will both protect and power users in the long term 💪
01:13 MIN
Warning: This transcript was created using AI and will contain several inaccuracies.

If you read an interesting piece recently on TechCrunch or LinkedIn about purposeful data strategy, one that got you thinking about how a focus on trust could be advantageous to your company, you might have been reading a piece by Dan Wu. Dan is the Privacy Counsel and Legal Engineer at Immuta, a company that provides software focused on managing access to, regulation of, and protection of data. But that's not all there is to Dan. You see, Dan leverages both his Ph.D. in Sociology and Social Policy and his law degree, because he didn't think these two fields worked closely enough together. He wants to see people and their data protected by design. So today, some big ideas that may change your thinking about data stewardship, including privacy, governance and building trust. I'm Jon Prial, and welcome to the Georgian Impact Podcast.

So I love what you're saying, Dan, about how this has to start at the C-suite level. Can you look back a little and take me through some of the biggest mistakes that you've seen CEOs make?

Absolutely. The first thing to point out is that this is an extremely hard job for the C-suite. We're living in a world where everything is changing rapidly, but that's all the more reason for CEOs to look around the corner and learn from best practices. So I want to bring up the recent issue Apple had with its product launch with Goldman Sachs, where they were launching a new credit card. I think it's an interesting case because Apple has set itself up as a privacy leader. It's really pushing this idea that it keeps your data secure and private and doesn't share it with other people, and it has had highly public arguments with the FBI and the federal government. Yet despite that, at the launch of this product, Twitter exploded with concerns that Apple's product was biased against a certain demographic.

Particularly women. It actually launched a federal investigation. And you might argue that for Apple, which is a master at product launches (just think of Steve Jobs in his turtleneck and everyone going crazy), this was probably one of the worst product launches in a while. It really shows that if we don't get this right as organizations, even ones as competent as Apple or as familiar with financial regulations as Goldman Sachs, even if you're checking all the boxes, you might get angry Twitter users critiquing your product and even provoke a state-wide investigation into what you're doing.

So it's interesting, because you're right: Tim Cook declared privacy a priority. Apple really hung its hat on privacy, and he probably sat at home at night thinking, yeah, I did a good job, I got this privacy thing right. But he missed something at a much deeper level. He didn't ask whether the algorithm was biased or not.

So do you have to think about a data strategy, then?

A hundred percent. I think you're exactly right that it has to go beyond just thinking about privacy. The point of privacy is to protect people, and people also need to understand that. So a data strategy is really about thinking not just about how you protect people, but about how you use data in ways that benefit people in a safe manner. And regardless of whether or not there was bias (I think it's very likely that there wasn't), people perceived bias. It's the experience of using the product. In the Apple case, people felt it was biased. The person who complained described how he bounced from customer service agent to customer service agent, asking: despite a similar net worth, why is my wife's credit limit 30 times less than mine? He wasn't able to get any clear answers. It was a terrible

user experience for him, and as a last-ditch effort he posted about it on Twitter, where it erupted. I think it was shared something like thirty thousand times, and Apple finally responded. So it comes back to building those relationships with users and cementing that trust, regardless of whether or not you're checking the boxes.

So it's interesting. Privacy, we understand; I'm not sure everyone does it well, but we understand it, and it goes hand in glove with security. But you brought it back to trust. Now, isn't one of the key foundations of trust transparency, whether it's sharing what's in the algorithm or putting a better script in front of the customer support teams? How does transparency play in this game?

Transparency is a huge component of it. That's why there's that bill in Congress, the Algorithmic Accountability Act. We don't have much transparency into algorithms right now in the US,

the exception being credit scores. Outside of that, there isn't really a legal obligation for people to be transparent about their algorithms, to help people understand how they work, or, more generally, to let people object to the results of algorithms in a systematic fashion. Outside of the US, the GDPR, the General Data Protection Regulation, does have provisions for that, but it's not embedded here. So that's just another example of how privacy is one component of a larger strategy around how you use your data and how you communicate the way you're using it to key stakeholders. Another component, as you just mentioned, is transparency. I think another component is making sure you're catching these mistakes early. How are you working with users to catch, hey, maybe some people might perceive this to be biased against women, or actually, maybe it is biased against women? How do we catch those things early on and build a brand and communications strategy

that fosters and cements that trust?

You mentioned regulated industries. Does the industry matter in terms of how high up this is on the heat map for a CEO?

Absolutely. I think it comes in two ways. I remember reading a study, I think by Accenture or Deloitte, finding that there are industries where people are most willing to change who they buy from, and those include retail and other, I guess you could call them, more interchangeable industries. In finance and healthcare it's harder. At the same time, that's because they're more highly regulated; there are more controls in place in those industries that create these protections. For instance, in finance there are audits that have to be in place, which give consumers a sense of trust that the way a company is dealing with its data is on par. So absolutely, that is true. Additionally, if you

mess up, the consequences are huge. Under HIPAA, for instance, there are some cases where intentionally or severely negligently violating privacy, or mishandling protected health information, could actually result in criminal sanctions. So there's a reason those industries are seen as more secure. But at the same time, my concern is that this also has to be balanced with innovation. As much as we want to protect people, we also have to think about how we design regulations so that we're not stifling small and medium-sized businesses. How do we make sure that innovators can still innovate? That's a tricky balance we always have to think about when we advocate for privacy or other types of data regulations.

In some of your writings, and this is kind of neat, you often talk about an offensive data strategy (being on offense, not one that offends people) built on analytics and predictions, versus a defensive data strategy: data protection, governance and the like. So when we get to this level of thinking about the product, it's really part and parcel of a much higher-level business strategy, then?

Absolutely, and I think that goes back to what we were saying about Apple and Goldman Sachs: it directly affected their product launch. They put millions of dollars into this and fostered this really high-stakes partnership, yet it resulted in something that was really problematic for the company. So we have to think about data across its whole life cycle, not just in the hands of, you know, the lawyers and compliance officers and governance people, who are doing really important jobs. It has to connect to products. It has to

connect to the way we're using data, and it has to connect to user experience, balancing, you know, how do we protect data but also use it in responsible ways? Because that's ideally one of the goals of a data strategy: we're ultimately trying to improve people's lives. If that doesn't go hand in hand, if you're not thinking about how to safely use data so that it maintains trust, then you're missing half of the picture. And I will just add that I didn't come up with those terms. There's an excellent HBR article called "What's Your Data Strategy?" where this idea of offensive and defensive data strategy comes from, and I've incorporated it into my work. But again, if you're in this world, you have to work with the C-suite, they have to be bought in, and you have to work cross-functionally across the design cycle of your products and services to make sure you're aware of that balancing act between protection, innovation and user experience.

Okay, we'll put a link to the HBR article in the show notes, and thank you for citing your sources.

It actually fits so well with how we're thinking about things. So, we've talked about trust; let's talk a little about accountability. My example, the one I find myself a little troubled by (I'm not upset, and I think it's probably correct), is that the U.S. filed not just sanctions but criminal charges against China for the Equifax hack. But I don't want people to sit back and say, you see, it was the Chinese. Didn't Equifax violate some of its accountability to people by not preparing for attacks? What comes first, the chicken or the egg, here?

Absolutely, I think that's such an excellent point. I'll first admit that I don't know too much about the politics behind the suit against the Chinese government, but the premise of what you're saying makes a lot of sense: if we're not also holding accountable those who are responsible for the breach, are we really going to have change

if we're only pointing fingers at international actors who potentially did a really bad thing? So I agree, and I think that's why there's some legislation advocating the idea that if you're responsible for a data breach, maybe when it's extremely negligent, just as under HIPAA, the healthcare privacy act, criminal sanctions should also exist in other sectors. We're seeing some countries, like China and South Korea, adopt some of those more extreme penalties. I don't know if the U.S. wants to go there, but I think this question of making sure people are also held responsible for the data they hold is a really good one.

I think it's an interesting observation and something everyone should think about: if there were harsher sanctions on boards for inaction, they might change the way they do things. I don't believe we've seen enough of that. I mean, there are the delete-Uber and delete-Facebook campaigns that run for a few weeks and then die off again.

I don't think we've seen people rebel yet. But we strongly believe here at Georgian, with our focus on trust, that it could really harm a company, and the risks to companies are quite great. Can you chat a little about some of the risks you've seen, maybe in fintech?

Yeah, absolutely. There's a huge variety of risks. For instance, there's the risk of a biased algorithm, and I think that's the Goldman Sachs story again: regardless of whether or not it was biased, there's also the perception of bias in that algorithm around sensitive things like credit limits. There are a lot of other types of risk, such as sharing data with the wrong individuals. There are some really interesting stats here: I think the EU Commission said that the primary reason people are getting fined right now is not third-party breaches; it's insider incidents, employees or third-party vendors that had incorrect access to data or negligently shared that data with the wrong people. And I think that is only going to

get worse as we have things like the California Consumer Privacy Act and GDPR, under which people can now request their data, and there are now lots and lots of different apps for people to request data from companies. I think those mistakes are only going to increase exponentially; more and more unintentional breaches will happen in that regard. Outside of that, I think the most classic example was one of those large consulting firms that regularly advises banks and financial services firms, where they simply misconfigured their cloud storage, so anyone could access really sensitive data in plaintext. It's really shocking how much that still happens. People not understanding how to configure cloud environments correctly, especially if you have many cloud environments, especially if you have hybrid environments, is a problem that's only going to grow as more and more people adopt them.

I just thought it was ironic that it was a consulting firm,

one advising on this very thing, that was itself making that same basic mistake.

And so, just to sum up, I would really advise: let's get the basics right, things like securing cloud environments and insider access. That would account for a lot.

Interesting. I like that, and it all wraps into accountability: who's accountable for what. The more different piece parts are in play, particularly with cloud solutions, the harder it gets, but you can't forget it. Can you talk to me a little, and we'll get a little more ethereal here, about your view on data ethics? What are your thoughts around that?

Yeah, absolutely. It's a very tough and broad area, but I think it's the future of where a lot of these conversations will have to go, because my perspective is that simply doing what's legal is not going to establish trust. To bring up Apple one more time, because I think it's the clearest example: you might have done

everything legal. It's very likely there wasn't even gender bias, but you still lost trust, even though you checked all the boxes. How do you make sure that doesn't happen to you? I think you have to go beyond what the law requires of you, and there are a couple of things you can put into your toolbox to understand where to go. The first is: where is the world going? We can learn a lot from GDPR; in some ways they're ahead of us in thinking about data regulations more broadly, for instance measures of accountability for algorithms. That's a big part of it, and it's something we're missing in much of the US. So it's about not just doing what the law tells you to do, but also doing what you think will protect users and empower them in the long term. Beyond rules, which have their limits, there are general principles that range from academic studies of ethics, to bioethics, which has a pretty established tradition, to the various ethics codes that lots of companies are putting out. I like

bioethics because it has an established tradition, and it really centers on a few central points: beneficence, you know, creating benefit; non-maleficence, not creating harm; justice; and autonomy. Those are big-picture items that our current data regulations may or may not cover, but they've been hashed out in the medical community, especially after lots of really terrible incidents like the Tuskegee experiment, where a government agency was trying to develop innovative vaccines for syphilis but, in doing so without good, accountable ethics codes, ended up causing harm to a vulnerable community, African-Americans. Those are the checks we have to keep in mind: as exciting as it is to create these new innovations, who are we leaving behind, who are we potentially harming, and how do we make sure that doesn't go down in the history books? The last component of the research is the critiques of ethics, and I think that's

where a lot of this field is evolving. Some of those critiques include, for instance, that ethics is unaccountable. You see this especially in a lot of corporate statements where people are creating ethics codes; people call this "ethics washing," like greenwashing. You show that you're ethical, but are you really being ethical? What's the proof? Going back to your point about accountability: where's the accountability for the ethics? So that's one of the big critiques. The second is that, because of the nature of academia and these conversations, ethics might be interpreted in Eurocentric or otherwise narrow ways that exclude other viewpoints. And the last of the critiques, again going back to Tuskegee, is about who gets left behind and who gets to make the decisions around what to prioritize; these questions of power and inequality are becoming a more vocal criticism in the ethics conversation. One thing I will encourage everyone to read is

AI Now's 2019 report, where they review lots of different ethical and inequality risk factors that come out of artificial intelligence.

I like this. I also like the thinking that it's not a checklist. "Ethics washing" has got us; the latest thing was greenwashing, and now it's ethics washing. You're right: if a chart is coming up to the board showing the strategy, data ethics or privacy or trust (it probably should be called trust), and it's just a bunch of checkmarks, we've failed. There has to be a narrative, there have to be examples, there even has to be ROI back to the business, in terms of the downside risk of losing customers, or the downside risk, if we get to the point you've spoken about, of lawsuits and fines and penalties. It's got to be bigger. This is great. Let me step back a little, and we'll talk about something a little broader: data stewardship.

You think of it in terms of three elements: people, process, as well as technology. Can you walk me through that a bit?

Yeah, absolutely. This is just a framework around change that I bring to a lot of different organizations. People is just the idea of: what do people need, what are they scared about, what are they currently working on? Because change fails all too easily. And, you know, to give the larger perspective and context: most people aren't doing data strategy in the comprehensive way we're talking about, and even less so the ethics. A lot of this will require change, and we were just talking about an ideal world, so we also need a theory of change: how are we going to start equipping people to make change in their organizations in an effective manner? That's why this framework exists, to help equip people to make these changes in their organizations.

But you have to give people context; they've got to have the context to do their jobs, and that's got to roll down from the

top.

A hundred percent agree. And one of the classic ways for people to change, and this is why you see startups succeed, is they always start with pain points: what are people struggling with, what do they really need, what are the barriers they're facing, and how can your data strategy potentially help solve some of those problems? You're not going to find overlap with everything, but getting a really good understanding of what people need is critical. I actually see this in my work constantly: when organizations have a stakeholder from the business side, the revenue-generating or analytical functions, partnering with the governance side, we often see those organizations get further in their data strategy efforts.

The second element is process. Now that you've understood what people care about, what they're scared about, and what they need, how do you start building mechanisms to keep people accountable and in sync? A lot of that involves cross-functional teaming, so

that people can align their efforts, build those diverse coalitions within the organization, and start forming the group that's really going to make change happen; start experimenting with different prototypes around ethical data strategy, or data strategy generally, and prove out the value of what they're asking for.

When you say cross-functional, who's on the team?

Absolutely, so not just the governance side, like legal, risk and compliance, but also the analytics side, and ideally even marketing or product. Getting these cross-functional teams aligned across lots of different views of the business is really critical, because it goes back to what we were saying before: all this can't just be what one group wants. It has to be interwoven throughout the process so that we can better develop the business value, because if it's ultimately isolated within a quote-unquote cost center, it's going to be really hard to prove that strategic value.

And then lastly, technology is the idea that, given we have these different experiments, the new workflows created, the processes created, how can technology be used responsibly and effectively to make people's lives easier? So there's this idea of data protection by default: what are the tools we can start adopting so that it's easy to protect data by default, so you only get access to data when you're supposed to? How do we make all these controls run in the background, so the internal user experience, the experience of the data scientist or others, isn't troubled, but they can still do the right thing and understand that this is for the greater good of the company?

I like that when you went through your three parts of data stewardship, tech came last rather than first. So, we've covered a bit of the downside; we've listed the negatives. Let's end on a positive note. Tell me about some good things that can happen. Are customers going to be more loyal,

or share more data with you? Help me understand the upsides of getting all this right.

Yeah, absolutely. I think there are a few things. The first is that it's an opportunity for the C-suite to take stock of what they're doing and refine their purpose. One of the key components of data regulation is the idea of purpose-based restrictions: that you use data in ways reasonably related to why you collected it. It's really meant to attack the idea that you're just a data hoarder, collecting lots of data without understanding what you're going to do with it, and to ensure you use data for purposes reasonably related to people's expectations. This is an opportunity for people to be proactive: what do they want the data for, what are the analytical strategies they want, what are some of the experiments they want to run? And then to embed that proactively within their data collection strategy, their data storage strategy and their analytical strategy, so they can move forward with their goals. If anything, I believe this is an opportunity for people to be more disciplined

about thinking through what they want to do, rather than operating in a reactive mode where you're just hoarding data without a clear sense of what you want to use it for. Beyond that, with that purpose, that mission, I think lots of positive things will follow. When you realize you don't need to collect all this data, you increase trust: you're not scaring customers by collecting, you know, Social Security numbers unnecessarily. Customers might want to share more data with you: oh, actually, this is a pretty painless process. This is something Ally Bank's chief marketing officer really talks about. She's been quite public, even as a marketing officer, in saying that in the consumer financial space there's a problem where we're collecting too much data and it's harming the user experience. People are questioning: why do you need this data, and what are you going to use it for? If you collect less of it, and you just ask for less, higher-quality data that matches your purpose, people will share that higher-quality data with you, especially when they know

it's giving them an immediate benefit and matching your purpose.

Which actually ties right back to transparency.

A hundred percent, a hundred percent. Especially when you can notify the user, we're collecting this because it will improve your user experience and give you this immediate benefit, people will much more happily give you the data, because you've thought through what that user interaction is going to be. The second thing, and I want to contrast this: instead of relying on secondary data sources that make inferences about users, which might be noisier and might be full of bias, why not get higher-quality data from people directly, especially if your business model can support it, so you can build better analytics in the long run? And then lastly, there's the idea that

you will have more resilient companies. I really want to point out a Harvard Business Review study where professors compared companies that had strong data protection policies, for instance better protecting data by default and allowing people to access their data, with comparable companies. When there was a breach, the stock of the companies with these protections didn't drop as much; the drops at companies with poor policies were roughly one and a half times as large. I just thought that was a really interesting stat, and it shows that, since we're living in a world where data incidents are inevitably going to happen, why not protect yourself and have these protections in place, so you don't suffer these huge drops? That will be really important for the long-term resilience of your company.

So I've got customers happy about the data they're sharing, they feel good about the data they're sharing with you, customers are more loyal because of that, and companies have a lot less downside risk. Those are some pretty good positive stories, and I'm excited about that. So, Dan Wu, thank you so much for taking the time to chat with us today.

Yeah, this has been an exciting conversation. Thank you so much for sparking it and leading it. I'm really excited to continue the conversation.

DESCRIPTION

A solid data strategy can prevent your company from running aground and turning a huge opportunity into a horrible mess.

Dan Wu is our guest on this episode of the Georgian Impact Podcast. Dan is a superstar commentator in the privacy and data governance space. He’s leveraging his Ph.D. in Sociology and Social Policy and his law degree to help protect people and their data. Dan believes that the best way to do that is through data strategies formed by cross-functional teams that include input from governance, analytics, marketing and product departments.

You’ll hear about:

  • What we can learn from the botched launch of the Apple Credit Card
  • Why every company needs a data strategy
  • How regulation, like the Algorithmic Accountability Act, could add protections for consumers and accountability for business
  • Offensive vs. defensive data strategy – HBR article: https://hbr.org/2017/05/whats-your-data-strategy
  • Where responsibility for inaction leading to data breaches should lie
  • Data risks businesses face, including biased algorithms, sharing data with the wrong people, 3rd party data breaches, insider incidents, and technical mistakes
  • Why data ethics need to go beyond what’s strictly legal in order to establish and maintain trust
  • AI Now’s 2019 report (https://ainowinstitute.org/AI_Now_2019_Report.pdf), which touches on ethical inequality risk factors in AI


Who is Dan Wu?

Dan Wu (https://www.linkedin.com/in/wu12345/) is the Privacy Counsel & Legal Engineer at Immuta (https://www.immuta.com/), a leading automated data governance platform for analytics. He writes about purposeful data strategy on TechCrunch and LinkedIn. He holds a J.D. & Ph.D. from Harvard University.