Creating a Privacy Culture with Spotify's Vivian Byrwa
John Prial: We have a deep topic today that really needs no introduction, privacy. What do you really need to know about privacy? How about thinking about your corporate culture and structure? How about how it relates to transparency and ethics? How about what might happen during M&A activity? This is a really broad topic and you're in for a really engaging episode today. With us today is Vivian Byrwa. Vivian is the Legal Counsel for Privacy at Spotify, but we aren't going to just talk about Spotify. Believe it or not, that's just too limiting. You see, Vivian has had close to a decade of focused work on privacy and its implications for companies. And we're very fortunate to be able to spend some time with her today and talk about this broad range of topics. I'm John Prial, and welcome to the Georgian Impact Podcast. Why don't we start with you giving us your thoughts on what you mean when you talk about privacy as culture? I think that's really an interesting way to set it up.
Vivian Byrwa: Sure. So, the way I think of it is, when we think of culture in the societal sense, it's the way things are done by a group of people and their collective attitudes. It's seen in their attire, their cuisine, their traditions, and it's passed on from generation to generation through learning and emulation. So in the same vein, when I think of the privacy culture of a company, it's the way that a company does things as a whole, depending on their collective attitude towards privacy and data protection. Whether or not privacy is a priority for a company can be seen in the company's mission, their internal processes, external messaging, their products and services, and what that looks like, and of course it's passed on from the long-tenured employees to new hires through, again, learning and emulation.
John Prial: So the worst thing would be privacy as an overlay in a corporation where someone's running around and poking and inspecting. You really need this infused into people's day- to- day activities and operations.
Vivian Byrwa: Exactly, exactly.
John Prial: Very cool. Very cool. Now, when we think about corporate culture, there are lots of consumer companies, including Spotify, that talk about consumers. How do you see the difference in terms of what the perceptions are inside the corporation, and how it manifests itself on the end user side of things?
Vivian Byrwa: I think the public perception of a company's values really comes from their message and then how it matches up with the actions that impact their consumers. So certain companies will tell you how much they care about privacy, and yet when you look more closely at their product or their disclosures or both, it's not true. As a consumer I'd rather have the company do and say the same thing even if that thing isn't really desirable, because then at least I know what I'm getting myself into. So the more your actions deviate from your message, the more you run the risk of consumers perceiving your privacy culture negatively. But of course, not all consumers know how to look for the company's privacy culture in their public persona. I mean, my husband, for example, would have zero clue. He's the type of consumer that would check yes to everything and not read anything at all. But I think unlike him, consumers nowadays are becoming smarter and more aware of privacy, and I promised you some statistics, so here goes.
John Prial: Great.
Vivian Byrwa: In a recent June Forbes article, actually, which I can provide the link to for the show notes, there are a lot of great statistics that link to various studies. They reported in one study that 79% of Americans are concerned about their data and how it's being used by companies, and 92%, which is really high, are concerned about their privacy when they use the internet. So a company's privacy culture is becoming more and more important to consumers.
John Prial: That's fascinating. Do you think there's a difference when people are thinking about their data, their information to a company they get a service from, versus let's say, governments or banks? Or do you think there's a difference in how those numbers might play out?
Vivian Byrwa: I suspect there would be because I think banks from a consumer perspective seem a bit more, maybe trustworthy and buttoned up. At least that's I think the public perception, whereas a consumer company, they are more focused on R&D and providing great products. They might not think to put privacy first when creating the product. It might be an afterthought. I think consumers are starting to realize that.
John Prial: So of course, consumers are the ultimate judge of using your product. So give me a meaningful way of kind of getting that. That's interesting.
Vivian Byrwa: Yeah. So from my experience, consumers can't wait to tell you what's wrong. Maybe they won't tell you what's working right, but they can't wait to tell you what's wrong. So for privacy and product teams, I'd say become best friends with your customer service team. They are on the front lines. They hear it all. Train them like mini privacy pros that know how to issue spot and when to escalate. I have a lot of respect for the CS folks. They take all kinds of abuse. One time, a company that I counsel had a consumer write in pages and pages to customer service, absolutely convinced that the company had created some secret tool using their data that allowed advertisers to insert subliminal brainwashing messages that were not physically detectable. They were absolutely convinced. It was pages and pages of back and forth with customer service. It's unbelievable some of the things that CS encounters that you just never know about. So my view is you should talk to your CS very often and see what the consumers are saying. What's working, what's not working. What are they concerned with? That's really your insight into the consumer's mind.
John Prial: I love that because I've often thought about CS as a source of product requirements, but this is actually a bigger thought with perception. This is a macro level thought coming across from the CS team. I think that's great. Thank you for that. Now what about laws? For example, the CCPA, the California Consumer Privacy Act. How would that affect how companies behave? And do you think the consumers perceive it as a good thing, and do the companies perceive it as a good thing or just more work to do?
Vivian Byrwa: I think consumers definitely think of it as a good thing. Companies, well, I think how companies approach the implementation of privacy laws like CCPA really depends on their existing culture. So if you take a company with a less privacy-focused culture, they would probably see CCPA or GDPR as a hindrance, or just another problem to put a Band-Aid over and hope it goes away. I mean, I think we all know that privacy is not going away anytime soon, but I think new privacy laws could be a turning point for the company's culture, sort of like an opportunity to change an existing culture to a better one if it's not very buttoned up right now, maybe one with more appropriate guardrails.
John Prial: Interesting. So let's talk about guardrails, but this is the... and you've got a LinkedIn post that we'll also link to in the show notes. We talked a bit and we started talking about culture. Just touched a bit of laws, but that's not necessarily a corporate policy, but I want to understand your thoughts about policy and tech, and how those three guardrails, those three different legs of the stool, contrast with each other.
Vivian Byrwa: Yeah. So I think the traditional privacy programs usually consist of two types of guardrails, the policy and the technical, both of which are largely driven by executable requirements, mostly set forth in laws, regulations, or even industry guidelines. When I think of policy guardrails, I mean those written policies and procedures within the company that govern how a company should handle personal data, like policies on retention, destruction, when to initiate and complete a privacy risk assessment, encryption, ways to encrypt and pseudonymize, things like that, and of course your external privacy notice to consumers. Those are all part of the policy guardrail bucket. Then on the other hand, we have technical guardrails. They're the technical and security measures that are physically in place to protect personal data. They're often implemented to supplement or implement the policy guardrails. But in my view, these two guardrails are not sufficient anymore. We need to start looking beyond the traditional practices. If we have policies and procedures written, but no one bothers to read them or follow them, or people are finding ways to try to circumvent the technical guardrails, then you might as well not have them. So I think having cultural guardrails actually helps protect the policy and technical guardrails.
John Prial: Corporate culture is critical, but that's a bit fenced in, right? What happens when companies come together? I mean, we do live in a world of startups and investments and acquisitions.
Vivian Byrwa: So I once worked on an M&A deal where during the diligence process, it was really apparent that there were no privacy synergies at all between the buyer and the target whatsoever. The buyer cared greatly and the target couldn't care less. I mean, they were a startup, and it was probably their attitude to just create the best product ever and that's it. The deal went through fine. However, during the integration phase, it became immensely difficult for the target to change their attitude towards privacy. It almost became impossible to try to implement changes that the parent company wanted because of the pushback from the target's original team. So if you're involved in M&A, you have to look for privacy synergies early on. I think it's more important than you think, and it would definitely affect your ability to build cultural guardrails for the acquired company later on.
John Prial: That's a great story and I really hadn't thought about it, but if you don't get off on the right foot around privacy, you could be six months, a year down the line and now we're talking about retrofitting products for privacy, even security, and one mistake, your trust is lost and there won't be a successful acquisition. That's a great story. I appreciate that Vivian, that's terrific. So I think about educating the teams on privacy, and as I'm thinking about the evolution of tech and the evolution of the apps, we're never going to reach a stasis that a company says," Okay, today we did it. Everybody's good on privacy." You mentioned early on about the senior people training the new employees. This feels like it's as important as any other element of a product evolution and is almost like a company evolution to stay on top of what's coming from privacy because it shouldn't just be viewed as product. Is that fair? Is that too broad a statement?
Vivian Byrwa: No, I think that's really spot on. Creating cultural guardrails isn't as straightforward as, like, a lawyer drafting policies or an engineer deploying code. It requires everyone's mindset being focused on a pro-consumer viewpoint, and it requires a balanced inclusion of less concrete, less measurable concepts like ethics and transparency and privacy by design into everything the company does, by everyone. It's not just a problem for the privacy team anymore. It moves you beyond mere tick-box compliance towards a way of life, a way of company life that is more pro-consumer and more ethically driven.
John Prial: So forgive this question going to a lawyer, in terms of the challenges of getting executive buy-in. And I love that you mentioned privacy by design, and we've had Ann on our podcast before, and she's spoken at our events, she's phenomenal. But I want to think about this privacy-first mindset and making it understandable versus legal mumbo jumbo, meaning that's how you start at the top and get the executives to buy in and let this [inaudible 00:11:32]. How do we think about translating the bits and bytes of technical thoughts into a narrative or a story about privacy? What's the best way to accomplish that?
Vivian Byrwa: So there's really two things to unpack in your question: the executive buy-in piece, and then how to just generally get people to understand the legalese of it all. For the executive buy-in piece, thankfully the few times I've had to get executive buy-in on a more pro-privacy stance weren't as challenging as they could have been. But I think what's difficult for some executives is that privacy often is part of legal or compliance or both, and those are cost centers for the company. It's so easy to just prioritize product development and chase innovation because they're the obvious moneymakers here. But I think companies have a real opportunity right now to do the right thing in the privacy space and create a new type of business value, and getting the right executives to recognize that could go a long way toward setting the tone for that pro-privacy culture you want to build. So my way of trying to get executive buy-in is really to recognize that privacy has a lot of consumer value, which translates to business value. Good data protection practices, for example, gain trust, and that translates to consumer retention. The average consumer isn't joining platforms because of their great privacy positions; people join because they like the product. So say hypothetically, very hypothetically, you have this amazing audio streaming service where there's a huge catalog of music and exclusive podcasts and their privacy practices are great, but people are signing up not because of anything else but product. Sure, privacy isn't going to be the main factor that's driving people to your business and your products, but having privacy practices that are good can be a main factor to keep your consumers. That's the additional business value. So to address the legalese piece of your question and how to parse that out, I mean, I think it can definitely be hard.
Even explaining what personal data really means in the legal sense can be difficult because it's not always what you expect it to mean. I find that asking the right questions is one way we can really make legalese more understandable. Sometimes the questions don't have to be legal at all. Questions like: what choices does the consumer have with respect to this new feature? What does the consumer experience look like? What's their journey like? Show me where they click. What does the screen look like from the consumer's perspective? Those are easy enough questions to understand and respond to by a product person that has no legal background, and yet they get at privacy principles without using privacy legal terms. So if you're a privacy counsel, my advice would be to be more inquisitive and give relatable explanations of privacy instead of regurgitating what the law says, which, I mean, a lot of lawyers do fall into that, myself included. I have to constantly remind myself to try to use examples and comparisons relevant to the audience's industry. If you're talking to an ad tech professional, talk about advertising with them. If you're talking to a music licensing person, talk about label examples so they can understand specifically how privacy applies to their business unit.
John Prial: Making it relatable is great, and you mentioned retention, that's a clear, measurable item. If we can link back to some of these actions on retention, that's a good thing. It's funny, I think it was just last night, I went on to Netflix and the screen blanked out and said, "We have new terms and conditions, click here if you're okay with that." I actually clicked to have them emailed to me. Now, the odds of me reading them, being like your husband and just saying, "Okay." I don't know yet, but I'm sure that sitting in my pile of email right now are these new Netflix terms and conditions. But it is interesting that somewhere along the way, I do get frustrated on a lot of things and say, "I'm done with you," kind of thing. Normally it's bad customer service. I think we could get to the point where people could say, "No, I'm done with you. I'm going to find another way to... there's enough competition out there." So it'll be interesting to see if this consumer... I liked the thought of retention maybe as the key measurement, and I bet there'll be other sources of consumer feedback that could be gathered to see the perception of a company, the perception of a company's brand, right?
Vivian Byrwa: Yeah, so in that same Forbes article I mentioned earlier, it said that 70% of organizations say they receive significant business benefits from privacy, whether it be because of operational efficiency, agility, or innovation, and that for every $1 spent on privacy, the average company received back about $2.70 in benefits.
John Prial: Wow. That's pretty darn good. That's outstanding. So you want to have a privacy team that does the leading edge thinking, my sense is you've got to look throughout the entire organization and find your champions. How do you go about doing that?
Vivian Byrwa: When you set out to create a pro-privacy culture, you aim to drive transformative change at scale, but it's hard to do that without privacy champions and with just your team alone. So I like to think of them as internal cheerleaders for your group. I would suggest recruiting colleagues who are enthusiastic or curious about privacy to be your champions, particularly in the business functions that are most likely to be impacted by the processing of personal data. They know their business functions the best, and they will be able to advocate for the need to embed privacy into their day-to-day work. But with that said, I don't think privacy champion networks always work the way you want them to. I think that champions need to be properly trained. They have to have clear instructions on what to spot, what their role is, and how to interact with the privacy team itself. And most importantly, they should want to be involved; they can't just simply be appointed. I've had experiences in the past where companies want to appoint one in each business unit and it falls apart. I think it's because these people are not traditionally trained to be privacy pros, they don't know what they're looking for, but also because it's not really written into their work objectives. Now let's face it, most people don't want to do extra work or volunteer for extra work if it's not to their benefit. So I think it's very important that these people should actually want to be involved. There was this one guy at work, an engineer, who actively pursued me and put multiple meetings on my calendar to walk me through his plan to upgrade some part of our encryption system. Someone like that, who's really curious, who doesn't mind volunteering, that is somebody who you should ask, "Hey, do you want to be a privacy champion for us?"
John Prial: Nice.
Vivian Byrwa: Yeah, instead of just being like, "Okay, we need to have X, Y, Z people appointed in each group." I don't think that's the way to go.
John Prial: You got to find the match. So just to close in this section, I'm thinking about terms and conditions and your husband and my Netflix thing. I have this vision of the perfect terms and conditions being effectively an infographic that's simple and easy to read. Do you think companies should be moving towards that path? I think the issue is, of course it's got to be shorter, and of course the more legal you put into it, the longer it gets. So we ended up with 30-page Ts and Cs. What's the right mix in terms of everything needs to be written down or everything is so simply consumable? How do you find the balance in that?
John Prial: He might even click one or two, Learn More.
Vivian Byrwa: He might even click on those, yeah.
John Prial: Yeah. I think it's interesting. You're right, I like that that shortens it, yet it's all there, because if you gave me an eighth grade version of War and Peace, I don't think I could still get through it. But if you gave me a CliffsNotes War and Peace that allowed me to expand it along the way, that would be kind of neat. What we haven't touched on is ethics. Although we've talked a lot about privacy, how a corporation thinks about it, and how it affects consumers, what do you see as the intersection of ethics and privacy? Is that the same? Are they different for you?
Vivian Byrwa: So I think they're different, but they're related. I think the intersection at which ethics and privacy meet is when the conversation changes from, are we compliant? To, are we doing the right thing? The objective of the company's privacy program should not be to keep off the radar of regulators. I mean, it shouldn't just be about that, but also to underline that key corporate value that can reassure consumers and partners that their trust is well-placed, because after all, without trust companies run the risk of losing their consumers, like we discussed, and falling behind their competitors. So companies should really go beyond doing the bare minimum of what the law says, and just take that one step further and do right by the consumers and use non-legal standards, because we're talking about ethics here. It goes beyond what is just minimally required by legal standards. So there are lots of non-legal standards we can use, like, for example, privacy by design; there's also Nissenbaum's contextual integrity, or Calo's subjective/objective dichotomy. Use all of that when you're evaluating products and services. My view is that privacy and ethics aren't just the responsibility of the privacy team anymore. It's everyone's responsibility, and no matter what role you have in the company, isn't it also your responsibility to do the right thing and do the ethical thing?
John Prial: It's interesting, if I grabbed any random CEO off the street, not that I see that many, and I asked if ethics or privacy drove their thinking, they would probably say ethics first, because there might be a vision of how the company is. The intersection actually makes it more powerful. I hadn't really thought about it the way you represented it, bringing the two together and showing that ethics is almost a step beyond privacy, but you can't forget that privacy is a foundational piece; you're never going to be ethical if you don't have the privacy stuff. I have to think more about how those two pieces go together, that's fascinating. Let me take you to a little bit of trust as a topic. The way I think about this a little bit is, say we're going to be putting parental controls in, and of course the company has got to establish a relationship of trust with that parent, that they're not going to make a mistake and that they're going to adhere to what the parents are looking for in terms of what they provide the children access to. How does that foundation of trust get built?
Vivian Byrwa: Well, when it comes to kids, I think the stakes are even higher when it comes to ethics and privacy. My view is that doing the right thing should always be the baseline standard when it comes to kids. Kids are more and more online these days, especially with COVID. So parents need to have trust and peace of mind when their kids are using online products, which they naturally gravitate towards. Parents need to have as much choice and control as practicable, whether it be mandated by law or not. So yes, we have laws like COPPA that help with parental consent and controls for companies with a young audience. But personally, I think we don't have the luxury of just asking, are we compliant, so to speak. We have to ask the question, are we doing the right thing? Because that's way more important when it comes to protecting our children. In terms of parallels for the corporation, for companies, working with great amounts of data comes with great responsibility. For people who are entrusting you with their data, my view is that we should give them the courtesy of transparency, give them as much choice and control as we can, just as we would with parents that have children. Because if it's within your power to give, you should always try to give it and always be ethical and do the right thing. No matter if you're a privacy pro or not, you should always be asking the question, how can you do right by your consumers?
John Prial: So Vivian, this was a great discussion. Thank you so much for taking the time to be with us.
Vivian Byrwa: Happy to be here.
John Prial: With great data comes great responsibility, I am not going to make the pop culture link here and whether or not you get the connection, this statement stands alone. Thanks Vivian. For the Georgian Impact Podcast, I'm John Prial.
Leading on privacy means more than compliance and technical solutions. To excel, companies should also foster a privacy culture.
Vivian Byrwa, our guest on this episode of the Georgian Impact Podcast, discusses how cultural guardrails can reinforce policy and technical guardrails. Vivian is the privacy counsel at Spotify and has close to a decade of experience focusing on privacy and its implications for companies.
You’ll Hear About:
- Privacy culture and how internal attitudes can affect data protection and privacy.
- The importance of policy and technical guardrails but also the need for cultural guardrails.
- The difference between being compliant and doing the right thing when privacy and ethics intersect.

Resources:
- 50 Stats Showing Why Companies Need to Prioritize Privacy
- The Modern Privacy Program Needs Cultural Guardrails
Who is Vivian Byrwa?
Vivian Byrwa has spent nearly a decade focused on privacy and how it affects companies. Her focus on privacy began while working at Herrick, Feinstein LLP and later at Davis & Gilbert LLP. She joined Spotify in 2019 as its first and only U.S.-based privacy counsel and helps develop and maintain Spotify’s global privacy program with a focus on advertising and marketing, creator-related issues, and student data.