Episode 110: Data Trust by Design with Nathan Kinch


Jon Prial: I'm Jon Prial. Welcome to the Georgian Impact Podcast. Today: data trust, Data Trust by Design. Here's Nathan Kinch, CEO of GreaterThanExperience.design, AKA Greater Than X, talking about the evolution of his company.

Nathan Kinch: Greater Than X was founded two years ago off the back of work that I'd been doing in the personal information economy, and with personal information management services, for quite some time. We started developing a toolkit called Data Trust by Design: a systematic method through which organizations could design for trust and, importantly, actually be worthy of it. Two years ago, Greater Than X was effectively an experiment. We gave ourselves 12 months: does this thing have commercial validity? We did not know. You never know till you put it to the test. And here we are two years later. We're working on the consumer data standards work here in Australia with the federal government, with the Open Banking Implementation Entity, with big financial services organizations all around the world, with the Data Transparency Lab. We've had a really good run. We're at the start of our third year now, so it's starting to get very exciting.

Jon Prial: We're glad to have you with us. Welcome Nathan.

Nathan Kinch: Thank you so much for having me. Really excited about this.

Jon Prial: So we've talked to Ann Cavoukian about Privacy by Design, and then in a follow-on podcast we spoke with Professor Ari Waldman, who talked about Privacy as Trust. It's very clear these topics are never going to end, but I'm viewing this as the capstone of a trilogy now. So please talk to me a little bit about Trust by Design.

Nathan Kinch: So that's a really nice segue. Ann's work, and Ann's an advisor of ours, she's just been so awesome for this movement globally. So whenever I get the opportunity to do a little bit of a shout-out: Ann, if you're listening, hi, hope you're well. And the same goes for Ari; I think Ari's work is absolutely fantastic. When I first read Privacy as Trust, the first thing I did when I finished the book was send an email: "Hey, Ari, we need to chat." So very happy to be in such esteemed company. Now, this is actually a really interesting question. Privacy by Design, which is now more frequently referred to as Privacy and Security by Design, is a principles-based approach to proactively designing privacy-enhancing, security-enhancing products, services, workflows, operational processes, business models. The works. Data Trust by Design was born out of a deep frustration that customer experiences, citizen experiences, patient experiences, as they relate to information sharing and the way that we as individuals or groups participate in the information economy, are fundamentally broken. Terms and conditions are often 33,000 words long and they read at grade 16 readability, which means they're incomprehensible by design. So often, organizations deploy what are known as dark UX patterns, which are effectively manipulative or coercive techniques, there are different ways to frame it, that drive specific actions. And so often these are applied to get us to share more of our information without really being aware that we're doing so. The net result is that there's this fundamental power imbalance in the information economy between individuals, or what you might call data subjects, I'd rather just call them people, and organizations, and those could be governments or private businesses. And what we figured was, all right, there's a bunch of assumptions that we started off with. If there's a relationship between trust and data sharing, then we need to understand it more. And if there is, then we need to figure out, well, how do we help organizations showcase that they are trustworthy, and how do we use that to an organization's competitive advantage, so that when individuals are asked to share their data, they understand what they're getting into? It doesn't take them years to understand it, and actually they have a really high propensity to willingly share, because there's been almost verifiable trust established throughout that process.

Jon Prial: Let me ask you a different question. Let me spin this a little bit, because we live in a world of, I'm not sure it's fair to call it fear-mongering, but it's always the negative stories. So you're selling yourselves, you're a small consultancy, you've got to get your foot in the door. Can you get your foot in the door with a positive case study, versus saying, "You don't want to be like that person; let me tell you what kind of person you want to be and how you can get there"? How do you sell this as a positive?

Nathan Kinch: That's such a good question. So our chief platform officer, Matt, and I have this discussion quite regularly, and we share a view that there is a heap of dystopia out there, and actually it's really refined. The dystopic perspectives are really granular and compelling. There aren't many that counteract that. So actually we did a piece of work recently with Data61, which is a function of CSIRO, the Australian government's science and innovation agency, on the Consumer Data Right and consumer data standards, which here in Australia are the underpinning of what we believe the future of the information economy will be, where individuals are able to more actively share information in a more trusted and trustworthy manner. So it's really exciting. And as part of that work, we actually did end up presenting a vision for the future of the Australian digital economy. I wouldn't say it was a complete utopia, but it certainly was trying to push that positive message. Now, I think that probably, if you were to look at an 80/20 rule, 80% of our rhetoric is focused on positivity, 20% on stuff that really requires attention. I wouldn't even call it negativity, because we still try and frame it constructively. The reason we do that is, if you just jump on a news site today, there are so many data breaches, so many organizations doing the wrong thing, and not necessarily intentionally, right? There's a lack of maturity and understanding, there are infrastructural gaps, there are different types of things. We feel like we need a breath of fresh air. And what we are seeing, and we have proxy metrics and then some fairly strong metrics, revenue growth, time to contract, these types of things from our organization's standpoint, are really good, positive indications that we are making progress. And actually, the more positive message is about how organizations can inform and empower their customers in such a way that it actually benefits their business, their shareholders, their stakeholders. So far so good, but we have a long way to go.

Jon Prial: And I'm going to play a bit of devil's advocate. So without a doubt, those types of measures are important. I think it's really important, though, that sometimes companies, particularly larger companies, let the measurements drive everything, to the detriment of the higher level of thought. And I'd just say, we're not going to do negativity. Wells Fargo broke everything because they had a compensation system that actually forced people, forced their sales reps, to behave badly and open up false accounts. And that came down from the top. So you're right, revenue growth matters. When you're sitting there with a CEO, for example, how do you overlay the ethical thought and make that also fit back to a revenue growth story?

Nathan Kinch: Yeah, so that's a really fantastic question. So Wells Fargo is not unique, unfortunately. I wish they were. I wish they were. The Australian financial services sector got in a bit of strife recently, and there was an independent Royal Commission, 700-odd pages, 76 recommendations came out of that, and it probably wasn't anywhere near as scathing as it should have been, if you're trying to be fairly objective. But one of the things that came out of it rather clearly was that these organizations were designing for and actively rewarding misconduct. We actually ended up taking some of our work and applying it to that context, and wrote a play called Rebuilding Trust in Financial Services. In it, we call out a specific mechanistic approach to enable organizations to redesign their incentive structures in direct alignment with the outcomes that their customers care about most. The intent we had there was to showcase that it is possible to, first and foremost, lead as an organization in terms of your perspective, in terms of your moral philosophy, with the good that you are trying to create for individuals, society, and potentially even the planet, and utilize that driving force to create value for shareholders. Now, let me give you an Australian example here, but I believe it's super relevant in North America also. The 2001 Corporations Act here in Australia effectively defines the purpose of a corporation as being to maximize shareholder value. And when you think about the position that directors are in, and often the CEO might be a director, it's tough, because if there's a trade-off decision to make and, quite literally, your mandate is to maximize shareholder value, it's no surprise that [crosstalk].

Jon Prial: You will get bad behavior and they'll forget the end users. Exactly.

Nathan Kinch: I know. And we don't think it has to be a zero-sum game, but it is tough. We try, as mechanistically as possible, to showcase the fairly tangible ways in which organizations can be ethical by design and be verifiably ethical, and how those types of behaviors, that type of conduct, can be rewarded in a variety of different ways: optimized unit economics, an extension of the products and services, increased lifetime value, greater public confidence and sentiment, etc.

Jon Prial: Well, I think the journey there is interesting. Obviously, we're not going to take away shareholder value, and that's often measured quarterly, though that's subject to debate. But then you've mentioned lifetime customer value as a critical measure. That's not quarterly. And what does it take to retain a customer? So then churn shows up. But if you can begin to bring that lens of trust into an element of churn, have you yet been able to make that correlation, or causality, between customers who trust you sticking around and that being how you get your LTV up, perhaps?

Nathan Kinch: Yes. So first of all, there is a significant barrier in terms of transitioning away from quarterly thinking. And it's not because people aren't intellectually capable of doing so, they absolutely are. It's because, if you think about it, you've got the authorizing environments. You've got the board, responsible for corporate governance, setting strategy, identifying and managing risk. The CEO participates in that, but effectively disseminates that message to the executive committee, the executive committee disseminates it throughout the operational business functions, and everyone's optimizing for the quarterly metrics. So it's a systemic issue that's really challenging. Now, the distinction between a correlative impact and a causal impact is really important. We're actually collaborating with the Kellogg School of Management's Trust Project at the moment to try and look more specifically at some of these causal influences, but it's really hard to do in an organizational context. My argument is that organizations really value outcomes. They don't necessarily value validated learning, and therefore the types of systematic experiments we need to conduct in order to determine, with a decent confidence interval, that something caused something else, where we can control for confounding variables and things like that, are really tough. So I would say that there is very strong correlation, and there is lots of evidence to support this. And actually [inaudible] did a really good piece of work late last year on their competitive agility index, and what they showcased quite clearly, and I think rather compellingly, was that trust disproportionately impacts bottom-line business value and growth indicators, really tangible stuff like EBITDA, etc. Trust can be thought of rather simply as high confidence in the unknown. And it has played a pivotal role in the development of society, particularly modern societies. When you think of fiscal trade and stuff like that, modern commerce is like the ultimate trust use case. So I'm not discounting the importance of trust. But when we think about the information economy, and when we think about data trust and the trust that individuals place in an organization's information processing practices, we start going, "Well, is high confidence in the unknown actually what we're optimizing for? Are we designing for trust, or are we designing for something different?" And our argument is that actually we're designing for what we would frame as verifiable trust. That's effectively an independent, well-evidenced, cryptographically and otherwise, and easily understood demonstration that organizations, number one, are transparent in communicating their intent; number two, consistently deliver the value they promise; and number three, are willing to own the consequences, both positive and negative, of their actions. And that's the focus that we have. We want to enable organizations to become worthy of trust and have the ability to verify that they're actually trustworthy, which is somewhat different to high confidence in the unknown. And what that means is, if you're trying to determine whether or not trust positively impacts a particular business metric like lifetime value, you have to quite granularly define what you mean by trust, whether you're focusing on verifiable trust, and all these different things.

Jon Prial: Cool, and an amazing answer. You touched on high confidence in the unknown and verifiable trust, as well as other metrics. Here at Georgian Partners, we also have discussions over the continued evolution of these metrics for tracking trust. So what do you focus on?

Nathan Kinch: So there are three distinct metrics that we focus on in our Data Trust by Design toolkit most regularly. It's not that there aren't others, it's just that these three are our most immediate focus. The first is comprehension. As in, do people actually understand? Do they have the ability to accurately recall? Etc. There are different ways through which we can measure that, and if you're interested, we can absolutely talk about it. Number two is time to comprehension, which is really, really important. Because say we achieve very high comprehension, like greater than 80% of people understand greater than 80% of the agreement that they're entering into. For us as an organization, that's a great result. We're really happy with that, we can evidence it, brilliant. But if it takes people an hour, we have no business. Unless it's maybe something like a home loan, right? Because those are very specific, huge life events, stuff like that. But most of the stuff we're talking about is transactional. You're not [crosstalk]

Jon Prial: They make that very visible. I know on a home loan what my APR is, what my payback is, and I may not read the other 90% of it, but I'm going to walk away getting it, right? I will understand-

Nathan Kinch: With a good idea, exactly, exactly, exactly. And lastly, propensity to willingly share. When we put Data Trust by Design to the test versus a control, and typically the control will be the existing way that an organization delivers this information, and that could be a privacy notice, terms and conditions, a consent-based data sharing experience, you name it, anything that relates to asking an individual to actively or passively (because it might be hidden away) interact with information sharing, we're increasing comprehension by 60% on average, and we're decreasing time to comprehension by an order of magnitude. When you put those two together, that's a massive uplift. And we're increasing the propensity that individuals have to willingly share by eight times. Now, that's actually the thing that gets organizations excited, because they go, "Wait a minute. You're telling me that we could potentially gain eight times the access to our customers that our competitors have? Wow." We can actually do things that are so much more compelling than we can do today.
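
(To make those three metrics concrete, here is a minimal sketch, in Python, of how a control experience and a redesigned one might be scored on comprehension, time to comprehension and propensity to willingly share. The data, the 80% comprehension bar and all of the names are hypothetical illustrations, not Greater Than X's actual toolkit or instrumentation.)

```python
# Illustrative only: scoring two hypothetical experimental arms on the three
# metrics discussed above. All values and names are made up for this sketch.
from dataclasses import dataclass
from statistics import mean


@dataclass
class ParticipantResult:
    quiz_score: float            # fraction of the agreement correctly recalled (0-1)
    seconds_to_comprehend: float # time taken to reach that level of understanding
    agreed_to_share: bool        # did the participant willingly opt in to sharing?


def summarize(results, comprehension_bar=0.8):
    """Summarize one experimental arm of a hypothetical study."""
    return {
        # Share of participants who understood at least `comprehension_bar`
        # of the agreement (the "80% understand 80%" idea).
        "comprehension": mean(r.quiz_score >= comprehension_bar for r in results),
        # Average time taken to reach that level of understanding.
        "time_to_comprehension_s": mean(r.seconds_to_comprehend for r in results),
        # Share of participants who willingly opted in to sharing.
        "propensity_to_share": mean(r.agreed_to_share for r in results),
    }


# Hypothetical arms: the existing privacy notice vs. a redesigned sharing experience.
control = [
    ParticipantResult(0.35, 2400, False),
    ParticipantResult(0.55, 1800, True),
    ParticipantResult(0.82, 2100, False),
]
redesign = [
    ParticipantResult(0.90, 180, True),
    ParticipantResult(0.85, 240, True),
    ParticipantResult(0.78, 200, True),
]

for name, arm in (("control", control), ("redesign", redesign)):
    print(name, summarize(arm))
```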

Jon Prial: I'm particularly intrigued by this propensity to willingly share. Those are great results, and it could be a game changer, with companies no longer having to be so sneaky about how they capture information. So what do you say about companies and how they might provide value in return for this information, whether it's monetary or otherwise?

Nathan Kinch: So, most of the monetization opportunity is at an individual consumer level, and by the way, data is far more valuable in its aggregate form to organizations, but let's push that to the side. The value of an individual's data, even if it is literally 360 degrees of their life, which is totally possible, particularly if you're using something like a personal information management service, a DigiMe, or you're trying to employ a personal data store through Solid pods, etc., is so limiting, right? What we risk doing is further reinforcing the wealth divide, enabling people who are already fairly wealthy to effectively make more wealth, and further disenfranchising the people who basically aren't of value to advertisers. And personally, my moral philosophy is just challenged by that.

Jon Prial: So I agree that the individual level is difficult, and the aggregation is obviously where the value comes in, and we could get there someday. And there are presidential candidates in my country, which is not Canada, not Australia, who are talking about an equivalent of the Alaska oil dividend, where every Facebook transaction, or every transaction, gets taxed at a fraction of a fraction of a fraction of a penny, but in aggregate, money flows down to individuals. That's a different way of aggregating the data and, in aggregate, sharing the value. There are a couple of different ways to get there. But you're still... I liked that you talked about incrementalism, and we're kind of on that journey to see where we get to. So what are your thoughts about both ends of that spectrum there?

Nathan Kinch: So we think there are multiple iteration cycles that we have to go through. We have to evolve our incentive structures. We have to come together as a society and start having the big discussions, and we are, we really are, about what's right and what's wrong, what's acceptable and what isn't, and we have to optimize for social preferability. So when we build out operational data ethics frameworks, and that's the definition of a data ethics framework: a consistent process through which an organization decides, documents and verifies how its data processing activities are socially preferable, one of the activities we engage in and embed into that process is social preferability testing. We call it social preferability testing, rather than social preference testing, because in the social sciences, social preference testing already means something. And basically we're looking at a specific activity, the data processing activities that enable that thing to come to life, the intent, the outcomes, etc. You could think of it rather simply as a Likert scale: on this side, completely unacceptable; on this side, I'm absolutely stoked by what's going on; and somewhere in the middle is acceptable, the acceptability bar. We're optimizing for the side of the standard deviation bar to the right. Yeah, we want people to be stoked about what we're doing. Technology should augment us. It should free us from meaningless interactions, not lock us into meaningless interactions just so that some company somewhere makes a couple of bucks on advertising revenues. So again, we need to evolve incentive structures. We need organizations to really stand for something; they need to fight for what they believe.
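
(As a rough illustration of the Likert-style read-out described above, the sketch below summarizes hypothetical 1-to-5 responses, where 1 is "completely unacceptable" and 5 is "absolutely stoked", and checks whether even the lower end of the distribution, mean minus one standard deviation, clears the acceptability midpoint. The scale, threshold and sample data are assumptions for illustration, not Greater Than X's actual social preferability method.)

```python
# Illustrative only: a minimal social preferability read-out on a hypothetical
# 1-5 Likert scale. Scale, threshold and data are assumptions for this sketch.
from statistics import mean, stdev


def social_preferability(responses, acceptable=3):
    """Summarize Likert responses to one proposed data processing activity."""
    avg, spread = mean(responses), stdev(responses)
    return {
        "mean": round(avg, 2),
        "stdev": round(spread, 2),
        # Is even the lower end of the distribution (mean minus one standard
        # deviation) at or above the "acceptable" midpoint of the scale?
        "lower_bound_ok": (avg - spread) >= acceptable,
        # Share of respondents at the "absolutely stoked" end of the scale.
        "share_stoked": sum(r == 5 for r in responses) / len(responses),
    }


# Hypothetical responses to a proposed activity such as
# "we use your transaction data to suggest cheaper plans".
print(social_preferability([4, 5, 3, 5, 4, 4, 5, 3, 4, 5]))
```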

Jon Prial: Well, I'm certainly a fan of avoiding meaningless interactions. Thank you for that. So do you see companies, or even a company, doing this reasonably well today? It'd be great if you could just somehow bring this to life for us.

Nathan Kinch: There are good examples of that. Apple's by no means perfect, but they're taking a much stronger stance right now, and I really love that. There are things they have to work through, things they have to evolve over the coming years to really live and breathe some of their marketing messages, I would argue, but this is progress, right? And it should be commended, because it's hard to make progress in an information economy that doesn't today really value the things that Apple is trying to expose. It's really hard for organizations to ignore incredibly compelling bodies of evidence that showcase that you can protect and empower your customers more effectively whilst maximizing the business opportunities that your organization assigns value to. You don't have to do it all overnight. You can conduct these systematic experiments, showcase uplift, build a body of evidence, and utilize that to gain support across your organization.

Jon Prial: A perfect demonstration of a win-win. That's great. Nathan Kinch, thank you so much for taking the time to be with us today. This was really interesting. Thank you.

Nathan Kinch: My pleasure. Thanks so much for having me again.

DESCRIPTION

On previous episodes of the Impact Podcast, we’ve talked about Privacy by Design with Ann Cavoukian and Privacy as Trust with Ari Ezra Waldman. In this episode, we complete the trilogy by discussing Trust by Design with Nathan Kinch. 

Trust can be a difficult topic to pin down and make actionable. Nathan and Jon discuss how Nathan has built a toolkit called Data Trust by Design to provide a systematic method through which organizations can design for trust and actually be worthy of it. 

You'll Hear About:

  • Why customers are willing to share data when they trust the company and product
  • How you can sell more on a positive message of trust
  • What metrics trust can impact
  • How brands with trust outperform the competition

 

Who is Nathan Kinch?

Nathan Kinch is founder and CEO at Greater Than X, working at the forefront of the personal information economy.

Greater Than X helps brands evolve their culture, workflows and practices to systematically release verifiably trustworthy, data-enabled products and services to their customers.