What is the Future of Trust in the Digital Age?
Episode 145 Transcript:
Chris Curran: Growth Igniters Radio with Pam Harper, and Scott Harper, Episode 145: What is the Future of Trust in the Digital Age? This episode is brought to you by Business Advancement Incorporated, enabling successful leaders and companies to accelerate to their next level of growth on the web at businessadvance.com. Now, here’s Pam and Scott.
Pam Harper: Thanks, Chris. I’m Pam Harper, founding partner and CEO of Business Advancement Incorporated, and sitting right across from me as always is my business partner and husband, Scott Harper. Hi, Scott.
Scott Harper: Hi, Pam. It’s always a pleasure to join you again for another episode of Growth Igniters Radio. And just to remind people — it’s our purpose to spark new insights, inspiration, and immediately useful ideas for visionary leaders to accelerate themselves and their companies to their next level of game-changing innovation, growth, and success. Now, Pam, speaking of game-changing, we’re all well aware that we are well into the digital age, and new business models based upon digital technologies create all kinds of new value. There isn’t any aspect of life that isn’t touched by some form of digital.
Pam Harper: You’re right, and with that comes the responsibility and accountability of looking at the issue of trust. Everyone thinks about trust in terms of who gets what information — we see it in the news all the time — but as digital technology continues to evolve we need to look at the future of trust from a broader perspective.
Scott Harper: Okay.
Pam Harper: We need to have new conversations about it. That’s why we’re glad to be speaking with our good friend, and returning guest, Jim Blasingame. Jim is a futurist who is widely acknowledged as one of the world’s foremost experts on small business and entrepreneurship. He’s the author of award-winning books including The Age of the Customer, and his newest book, The 3rd Ingredient: The Journey of Analog Ethics into the World of Digital Fear and Greed. Jim has also been a syndicated columnist since 1999, contributing weekly to newspapers and online publications including forbes.com, nasdaq.com, American City Business Journals, and openforum.com.
In addition to all this, Jim is the creator and award-winning host of the syndicated radio program, The Small Business Advocate Show, which is also available as a podcast. He conducts over 1,000 live interviews annually with his Brain Trust, the world’s largest community of small business experts, policymakers, and entrepreneurs. I’ve been proud to be a member of Jim’s Brain Trust for the past 12 years. You can see Jim’s complete bio, and listen to our previous conversations with him, by going to growthignitersradio.com, episode 145, and scrolling down to resources. Jim, welcome back to Growth Igniters Radio.
Jim Blasingame: Hey, Pam. Hey, Scott. Thanks for having me back.
Scott Harper: Hey, Jim.
Pam Harper: Congratulations on your new book and the award.
Jim Blasingame: Oh, thank you. I appreciate that. Yeah, it’s won five international awards. We’re really proud of it.
Pam Harper: Let’s talk about it. Your book is titled The 3rd Ingredient. What is that?
Jim Blasingame: Well, The 3rd Ingredient — I sort of co-opted that from a 100-year-old short story by O. Henry called “The Third Ingredient” that has been one of my favorites all my life. The reason that I came up with the idea is that, as most people know, the whole modern human experience on the planet began being driven by two major forces. I call them the two “active forces,” and that’s fear and greed. Fear was first, and then greed came second.
Now, as most people realize, there’s nothing inherently wrong about fear and greed until you and I make it wrong. The first two ingredients are fear and greed, and over the last 10,000 years, we’ve used what I call the “passive forces” to hold fear and greed in equilibrium, so that fear and greed would serve us well. As you know, sometimes fear is giving you good information. Right? And sometimes greed — the good forms of greed — causes you to seek a better job, or have something nicer and better for your family, or whatever. Those are the good aspects of greed.
But when fear and greed dip below equilibrium, that’s where there’s a problem. During the 10,000 years of the analog age, we have done a pretty good job of creating the passive forces — our catalog of ethics — to hold fear and greed in equilibrium. The 3rd Ingredient is what we now have to seek, find, and employ in the digital age.
Scott Harper: Okay. You’ve said that over 10,000 years, morality, ethics, and law have held fear and greed in check. But you’re saying now we’re in a big shift. What is that shift?
Jim Blasingame: Well, for 10,000 years the first two ingredients, fear and greed, have driven us. Those are our primal forces. I love the old saying, “Some things don’t have to be remembered, they remember themselves.” Fear and greed are like that; they don’t have to be remembered, they remember themselves. Through the 10,000 years of the analog age, whenever fear and greed threatened to fall out of equilibrium, we were able to use our catalog of analog ethics — trust, contracts, currency, insurance, laws, regulations, morals, ideals, all those things — to hold fear and greed in equilibrium and keep them in check. As long as we were in the analog age, that worked just fine for us.
There were periods of time where things went pretty badly, but we put things back together, we paid the bill, we buried the dead, and we moved on. But now we’re moving from the analog age into the digital age. In the analog age, we had analog leverage. In the digital age, we have digital leverage. If you think about it, whenever there was analog leverage there was a human nearby, but now, with digital leverage, there’s a lot of work going on without humans being there to apply ethics.
The problem we have right now is that in the digital age we still haven’t figured out how to apply ethics — how to install trust on, for example, a nine-gigahertz, nitrogen-cooled, 38-core processor. That’s the problem, and the reason people are anxious right now is that we haven’t found a way to put trust on a nine-gigahertz computer.
Pam Harper: Well, it seems like we’re applying the old styles, as you say. One of the things I liked in your book is how you illustrated this, you were explaining for instance about some of the antitrust cases, and how they would be solved. Can you tell us a little more about that? Because I think it makes it more tangible for us to think about analog ethics versus digital ethics.
Jim Blasingame: Yeah. The antitrust division of the Department of Justice has been around for over 100 years. Back when there was an antitrust issue that the government wanted to deal with, they could apply their power and influence through the courts, and the courts would solve it in their due time. And when you came out on the other end with a resolution, the government won or they didn’t win. But it was pretty much another day, another dollar. As we’ve grown over the last 20 years, though, by the time the government gets around to applying their analog justice through the antitrust division, the technology has gone around them to make whatever their findings were moot.
For example, US v. Microsoft — that lasted about two or three years. By the time Judge Jackson dropped the gavel on that, which was pretty darn fast, it was irrelevant because the technology had moved around the finding. Whatever benefit the litigants might have gotten, their technology made it moot. That’s an example of where an analog approach to solving problems is not working in the digital age; the technology is just moving too fast.
Scott Harper: Okay.
Pam Harper: It’s an evolution, isn’t it, really?
Jim Blasingame: There’s no question about that, but let me put it this way: we took 10,000 years to go from mammoth to mainframe; in only a couple of generations we’re going from mainframe to mobile. Our analog ethics are so ingrained in us, we’ve had them so long, that we don’t even think about being ethical. We are ethical. You’re ethical, Scott. You’re ethical, Pam. I am. We know we are, but we don’t think about being ethical, because it’s been so ingrained in us.
Pam Harper: That’s right.
Jim Blasingame: And that’s the problem with the digital age. Because the digital age is so different, and so much faster, we can’t enter it without thinking about the ethics that we must apply in the digital age.
Pam Harper: I agree with you. It’s almost like we’re just breathing; we don’t tend to think about things the same way. That’s why I agree with you that we have to look at trust, and the future of trust, in some new ways. You know, when other thought leaders are focusing on solutions involving innovations like algorithms, artificial intelligence, big data, and robots, why are you focused on the ancient concept of ethics? It’s a bit of a paradox here, isn’t it?
Jim Blasingame: Well, to me it’s not a paradox, and I’ll tell you why — humans require trust.
Remember this: the only thing that’s going to enter the digital age and not change over the next 100 to 500 years is human beings. We started out analog, we’re analog now, we’re going to continue to be analog. Humans are never going to be digital. As analog humans, we require trust, and 100 years from now we’ll still require trust. And the reason why I think people are so anxious today is that they’re not feeling the application of ethics and trust in the algorithms that are all around us.
We’ve got a collision right now of two converging forces, the digital and the analog, and they’re converging, but they’re not the least bit alike.
Back to Scott’s question, this shift we’re talking about is unprecedented because we’ve never had to have an interface that would put the analog and the digital together. We’ve never had to do that, yet we’re having to do it now, and not only are we not doing it, people aren’t even talking about it. We’re talking about the problem, but people aren’t realizing that the reason we’re anxious is that we don’t know how to apply the ethics that we require as analog humans to the digital leverage that we’re using.
Scott Harper: Jim, people evolved over millennia to really need to belong to groups. I have to belong to a group, I have to fit in, and that’s where, if I’m not ethical, if I’m not trustworthy, I’m going to be ostracized.
Jim Blasingame: That’s right.
Scott Harper: Machines don’t have that concern. I mean, you’ve got Facebook and Cambridge Analytica, you’ve got Alexa listening in on people and laughing in the middle of the night. These things don’t have feelings built in; it’s not like Asimov’s Three Laws of Robotics-
Jim Blasingame: That’s right.
Scott Harper: Back from years ago. The question is, what do we do about that?
Jim Blasingame: We’ve got to find a way to create a digital form of trust, and we have to start demanding it. We can no longer presume that the technology itself is just going to work and play well with others. It’s not in the nature of machines to do anything but what they’re programmed to do. Human beings do that programming, so human beings are going to have to program some kind of ethical behavior, some kind of boundary, some kind of border, some kind of governing. One of the ways that that’s going to happen is going to be a blockchain-type technology that will take care of part of this, not all of it, and I think there’s going to be a global gut check over the next generation. You already see it in Europe with the GDPR, and California just passed almost its own version of GDPR. We’re dealing with all this right now, and we’ve got to sort it out, and I hope my book will help get a conversation going about this, because most people don’t realize that what’s missing is the ethical component. Does that make sense?
Pam Harper: It sure does. We’re going to take a quick break, and when we come back we’ll talk more with Jim Blasingame about his book, The 3rd Ingredient, and a new way of looking at the future of trust in the digital age. Stay with us.
Scott Harper: You are with us on Growth Igniters Radio with Pam Harper and Scott Harper, brought to you by Business Advancement Incorporated. We focus on enabling visionary leaders to dramatically increase momentum for game-changing results, and we’re on the web at businessadvance.com.
Pam Harper: We’d like to welcome our listeners, and our many new listeners. If you’re not already subscribed to our Growth Igniters community, you can get even more value by signing up. You’ll receive reminders of our new biweekly podcasts, along with a link to a page filled with all kinds of resources and links. You’ll also receive a Growth Igniters post, about a two-minute read.
Scott Harper: Go to growthignitersradio.com and select the red signup now button at the top right of the page.
Pam Harper: Welcome back to Growth Igniters Radio with Pam Harper — that’s me — and Scott Harper. Scott and I are talking today with business futurist Jim Blasingame, host of The Small Business Advocate Show about his newest award-winning book, The 3rd Ingredient, and The Future of Trust in the Digital Age. Jim, how can people find out more about you, The Small Business Advocate, and of course your books?
Jim Blasingame: Hey, Pam. Thank you very much again for inviting me onto your great show. I’m so proud of what you and Scott have done with your program, here. I remember when it was a gleam in your eye, and I just want you to know how proud I am of you guys for doing this.
Pam Harper: Well, thank you.
Jim Blasingame: My website is smallbusinessadvocate.com. That’s my main site. They can find out more about the book at the3rdingredient.com — that’s with a number three, the3rdingredient.com — and of course, it’s available on Amazon. I encourage people to go there and learn more about it. We’re still working on the website. The book is fairly new, so the website’s still young, too, and we’re working on it; we’ll have more stuff up there soon, so just keep coming back. But that’s the best place to go, the3rdingredient.com, to find the book.
Pam Harper: It’s a wonderful book. We were talking in the first segment about the context of why you brought the book to life, why this is a time unlike any other, why you’re even talking about ethics, which we think is so important, and trust is going to be a very different thing. Let’s delve a little bit deeper into this whole concept of the analog and the digital. Maybe we can give examples.
Jim Blasingame: Okay. Analog is just the natural world. Right? Analog — when I say analog I’m talking about physical, chemical things. Our natural world, human beings — you and I — are part of that; we’re analog, 100% analog, so consequently the leverage that we created is analog. So a hammer is an analog lever. Right? A crowbar is an analog lever; those are all analog things. Over 10,000 years, to manage that leverage — so that we’d use a hammer appropriately, or whatever it might be, a business practice — we’ve created what I call “analog ethics,” and apparently I coined that term. Apparently, no one’s ever used the term analog ethics before, and I think it proves the point to me that we better start talking about this, because we can’t talk about digital ethics if we’ve never talked about analog ethics.
Pam Harper: Well, so if I take that hammer and I hit the hammer on Scott’s head — no, I won’t — it only impacts Scott, it doesn’t impact anybody else-
Jim Blasingame: That’s right.
Pam Harper: It doesn’t impact you, Jim; it doesn’t impact Chris. You know, nobody else, but with digital it’s a different kind of leverage. Right?
Jim Blasingame: This is where I talk about the impact, and the collateral issues, of analog leverage and digital leverage. The impact of analog is usually just within the sphere of the energy. Right? And the collateral is the people and the things around that. In the digital age, it’s different. You remember the flash crash in May 2010; I mean, we don’t know exactly what happened, but we think one fat finger hit a button wrong, and it caused the stock market to go down 1,000 points in 20 minutes. Twenty years ago, nothing in the analog stock market could have caused that. This is why I try to remind people that every one of us with an “enter” key on a computer or a phone has the ability to create digital leverage that could be used for good, or not so much.
This is an excellent point, Pam, because until we realize that we’re entering a time where we have leverage like we’ve never had before, we can’t address this. You know, we measure analog leverage in terms of miles per hour, horsepower, RPMs. Digital leverage is measured in billions of bits per second moving at 186,000 miles per second, and that’s a lot faster. If we don’t get control of the digital leverage and learn how to apply our ethical behavior to it, when digital fear and greed dip below equilibrium we’re going to have a hard time putting things back together. That’s the reason why I came up with the concept of The 3rd Ingredient being a preemptive force that would hold digital fear and greed in check.
Scott Harper: Jim, you mentioned earlier that because analog ethics don’t apply to machines, we have to have some sort of digital ethics. You referred to blockchain, or things like that, can you amplify that a little bit more? If machines don’t have feelings and don’t care, how can we install digital ethics that are not dependent upon human beings doing something of their own free will?
Jim Blasingame: Well, of course, machines don’t make themselves, they don’t turn themselves on, they do whatever we tell them to do, including the digital ones. So whatever ethics we want our digital leverage to have, we’re going to have to agree to that, decide on it, be devoted to it, and find a way to apply it to our digital tools, to our digital leverage. This is why it’s really important to understand the ethics that got us here are going to be the same ethics we’re going to require in the future, but the form of ethics that got us here — analog ethics, won’t be enough.
You talked in the last segment about antitrust law; that form of ethics is not fast enough to deal with digital leverage. So we’ve got to find a way to create digital ethics. Remember this, write this down, folks, this is going to be on the test: ethics must move at the speed of the leverage, whether it’s digital ethics or analog ethics.
Pam Harper: I would agree with that 100%, because we’re seeing it every day in real life. One of the things I liked about your book was that you show this through stories, and you’ve got the historical side of it through characters — that’s why you call it the journey. Right? But you also show where it could go, and you have different scenarios of what could happen. What got me was that if you look at it that way, and you look at the fact that we’ve always really been evolving in our ethics, we’ve had to come up with new forms of trust. Whether we went from bartering to money to other things, we’re always having to look at: how do we develop trust?
Jim Blasingame: Mm-hmm (affirmative).
Pam Harper: Can you tell us what that means with an example?
Jim Blasingame: As analog human beings inhabiting an increasingly digital age, we’re still going to require trust the way we think of it now, but we’ve got to make trust go faster, and remember that the ethics have to move at the same speed as the leverage. When the leverage moves at a billion or nine billion bits per second at 186,000 miles per second, we’ve got to come up with a method to make that happen, and that’s where we’ve got to focus.
We already know how to make machines go fast. We don’t really need to spend more time working on making machines go faster, I don’t even think we need to spend more time working on how to make artificial intelligence be more intelligent. This is what I think is going to come to a head before long — we need to slow down, not go moving forward with anything until we can apply ethics to it. We’ve already gone too fast, and it’s because nobody has called Facebook’s hand up to now, nobody’s called Google’s hand up to now, nobody’s called Twitter’s hand up to now, or Amazon, or whoever.
We haven’t called their hand on that, and I think this is going to achieve critical mass before long, and everything’s going to slow down a little bit, I think, until we can find out how to apply digital trust, digital ethics, to our leverage. By the way, folks, some of the things I’m saying sound a little egg-headed; the book, as Pam said, is all about characters, and you’re going to love the characters. They’re going to lead you through their lives, dealing with these issues, and I think you’re going to like that part of it.
Pam Harper: Jim, will we ever be able to achieve digital trust?
Jim Blasingame: We will be able to, as long as the human beings who created the digital leverage demand that we have digital ethics, that we have digital trust. But first we have to recognize that we actually need that. And this is the unprecedented part — we’re not prepared for making a major shift from analog leverage to digital leverage, because we presume that since humans are creating the digital leverage, the digital ethics will be there. But they’re not. It’s actually devoid of ethics.
Pam Harper: Well, awareness is the first step, and at least we’re beginning to grapple with it. What we’re going to do now is take another quick break. And when we come back, we’ll talk more with Jim Blasingame, author of The 3rd Ingredient, about actionable steps you can take to increase trust in the digital age. Stay with us…
Scott Harper: This is Growth Igniters Radio with Pam Harper and Scott Harper, brought to you by Business Advancement Incorporated. On the web at businessadvance.com.
Pam, we’ve been talking about the need to create new levels of ethics and trust that will allow all of us analog humans to operate more safely and effectively in this evolving digital age. Now, in both our personal and business lives this requires digging deep and having a lot of really profound conversations.
Pam Harper: And frequently this will bring us face to face with challenging issues that everyone knows are there, but nobody wants to face — that is, the elephants in the room. But leaving them unaddressed can severely undermine trust, and lead to a whole range of unintended consequences.
That’s why we’ve written a Harper Report called “Taking Control of the Elephants in the Room.” This is one of our more popular reports, because it’s practical, and addresses an issue that every leader and team faces at one point or another, especially when we’re moving into uncharted digital territory.
Scott Harper: Go to growthignitersradio.com, select episode 145, and request your complimentary copy of the report, “Taking Control of the Elephants in the Room” …
Pam Harper: Welcome back to Growth Igniters Radio with Pam Harper and Scott Harper. Over the last two segments, Scott and I have been talking with Jim Blasingame, author of The 3rd Ingredient and host of The Small Business Advocate Show, about the future of trust in the digital age. Jim, remind us how people can find out more about you, listen to The Small Business Advocate Show, and more, and of course, about your books.
Jim Blasingame: Thank you, Pam. I appreciate it very much, and I’m enjoying our visit. My main website is smallbusinessadvocate.com — that’s how you can find all the things that we do on our show, my writing, my newsletter, all of that, including about the books. You can listen to my show there, and you can hear thousands of archives, including, Pam, all the interviews that you and I’ve done over the years on my show.
As far as the books are concerned, go to theageofthecustomer.com, and also the3rdingredient.com — that’s with the number three, and that’s the title of the new book. That’s still a young website; we’re still building it out, so go over there and check it out. And of course, the books are available wherever books are sold.
Pam Harper: And you can access this also by visiting growthignitersradio.com, episode 145.
Jim, you say we can’t count on the Facebooks, the Googles, the Amazons to operate ethically, and it’s up to each one of us. But how is that possible, and how can we make a difference?
Actually, I’m going to say the first thing I know, which is that the first thing we can do, practically, is read your book, because we have to get that awareness. So that’s the first thing. I’m going to just throw that right on out there. What else, though?
Jim Blasingame: I have a little two-act vignette that I like to tell people, and it helps people understand the difference. It’s what I call “what’s more dangerous than an algorithm, or a robot?”
Jim Blasingame: The first act: Scott, you’re sitting at an alfresco café drinking a latte, and you’ve just found the sexiest app you’ve ever seen in your life. This is the most awesome app; you can’t wait to download it, and you’re just about to push the button-
Scott Harper: Yeah.
Jim Blasingame: To download it, and somebody walks up and interrupts you, and says, “Excuse me, sir,” and you look up and say, “What is it?” You’re a little irritated, and he says, “You know Jim Blasingame, don’t you?” And you say, “So what if I do?” And he says, “I understand that you know him. I want to contact him. I want you to give me his contact information, which I think you have.” Now — freeze frame. Are you going to give him my information, Scott? You don’t know him. He just walked up. No? Why aren’t you going to do that?
Scott Harper: No, because I don’t know who he is. I might take his information, and convey it to you, but I’m not going to give him yours.
Jim Blasingame: But you wouldn’t give it to him, because it would be unethical. Because I would expect you not to; because you would expect me not to do the same thing.
Pam Harper: That’s right, trust.
Jim Blasingame: All right. That’s called the analog story. Now, let’s roll the tape again into the second act, whereupon you turn aside, and you look at your phone, and on the screen it says, “If you download this app for free, we’re going to come and get all your stuff — your contacts, your location, your photos — we’re going to come get all your stuff.” And you hit the button anyway and download that app, like most of us have done, many times. And you just gave away my contact information, and thousands of others’ — whoever is in your phone. Whoever’s listening to this, we’ve all done it. You’ve just given all of that information, and access to your phone, to this person you don’t even know who built this sexy app. What just happened?
Scott Harper: Sounds like Cambridge Analytica.
Jim Blasingame: That’s exactly right. Yes, that’s a perfect example. In a matter of seconds, we passed the analog ethics test and we failed the digital ethics test.
Scott Harper: Oh, my.
Jim Blasingame: In a matter of seconds. See, this is because we don’t think about that. So here’s my point; you mentioned the Cambridge Analytica thing. What I just said about access to my phone — that’s actual language from an app that I wanted, that I was going to download on my phone. And I didn’t, because the app didn’t require access to my contacts — that wasn’t what the app was all about — but they were going to come get them anyway. I read up on it, and I found out what they were going to do, and I chose not to download that app. So, Pam, you asked what we can do. We can say: “Just because I can doesn’t mean that I should.”
Pam Harper: Okay, and what I also hear is we have to read those agreements, we can’t just gloss over them; they’re all in legalese, although, now with GDPR, maybe a little better, a little clearer, but we have to know that it even exists that way. So definitely read before you press the enter key.
Jim Blasingame: Can I say one thing real quick? How many people are mad at Facebook for letting Cambridge Analytica do what they did, when the truth is the real culprits were the thousands of people who downloaded that app without thinking about it?
Pam Harper: So we have to assume our own accountability and responsibility. It’s a partnering of sorts.
Jim Blasingame: That’s right.
Pam Harper: Jim, a number of our listeners are leaders of companies that are intent on being the disrupters, not the disrupted. What kinds of conversations should we be having about this?
Jim Blasingame: Pam, that’s the most important question of the day. Going forward, I think we all have to realize that if we don’t solve this digital ethics issue, there’s going to come a time when the economy is, I think, going to grind to a halt. We’ve got two issues going on right now in America. One is, we don’t have enough workers. If we don’t find a way to have enough workers, I think that’s going to hurt the economy.
And if we don’t find a way to solve the digital ethics issue, the same thing is going to happen. Now, it’s not going to happen tomorrow, but it’s going to happen in the next few years. And so as leaders of companies, we all need to say, “Okay, as we increasingly use artificial intelligence in the way we go to market, let’s make sure that our technology has an ethical component, that we haven’t been asleep at the switch with regard to ethical behavior, and that we require that of the people we do business with and the companies that we serve, and who serve us.”
Here’s one more thing — I’d like to see companies become digital ethics thought leaders. I’d like to see companies writing white papers and blog posts about their digital ethics, and what they’re doing. See, I think this is going to become a standard. You’re familiar with the quality movement? I believe there’s going to be an ISO-type requirement — standards that will have to be met. If you’re going to do business with me, you have to show me you’ve got a certain ISO 9000-type digital ethics standard, and that you comply with it. I think that’s what we’re going to do in the future.
Pam Harper: That’s a very important point. I want to repeat, again — because I love your question — which is, just because we can do something, should we?
That’s where we’re going to leave this discussion; there is so much more that we need to talk about, but the time has gone by. Jim, thank you again so much for being our guest today.
Jim Blasingame: Thank you for having me, Pam. Thanks, Scott. I’ve enjoyed it very much. Keep up the good work.
Scott Harper: Thanks, Jim, and thanks to all you out there for listening to Growth Igniters Radio with Pam Harper and Scott Harper. To check out resources related to today’s conversation, share on social media, read Jim’s bio in the episode transcript, or open a conversation with us go to growthignitersradio.com and select episode 145.
Pam Harper: Until next time, this is Pam Harper…
Scott Harper: And Scott Harper…
Pam Harper: Wishing you continued success, and leaving you with this question to discuss with your team.
Scott Harper: What new conversations about ethics and trust should we be having as we develop and use our own digital technologies?