The Future is Ethical - Tristan Harris, Center for Humane Technology

44:23
 

Welcome to the third season of Mixed Methods!

This season will be full of people and conversations aimed at helping researchers think more deeply about their practice. In this episode, we’ll hear from Tristan Harris, a world renown design ethicist first at Google and now at the Center for Humane Technology, which he co-founded. The Center’s mission is to make technology more humane by starting a conversation about the ways in which tech often ends up unintentionally harming users. Tristan offers context and suggestions for how researchers can not only make products usable and useful, but also ethical.

Listen Now

Interview Transcript

Aryel Cianflone: Welcome to the third season of Mixed Methods. This season will be full of people and conversations aimed at helping you think more deeply about your research practice. I decided to spend the first half of the season exploring the future of research. Today, in part one, we'll hear from Tristan Harris, a world-renown design ethicist, first at Google and now at the Center for Humane Technology, which he co-founded in 2018. The center's mission is to make technology more humane by starting a conversation about the ways in which tech often ends up unintentionally harming users.

Aryel Cianflone: I can't think of a more important topic for us to consider as researchers. As we continue to advocate for our users and as our field continues to grow, I see researchers becoming a powerful force for not just making our products usable and useful, but also ethical.

Aryel Cianflone: Today's episode is brought to you by Dscout, the tool that enables teams to do in-context field work without leaving the office. Dscout connects you with people via their smartphones and allows you to handpick recruits, design diary studies, conduct live interviews and access the moments that matter. Learn more at Dscout.com/mm.

Aryel Cianflone: This is Aryel Cianflone, and you're listening to Mixed Methods. Today's episode: the future is ethical.

Aryel Cianflone: It's such a privilege to have the chance to talk with Tristan Harris. I've been following your work for so long. I thought just to get started, maybe you could briefly introduce yourself and your work at the Center for Humane Technology.

Tristan Harris: Yeah, sure. Thank you for having me. So oftentimes, when I tell the story of my background, it actually starts when I was a magician as a kid because magic is a very different way of looking at people. I mean I did it because I was a shy kid looking for easier ways to connect with people and to have an excuse to talk.

Tristan Harris: But I was always astonished when I look in the mirror and you're doing like a coin trick or something like that, and you're doing something incredibly simple like you think that you're passing the coin from one hand to the other, but you're not. It's just so, so, so simple. And I would watch how something that I was for sure thinking would not make it through the deception filters of the other person's brain, but it would work every time. And that led me to realize that we have this sort of overconfident view of how our minds make meaning and how they see the world and how cause and effect can be manipulated. And so that was my first entrée into really seeing that the mind works differently than we think. I did a magic show when I was I think in sixth grade for my elementary school.

Tristan Harris: And then, jumping way ahead, I later studied at this lab at Stanford called the Persuasive Technology Lab that is somewhat famous now maybe. Contrary to popular belief, it's not a lab where they diabolically train you in how to manipulate human nature. It's just a lab that studies everything that we know about the psychology of persuasion. It's like taking an advertising class or taking a rhetoric class. I mean those are the conscious levels of language persuasion, but then really, you go up and down the stack, and you get clicker training for dogs, Pavlovian rewards, you get ...

Tristan Harris: So the founders of Instagram and I were both in that class, Mike Krieger. There, I learned a lot more about social psychological persuasion, that if you lined up all the tools of your persuasive weapons or arms, there would be nudging and subtle color changes and things like that on one side, which are the very weak persuasive tools, most like behavior change, nudging. But if you go deeper, you get social persuasion.

Tristan Harris: That's where social media gets really dangerous is that it taps into our social meaning-making and social validation and approval, social reciprocity. That's like email where I feel like I have to get back to all these people or I got tagged in a request, I have to answer that request. I think that that person invited me on LinkedIn, I have to get back to them. These are really persuasive in a whole new order of magnitude. It's like going from non-industrialized weapons to industrialized weapons.

Tristan Harris: So that was my foundation for thinking about how technology is working. It's persuading us at a deep level that's often invisible. In the same way that only magicians notice what other magicians are doing, seeing the world through a lens of persuasion is a very different way of looking at technology.

Aryel Cianflone: Partially, I feel like listening to your description of our kind of inherent human weaknesses, I feel like those are on the other side as well with technologists where we often overestimate individuals' behaviors or individuals' capabilities to kind of make a choice for themselves, even though we already have kind of stacked the deck against them.

Tristan Harris: Right. Yeah, it goes on all sides. I mean the fundamental thing that we're trying to do is say: human nature doesn't work in the way that we think it does. It's not that we're weak or that we're ... It's like we're a weak little race or something like that. It's not that. It's just that, and this was in the beginning of our big April 23rd event that we held in San Francisco, and the focus of all of our work is this E.O. Wilson quote, who's the father of sociobiology: that the fundamental problem of humanity, that we as a species have to figure out, is that we have paleolithic emotions, we have medieval institutions and we have God-like technology. So we have ancient paleolithic brains, we have 18th century governance, and we have nuclear weapons, Facebook, and narrative warfare.

Tristan Harris: So this ... And they operate at different clock rates. That's the important part about this quote, is that your brain is fixed. It's like running Windows 95 and never getting an upgrade, right? So it happens to work at its base level in a certain way. And then the medieval institutions get an update every four years when you get some new people elected or something. But then our tech is ramping up at an exponential rate. And so whatever issues we're worried about now with social media are small compared to the speed and acceleration of things that are coming.

Tristan Harris: So when you zoom out and say, "Okay, this is really the problem we have to solve," which is our paleolithic brains, which were built for gathering berries and being with tribes in the savanna or whatever, are not built for climate change. Imagine, we used to make this joke, Aza and I, that looking at climate change as a human being is almost like a deer looking in the headlights. It's just too big. It's too big for our brains. And when you look at it as a technology designer and say, "Okay, with this design choice, I'm going to affect two billion people." Show me the part of your brain that was evolved to give you the capacity to imagine what would happen to two billion different people with 200 different languages and different cultures if I make the newsfeed work this way or that way. We just don't have that function in our brain.

Tristan Harris: And so ultimately, this is about recognizing how to realign technology with our own minds and limits. And the good news about this is that we're the only species that even has the capacity to study and understand our own limits, like that we can understand the ways that our minds get deceived. It's not really just about deception. When I say it this way, you might think, and often people think, and the TED people, when the first TED talk came out, the first title they gave it was: the manipulative tricks that companies play on your brain, which makes it sound like it's this kind of lightweight coin trick that maybe LinkedIn or Facebook or Snapchat are just like fooling you here and there. And this is just so far from the truth.

Tristan Harris: It's really more like this sort of civilizational mind influence, mental influence machine, which might sound too aggressive until we go into the details of why that's actually true. But think of it this way, that two billion people wake up in the morning and from the moment you wake up and your phone's buzzing at you from your alarm and you have to turn it off, to the 80 times during the day you check your phone, to the moment you set your alarm at night and then actually still end up checking social media anyway before you go to bed because we all do it, we have people completely jacked in.

Tristan Harris: I mean more than the size of Christianity in terms of number of people are jacked into Facebook. Facebook has 2.3 billion users. That's more than the size of Christianity as a, just a comparable psychological footprint. YouTube has more than 2 billion. So, well, we say it's more than the size of Christianity, more than the size of Islam. Nothing about the content of those religions. But people don't even have empathy for what that really entails. Everything from "I'm late to my morning meetings." That thought doesn't just show up in your mind. It shows up because email and calendaring together make up the psychological construct that gets pulled over your eyes like the Matrix.

Tristan Harris: Once you see it that way, you have a very different view of what has to be done, which is to say, in 2019 with impending threats of climate change and inequality and other things, what ought to be a sense-making and choice-making layer that we should pull over our eyes, and what is our obligation as designers to do that?

Aryel Cianflone: I feel like there's this question for me of what is the individual's responsibility and what is the responsibility of the individuals that make up these organizations because these same biases exist in both, and I wonder how you think of that. Is it the individual's responsibility to turn off notifications or is that just, it's so small, it's just like a drop of water in the ocean of distractions that we have today, that I wonder how you think about kind of the responsibility on either side?

Tristan Harris: Yeah, well, I mean it's, the thing that creates ... I mean once we understood behavioral science and behavioral economics and we started realizing that people don't just choose their way through life because that would take a lot of energy and research and being informed and all that, but instead, most people operate by the default setting, so they don't even know that there's a different option, right? That's true at a very deep level, like a spiritual level even. You just sort of wake up in your identity instead of saying, "Oh, what would be a different item I could choose from the shelf space of my mental identity I want to try on today?"

Tristan Harris: But it's just true at a very deep level. The simplest example of this came from driver's license studies, right? If you give people the option to become an organ donor, and you look at this chart of, I think it's like 100 countries ... In the majority of countries, people do not become organ donors, and there's a small number of countries, mostly in Scandinavia I think, that do become organ donors, and you ask, like, what's the difference? Are they just more generous? Are they better? Are they a more charitable and other-centered, empathic, compassionate country with a different culture? Or is it actually just that those are the countries where, in the driver's license registration form, the default setting is that you give up your organs in case you're in a car accident.

Tristan Harris: This just shows how much the world is really run by default. And so when it comes to technology and the fact that most people don't really question the technology that they're given. Their phone just shows up in their hand, and they hand it to their kid, and they hand YouTube to their kids, and they don't know how and why it's designed or if YouTube has ... I mean surely, it's not designed by evil people. But they're not going to change the default settings. And so one of the simplest standards that we can apply to technology is: what is the default setting that I would happily design and give to my own children to use?

Tristan Harris: One of the famous lines that we use is: the CEO of Lunchables food didn't give his own children Lunchables, right? It's a billion-dollar-a-year food product line for kids, and he didn't let his own kids eat it. So it's a very simple moral standard to say: what would we want our own children to use frequently? And designing products for that standard would eliminate half of the problems we probably see in technology today.

Aryel Cianflone: And what do you think are, kind of, the challenges that are preventing us from doing this now, or why haven't we just naturally made the default something like that?

Tristan Harris: Well, I think the real question you're asking is: why aren't we just designing what's best for people?

Aryel Cianflone: Yeah.

Tristan Harris: Why aren't we even thinking that way on a daily basis?

Tristan Harris: So the first thing is the advertising business model. When I say advertising business model, it's probably better to reframe it as the engagement business model because it's not the rectangle of the ad that's the problem. It's the incentive to say, "I cannot allow your brain to be free from your relationship with me. I, like a drug dealer, must create an obligatory loyalty relationship where I have to crawl down your brainstem and create that loyalty where you have to come back every day."

Tristan Harris: Now critics of our work would say, "Oh, come on. Aren't there times when people consciously want to come back to something every day?" In my worldview, everyone is manipulated all the time by everything. I think the point is that people don't realize the extent to which this manipulation exists and how long it's existed. In the attention economy, it used to be that we had to get your attention, so everyone's competing to nudge us, and notifications and infinite scroll and things like that are just light tricks to keep you there. But that wasn't enough. It's much cheaper if, instead of trying to get your attention, I can get you addicted to getting attention from other people.

Tristan Harris: And so that means if I add, for example on Instagram, the number of followers you have. So now, you have a reason to come back every day to say, "Well, did I get some more followers?" More importantly, it creates the social dynamic where now, people are following each other all the time, there are always new followers, and they're always curious, and you're always following other people. And so suddenly, you create a whole culture of narcissism where everyone is addicted to being an influencer and having attention from other people.

Tristan Harris: That's what this race for attention is about. That's why the engagement or advertising economy is so problematic. It's not because of evil designers wanting to diabolically manipulate your brainstem. It's not that at all. It's just, the banality I think makes it even more sinister, which is that it's good people who are caught in a game theoretic race to the bottom. We call it the race to the bottom of the brainstem, to light up more and more parts of your nervous system because if I light up more parts of your nervous system than the other guy, I'm going to get more of your attention.

Tristan Harris: The problem is that this creates a connected system of harms that we have to recognize as one whole system like climate change. Like imagine a world where climate change is happening, and people just don't see it. They only see polar bears. They're like, "Oh, my God. We lost all these bears. We lost another polar bear." That's what I see when people talk about screen time. Talking about all of these issues of the attention economy in terms of hours on screen is like talking about the number of polar bears with climate change, instead of talking about a billion climate refugees, permafrost melting, methane bombs, a whole bunch of dark stuff that's really a serious risk.

Tristan Harris: So with technology, those interconnected harms are shortened attention spans, downgrading our attention spans, downgrading civility because outrage works better. Why does outrage work better and why is polarization happening? Because in the attention economy, short-burst uses of your attention, so quick, short, brief attention bits, are going to be better at getting your attention than demanding, like, do you want to sign up for this next two-hour-long chunk? That's harder. So that means that we're in this race to the shorter, briefer thing, which is why Twitter won that race to the bottom in terms of brevity.

Tristan Harris: But the problem is the world is increasingly getting more complex. So to talk about anything that matters at any level of richness or productivity or constructiveness would take a long chunk of discussion to get to that complexity. But instead, you have the presidential debates where you say, "Iran, North Korea, and nuclear weapons. What is your answer? 30 seconds," right?

Tristan Harris: And so what that intrinsically creates is polarization because now, if I can only say simple things about complex topics, I'm only going to get some small percentage of people agreeing with the simple thing that I said because it won't map to the full complexity of the underlying territory. And so there's this whole interconnected system of harms that we call human downgrading. But just think of it like social climate change. It's an interconnected system so that shorter attention spans equals more polarization, more outrage, more fear, more isolation because it's better for the attention economy to have you by yourself on a screen addicted, spending more time with your esophagus compressed at 45 degrees, not breathing, and then more isolation means you're more vulnerable to conspiracy theories. Conspiracy theories work better in the attention economy anyway, but now that you're isolated, they work even better. Saying the media is lying, which YouTube by the way does, not intentionally, but it discovers that there's this pattern that saying the media is lying is actually really good for YouTube.

Tristan Harris: Think about the perfectly omniscient, brilliant AI of YouTube. Imagine some future 10 years down the road. It doesn't know why the phrase the media is lying is good for watch time, but if you say over and over again, the media is lying, intrinsically, that means people don't go to regular media channels, and they're more likely to spend more time on YouTube. So if you zoom way out, distrust in institutions is actually also good for these AIs that are calculating what's good for us or what's good at keeping our attention. Critical distinction.
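
As a rough, purely illustrative sketch of the dynamic Tristan describes here (the names, data, and scoring below are hypothetical, not YouTube's actual system), a recommender that ranks candidates only by predicted watch time will promote whatever correlates with longer sessions, with no model of why:

```python
# Toy sketch (hypothetical): rank candidate videos purely by predicted watch
# time. The ranker has no notion of *why* an item holds attention, so content
# that happens to keep people on-platform rises to the top.

from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    predicted_watch_minutes: float  # output of some engagement model

def rank_by_watch_time(candidates: list[Candidate], k: int = 3) -> list[Candidate]:
    """Return the top-k candidates by predicted watch time alone."""
    return sorted(candidates, key=lambda c: c.predicted_watch_minutes, reverse=True)[:k]

candidates = [
    Candidate("calm-explainer", 4.2),
    Candidate("outrage-take", 11.8),
    Candidate("the-media-is-lying", 14.5),  # correlates with staying on-platform, so it scores highest
]

for c in rank_by_watch_time(candidates):
    print(c.video_id, c.predicted_watch_minutes)
```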

Tristan Harris: So that's, zooming out, what's going on here, is that we have ... The problem really emerges from a race to capture human attention, and because there's only so much, and it takes nine months to plug a new human being into it and grow the size of the attention economy is the joke, you have to get more aggressive, and it becomes, you have to frack for attention, and so it's better off having you split your attention into four different streams, so now you're partially paying attention to your tablet, your TV and your email and your Facebook at the same time. So now I'm selling slimmer slices of your attention to more advertisers so I can quadruple the size of the attention economy, but it's kind of like the subprime markets where I'm selling fake clicks, fake users and fake attention to advertisers, and so this just isn't good for anybody.

Tristan Harris: And because the attention economy is beneath all other economies, it's beneath all cultures, it's beneath the regular economy because where we spend our attention is what makes up our politics, our elections, our mental health. Even when you're not looking at the screen, your attention spans have still been affected by the attention economy. I don't know about you, but most people I know can't even get through books any more because we can barely focus for long periods of time.

Tristan Harris: So this is a huge problem that, again, like social climate change, it's the climate change of culture. But the good news is that ... The bad news is that like climate change, it can be catastrophic. The good news is, and this is why we're here, is that only about 1,000 people in Silicon Valley have to change what they're doing to prevent all this from happening.

Aryel Cianflone: Yeah, and I do feel like obviously, as you go through this, it's overwhelming, it's so problematic, it's scary. It's all of these things, but-

Tristan Harris: And notice that too, like that reaction, right? So there we are, our brains, like on the savanna 20,000 years ago, were they evolved to hear the sentences that I just-

Aryel Cianflone: Yeah, the negativity bias that we have.

Tristan Harris: Yeah. Right. Well [inaudible 00:18:56] negativity-wise, but also like we just laid out a whole complexity. Were our brains evolved to see huge amounts of complexity and say, "Yeah, let's go do something about that"? Or are our brains evolved to say, "There's a whole bunch of complexity that's negative. I'm going to shy away and put my head in the sand, and go back to watching Netflix because that was way too scary"?

Aryel Cianflone: Yeah, take the ostrich approach.

Tristan Harris: Yeah. Yeah, and so recognizing that, though, we're the only species that could notice that that's what happened in the face of such complexity and negativity and say, okay, so what tends to be the kind of thing that makes people feel solidarity and gives people agency, that converts that learned helplessness into agency? That's what we need to get really good at.

Aryel Cianflone: So my next question is, so often when I hear you speaking, you really focus on designers and the design process. As someone in design research, researchers have got to be such an important part of this because we're the ones who bring humans into this whole equation and expose them to these product teams and these different types of organizations, but yeah, what do we do?

Tristan Harris: Well, notice obviously, because of the business model, I mean even my story, right? I didn't finish ... I mean I guess the other chapters of my life where I landed at Google as a design ethicist, actually through them acquiring a company that I was a part of called Apture. I landed on the Gmail team. And even with Gmail, so I was a product manager on Gmail. Even there, Gmail's business model is not "Let's get people hooked to email, and they can't stop checking and maximizing screen ... " They don't do that at all, right? But there are some innocuous metrics that say, "Well, we do care about how engaged people are, and we certainly want more Gmail users." And the main reason for that is the thing ... How much more money do you think Google makes off of a signed-in user versus a non-signed-in user?

Aryel Cianflone: On Gmail?

Tristan Harris: Sorry. On search. Google search.

Aryel Cianflone: Oh, I've never thought about it. Thousands?

Tristan Harris: So the point is that a personalized search makes more money off of a search than a non-personalized search. And guess what the number one reason why you're signed into Google for a personalized search might be?

Aryel Cianflone: Because of your Gmail.

Tristan Harris: Because of Gmail. So that sets up a reason, a business rationale, for saying, "We need you to be jacked in." Now it doesn't mean we want ... There are, again, no addictive designers at Gmail trying to get people to do this, but let's say per your question, there we are. We're UX researchers on the Gmail team. We're like, "Okay. Let's minimize how much time people spend on this thing. Let's give people the most peace of mind. Let's broadcast or make transparent within an organization people's level of email overload compared to usual, so that when you type in someone's name and it auto-completes the pill of their contact into the address field, it shows a little color saying how overloaded they are and what their average response time is." So that would cool off some of the intensity of expecting immediate responses and all that stuff.
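
Purely as a sketch of the kind of overload indicator Tristan imagines here (the thresholds, field names, and color mapping are assumptions for illustration, not a real Gmail feature), the core logic might look something like:

```python
# Hypothetical sketch of the "email overload" indicator described above:
# when a contact autocompletes, show a color reflecting how backed up they
# are relative to their own baseline, plus their typical response time.

def overload_color(unread_now: int, unread_baseline: float,
                   avg_response_hours: float) -> str:
    """Map a contact's current backlog vs. baseline to a rough indicator color."""
    ratio = unread_now / max(unread_baseline, 1.0)
    if ratio > 2.0 or avg_response_hours > 48:
        return "red"      # far more overloaded than usual; expect a slow reply
    if ratio > 1.3 or avg_response_hours > 24:
        return "yellow"   # somewhat busier than usual
    return "green"        # roughly normal load

# Example: a contact with 120 unread vs. a usual 40, averaging 30-hour replies.
print(overload_color(unread_now=120, unread_baseline=40, avg_response_hours=30))  # "red"
```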

Tristan Harris: Well, there are a bunch of things that they could do, but ultimately, I think, especially if you're a designer of one of these social media engagement products, you don't really get that choice to minimize how much time. I mean imagine Facebook could just be about helping you spend time with your friends. It could just be that. That's it. It could help people who want to go on dates find salsa classes. It could help people who want to be at dinner tables with rich conversations be at the next dinner table. It could help people organize climate resilience, urban farms and gardens in their cities. It could help people do all these things. Just like totally empowering, strengthening the social fabric outside the screen. But what's the reason why Facebook doesn't do that?

Tristan Harris: Their profits comes from ...

Aryel Cianflone: Advertisements and time spent on, yeah.

Tristan Harris: Advertisements, which therefore means time on the screen. So this is where it's so invisible. It's like if you talk to the Facebook policy team or Nick Clegg or Zuckerberg, and they'll say, "Well, we tested it without the ads, but people like the ads." The point is it's not the ads. It's that the incentive of keeping you on the screens at all is what is exacerbating this whole problem.

Tristan Harris: Now, it's important we also say that even if you're a designer or a UX researcher at Netflix, your business model is not keeping people on the screen. It's the subscription. You just have to pay that $8 a month, but they still maximize for watch time because that tends to be correlated with whether or not you'll cancel. Overall, we just don't want a system where each company and product are maximizing for engagement. Maximizing in general is a bad optimization.

Tristan Harris: And the thing that could give people hope here is that companies like Apple who make iOS and specifically, the Android team at Google, or Siempo, the alternative Android launcher, and other alternative launchers for Android, are in a position to redesign the incentives of the attention economy. So imagine they just kick out of their app store everyone who's trying to maximize screen time, saying that's just like a fossil fuel company. We don't want those in our attention economy. Those are the extractive attention economy, and we want to only include the ones whose business models are helping people get to the places that basically make life worth living.

Tristan Harris: Now that sounds normative or judgmental on my part, but only until you unravel all of the incentives and show how much every single app's design is basically self-dealing and extractive for their own interest. A quick way to get through this is just to ask people, for say, LinkedIn or Twitter or whatever or Facebook, what are the most lasting, "I would be proud of that on my deathbed" sort of choices that they tend to enable, right? Like maybe with LinkedIn, it's like, "I found that job that I was really looking to get," and with Twitter, it was, "I was in the same city during that conference, and I ended up meeting up for drinks with my intellectual hero, and they were there." That's happened to me once, right? Or with Facebook, it's like, who knows. You discover someone introduces you to someone for a date or something like this. And these things are great. And it would be great if the products were designed just to wrap around and empower and strengthen those experiences, but that's just not what they're designed for.

Aryel Cianflone: Even when you are in the field, you can't be with your participants 24/7. But there's one thing that can be: their smartphones. Dscout is a remote research platform leveraging just that, which saves you from missing the moments that matter. Set up a diary study and see your participants' daily lives in context. Use Dscout Live and conduct interviews on a platform actually built for research. Bring your own participants onboard or handpick from their 100,000-person scout pool. To start connecting with more people more impactfully, head to Dscout.com/mm.

Aryel Cianflone: Well, and just on, I think my question, again, is like how do we get to a point because it feels, sitting here and listening to this, I feel like I'm also seeing this amazing future that could be that there is so much potential and so much promise in these different technologies that have been developed, but it also feels a little bit upside down of where we're at today, right? It's hard to imagine how we get there.

Tristan Harris: And how does that feel to notice that?

Aryel Cianflone: How does it feel? It feels uncomfortable a little bit. It feels sad. It just feels like I wish that we could be better, but humans are so prone to short-term thinking, right? And all the humans that make up these organizations are the same way, right? They all have profit targets to hit. They all have stakeholders. They all have OKRs and different things that they want to hit. How do you get people to so fundamentally change?

Tristan Harris: Well, you said something interesting there, which is that humans have short-term thinking. It's true. We, in our own nervous system, are optimized for short-term immediate rewards. But what you really said is that the incentives are for that, and when you especially have publicly-traded companies, the pressures for short-term growth et cetera, make it impossible to make the kinds of deep structural changes we're talking about. I feel uncomfortable raising this conversation sometimes because it's a possible world. There's nothing that is technically infeasible about what we're describing. And by the way, the more you lean into it, it's kind of amazing. It's a world where you can trust-fall into technology, and its sole purpose and design is just to be-

Aryel Cianflone: Is to help you.

Tristan Harris: ... helping people. They're literally like bumping each other's elbows out of the way being like, "No, no, no. I have an even better idea about how to help Aryel." I want people to really imagine what that would feel like because that's at least the north star I think we could be aiming for.

Tristan Harris: Now between now and then, a lot of things have to happen, and the uncomfortable thing is the fact that we can't just snap our fingers and switch to doing that other thing. But if corporate boards were pressured by their shareholders, which happens, to say, "We have to get off this business model because we're seeing it as a long-term source of investor and shareholder risk," which it is by the way, because basically, all these companies, if they're incentivized by attention and data, it creates the long-term risk for privacy scandals and for people being aware that these companies' interests are not aligned with ours and the public perception issues. If that starts to spread, as it's already doing with Facebook and Cambridge Analytica and all these kinds of things that are tying themselves into knots, it's only a matter of time before any company that's in that business model is going to have a problem.

Tristan Harris: YouTube with maximizing watch time is going to have a problem. Twitter, Facebook, et cetera, Snapchat. And the ones ... So you could imagine a world where through corporate board resolutions, through shareholder activism, through policymaking, Paul Romer, the Nobel Prize-winning economist, proposed recently a ... I think he called it a progressive advertising tax, where you basically tax companies for having the advertising-based business model. The whole point is just like fossil fuels transitioning to regenerative or renewable energy, you want to increase the cost of the thing that's causing the harm and lower the cost of the thing that we all want to live in, that will not destroy civilization.

Tristan Harris: That is the game that has to get played. I think the thing that people on the inside of these companies need to understand is that you all have a voice and a role in that. I mean I was inside of Google, and I didn't know how these things would ever change. I mean I literally ... I saw some of these things, and I made this famous presentation sort of in ethics [inaudible 00:29:44] to me becoming a design ethicist, it basically was a call to arms to the whole company saying we have a moral responsibility in managing a billion people's attention and especially this race to the bottom for attention.

Tristan Harris: I was super nervous in making it and then in releasing it. I worked on it for about a month. It was like a 136-page slide deck without any bullet points or anything, just big images. It's actually available now on minimizedistraction.com. Someone had found it and posted it there. So you can see it. But it was an artifact that I sent around to the whole company. Actually, it's not true. I sent it to just 10 people, and then it ended up spreading virally.

Tristan Harris: I really want to say I was incredibly nervous about what would happen, and I was nervous about the quality of my ideas because I thought I could be wrong, surely, if this was true, then-

Aryel Cianflone: Other people would've brought it up.

Tristan Harris: ... other people would've brought it up. This is when you realize there's no adults in the room. The obvious thing here is it's a kind of capitalism, sort of limbic capitalism, where the paper clip maximizer is pointed at the brainstems of people, and that's why no one talks about it, because it just goes against incentives. You know the famous Upton Sinclair quote: you can't get people to question something that their salary depends on them not seeing.

Tristan Harris: Anyway. This thing spread, and lo and behold, to my surprise, it spread to something like 10,000 people at Google. It ended up on an internal thing called Memegen, which is the culture tracker. People post memes and vote them up. It was the number one most-voted meme, like saying we want to talk about this at TGIF.

Tristan Harris: It led to me becoming a design ethicist, and then it led to, again, me trying to ... I was hosting lunches and meetups and conversations at the company and organizing round table discussions. Honestly, it didn't go further than that because the business model was still entrenched. It wasn't that anybody said, "Hey, we still need to make money. We can't do that." It was just that's not a priority. Sure, we can make people use Android a little bit less or check your phone a little bit less, but it's just not our priority.

Tristan Harris: And I had to leave to create that outside pressure for what we now see is an unprecedented level of change, and not just because of our work. There's been so many people that have been out there. Roger McNamee, Sandy Parakilas, and [Zeynep 00:32:00] and all these great organizations and people who've been raising the alarms, but it's led to now Apple, Google and Facebook adopting in our case, Time Well Spent for their ... The reason everybody has the chart or graph of where their time goes on their phone is because of this pressure.

Tristan Harris: The point isn't that it just happens from the outside. It's actually because it creates a conversation. So you show up on Monday at a product design meeting at Twitter or Facebook, and people say, "Hey, have you heard of the attention economy and the race to the bottom of the brainstem, and what are we doing about this? What about Time Well Spent? What about this TED talk?"

Tristan Harris: I think that if everybody can raise this conversation and you hear it three times in one day, it becomes pressure because if you think about it, where does pressure exist? I mean memetically, like, where's the pressure for Time Well Spent? Where are the atoms? Does like a big pickup truck show up and start pushing on the walls and putting pressure on the buildings?

Tristan Harris: The pressure exists through people repeating something and feeling like it's really hard to go to work if we're not doing this other thing that I know is possible. It turns out to be a lot more expensive for companies to replace demoralized employees than it is to start doing more of the right thing because people need to feel incentivized to do something different. I mean they started launching these meaningful social interactions metrics at Facebook, which are basically the Time Well Spent metrics. Like how are we measuring something other than just engagement?

Tristan Harris: And if there's enough pressure, everybody who's listening to this podcast can actually raise this conversation themselves. And the reason why we focus so much on language is that we have to have language that describes this problem statement, this interconnected set of harms that we call human downgrading, you can call it social climate change in technology, so that we're not just solving addiction. We're not just solving screen time, which is just not the right way to think about it. We don't want to downgrade our civilization's capacity at the time we most need that capacity.

Aryel Cianflone: Well, and it sounds like you have seen some large successes in terms of Apple and Google and Facebook implementing these Time Well Spent features. I wonder if you've heard stories from individual designers who have heard your message and have gone out and done something different with their product teams, or I'm really thinking of the UX researcher who listens to this, and then they go into work, and they want to do something about it. What is the best way to start that conversation or find success in this?

Tristan Harris: I think also, to answer your question, I think people get blinded by wanting to do the absolute good when they could just be asking, what's the smallest step I could take in this direction tomorrow? What's a small step? What's the smallest way that the product can be nudged and designed towards these outcomes? What's the metric that I can introduce? What's the way I could have this conversation? What's the set of design principles that might eliminate the problem?

Tristan Harris: I don't have all the answers. I mean we really want to encourage all of the people listening to this to think for themselves about in what way would you nudge the company that you're working for in this direction?

Tristan Harris: But to do so forcefully, I mean I think this change can happen only with 1,000 people together realizing that no one actually wants this. I mean that's the thing. It's not like when you see this, anybody's excited about the mass downgrading of attention spans, the mass dumbification, stupification of society. And if that's not enough to motivate you, by the way, not to make it the dark side, but China will choose not to downgrade their population. So it's a competition between the West and China about which country will downgrade their population the least, given these dynamics.

Tristan Harris: I think that that can motivate all of us, much like climate change can, at saying: what are we going to do to upgrade our capacity? That's something that actually I regret, by the way. We haven't yet sufficiently introduced the positive frame for the opposite of human downgrading. With Time Well Spent, we saw that people were repeating this positive phrase. I mean the beauty of that, as an example to answer your question, we had, let's see [inaudible 00:36:09] ClassPass and Pinterest and all these companies walking into work and saying, "Hey, guys. We want to be part of Time Well Spent, and we want to ask our engineers and designers what's the metric that you're going to invent, what should that be? Let's have a conversation about what are the design features that we're going to do differently? What are the things people find most valuable and lasting and time well spent for their lives?" It was a positive message and frame that created a lot of interest and implementation.

Tristan Harris: We don't yet have that frame and phrase for the opposite of human downgrading. We hesitate to call it upgrading humanity because it sounds techno-utopian, which is the same mess that got us here. We've been talking about upgrading humans and human capacity for a long time, and it got us exactly to this place where things have been going pretty bad. So whatever moral framework we need, it's certainly ... has a ... It's covering a missing blind spot that we didn't take into account before.

Aryel Cianflone: Well, and Tristan, I want to go back to something that we were talking about at the beginning, which is kind of this difference between individual responsibility and the responsibility of individuals within the organization because I feel like it's such an important point, especially for researchers who are interacting with both of these groups for so much of their time. So is the idea that ... I guess thinking about somebody going into a research session, right, with a user, what is the question that you can ask that individual to understand or to help your product team understand and see that this is time well spent or this isn't time well spent? Just kind of even spitballing about ideas on that because I feel like researchers have this amazing opportunity to make these things really apparent to large groups of people within their organization.

Tristan Harris: Yeah, Joe Edelman, who is one of my collaborators and was the CTO of Couch Surfing, co-invented this phrase Time Well Spent and a lot of the design methodology from his work at Couch Surfing, where he pioneered a bunch of surveys asking people what would make their time spent with someone hosted at Couch Surfing really meaningful. They actually did a retrospective survey. So six months later, they would ask you after you stayed with someone in Paris for four days, "How was that?"

Tristan Harris: But then they would also do this six-month-later retrospective. They would bring back that person and say, "How do you feel about it now?" They used that long-term six-month weighting signal to rank search results. So if you typed in Paris, and you were a 20-something-year-old woman from San Francisco, you would see often, if you looked at just your click patterns, you'd probably end up clicking on people who were in their 20s in Paris. That would be your default set of choices.

Tristan Harris: But because Couch Surfing was ranking by what people said in the long run, like six months later was the most valuable, they would end up finding these patterns like Joe's example was something like a 50-year-old Iraqi immigrant into Paris was like the person that everybody loved staying with. It was this Iraqi guy who was just super jovial and really heartwarming and charming and giving and loved cooking Iraqi food and taking you out to the local pub in Paris. That's awesome. That's the thing we're trying to figure out here.
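
To make the ranking idea concrete, here is a minimal sketch of sorting hosts by long-term retrospective ratings rather than by immediate click behavior, in the spirit of the Couch Surfing example; the data and field names are hypothetical, not Couch Surfing's actual system:

```python
# Minimal sketch: rank by a long-term retrospective signal instead of
# short-term clicks. Data and field names are invented for illustration.

from statistics import mean

hosts = [
    {"name": "20s host, central Paris", "click_rate": 0.31, "six_month_ratings": [3.8, 4.0, 3.5]},
    {"name": "50-year-old Iraqi host",  "click_rate": 0.12, "six_month_ratings": [4.9, 5.0, 4.8]},
]

def rank_by_clicks(hosts):
    # Engagement-style ranking: follow the default click pattern.
    return sorted(hosts, key=lambda h: h["click_rate"], reverse=True)

def rank_by_retrospective(hosts):
    # "Time well spent"-style ranking: weight what people said mattered six months later.
    return sorted(hosts, key=lambda h: mean(h["six_month_ratings"]), reverse=True)

print([h["name"] for h in rank_by_clicks(hosts)])         # the 20s host surfaces first
print([h["name"] for h in rank_by_retrospective(hosts)])  # the Iraqi host surfaces first
```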

Tristan Harris: I think as engineers, we have to watch out for the tendency to try to organize information into these simplistic buckets because life is so beautiful and complex. It just isn't reducible to these things. We need to find ways, even in our own lives, of reflecting: like right now, when you think about the things that have been most transformative or growth-oriented in your life, what metric would have revealed that?

Aryel Cianflone: Oh, my gosh.

Tristan Harris: And yet, so it's hard. And actually, Joe and I worked on a project in 2016. I was in Berlin, and we actually mapped out the components of transformative growth experiences. I actually haven't really talked about this, but I was just talking with another friend this last weekend about these transformative experiences, that oftentimes it's things like meditation retreats or Burning Man or psychedelics or the death of a family member or falling in love or having kids. These are the classic growth experiences.

Tristan Harris: But what they have in common is often holding up a mirror to let you see yourself in a new way. If you spend seven days or 10 days in silence in meditation, that's a really big mirror that you're holding up to your own psyche. And so what makes for the growth and transformation is that.

Tristan Harris: I think technology isn't really showing us menus that reflect the kinds of choices that are these deep, meaningful things. It's kind of reinforcing this microcosm, this tiny subset of that. We say it's like a magician who says, "Pick a card, any card." And then of course, you pick whatever card, and it's like, now you picked a card, it's totally free, totally your choice, but you don't realize that the deck only had a limited set of options to begin with, and as a magician, you stack the deck.

Tristan Harris: Technology's kind of like that. It's like on a day-to-day basis, the choices that it offers are the ones that conveniently fit into an engineering mindset. So Yelp can map the restaurants. There's this mappable set of indexed restaurants. So when you think like, "What should I do?" you're shown a limited index of restaurants, as opposed to I could do cartwheels on the street. I could grab my friend, and we could start singing on a corner and put out our backpack and ask for money. There's just a billion different creative, even creation-oriented options, and we're actually training our minds to think more passively about picking existing items from a tech-limited searchable index menu, instead of the "what could we create together?"

Tristan Harris: If you said to your phone, "Hey, I want to go on a date," it just shows you faces you can swipe through on Tinder. Imagine it says, "Oh, you could go buy face paint from this store right over there and then walk to this bar and set up a face paint booth, and people love getting face paint." Just these totally crazy creative options are showing up nowhere on the menu of technology.

Tristan Harris: So part of this is like a challenge for getting people to ask, in a much bigger ecological sense, for the social fabric, for the things that people find most meaningful in their entire life. Like how could technology be helping us grow? Because the crisis of meaning is the thing that is actually showing up and having repercussions everywhere. That's why Jordan Peterson's so popular. It's like he's giving people an answer to something to do that is an answer to the crisis of meaning, even if it's as simple as, for young white men, just make your bed. I mean people love that kind of simplistic patriarchal advice. People need to feel that there's something to live for. As our menus get confined to more and more consumerist, blasé, vanilla options and we're just choosing between things to click on on screens, we're so far away from that other world.

Tristan Harris: And so my biggest fear is that we forget, as human beings, how to recognize this richer, beautiful menu of choices, especially for the next generation who won't have known anything different, and they will think this is the menu. So all that's to say, I think there's a different way that we can do this, but we really have to ask ourselves and hold up this mirror and say, "What does it mean to be human?" And how do we make ourselves not superhuman, but extra human?

Aryel Cianflone: Thanks for listening today. If you want to continue the conversation, join us in the Slack group. If you aren't already a member, you can request an invite under the community tab on our website, Mixed-Methods.org. Follow us on Medium and Twitter to stay up to date with the latest UX research trends. Special thanks to Denny Fuller, our audio engineer and composer, and Laura Leavitt, our designer. See you next time.

  continue reading

31 에피소드

Artwork
icon공유
 
Manage episode 237367343 series 1383655
Aryel Cianflone에서 제공하는 콘텐츠입니다. 에피소드, 그래픽, 팟캐스트 설명을 포함한 모든 팟캐스트 콘텐츠는 Aryel Cianflone 또는 해당 팟캐스트 플랫폼 파트너가 직접 업로드하고 제공합니다. 누군가가 귀하의 허락 없이 귀하의 저작물을 사용하고 있다고 생각되는 경우 여기에 설명된 절차를 따르실 수 있습니다 https://ko.player.fm/legal.

Welcome to the third season of Mixed Methods!

This season will be full of people and conversations aimed at helping researchers think more deeply about their practice. In this episode, we’ll hear from Tristan Harris, a world renown design ethicist first at Google and now at the Center for Humane Technology, which he co-founded. The Center’s mission is to make technology more humane by starting a conversation about the ways in which tech often ends up unintentionally harming users. Tristan offers context and suggestions for how researchers can not only make products usable and useful, but also ethical.

Listen Now

Interview Transcript

Aryel Cianflone: Welcome to the third season of Mixed Methods. This season will be full of people and conversations aimed at helping you think more deeply about your research practice. I decided to spend the first half of the season exploring the future of research. Today, in part one, we'll hear from Tristan Harris, a world-renown design ethicist, first at Google and now at the Center for Humane Technology, which he co-founded in 2018. The center's mission is to make technology more humane by starting a conversation about the ways in which tech often ends up unintentionally harming users.

Aryel Cianflone: I can't think of a more important topic for us to consider as researchers. As we continue to advocate for our users and as our field continues to grow, I see researchers becoming a powerful force for not just making our products usable and useful, but also ethical.

Aryel Cianflone: Today's episode if brought to you by Dscout, the tool that enables teams to do in-context field work without leaving the office. Dscout connects you with people via their smartphones and allows you to handpick recruits, design diary studies, conduct live interviews and access the moments that matter. Learn more at Dscout.com/mm.

Aryel Cianflone: This is Aryel Cianflone, and you're listening to Mixed Methods. Today's episode: the future is ethical.

Aryel Cianflone: It's such a privilege to have the chance to talk with Tristan Harris. I've been following your work for so long. I thought just to get started, maybe you could briefly introduce yourself and your work at the Center for Humane Technology.

Tristan Harris: Yeah, sure. Thank you for having me. So oftentimes, when I tell the story of my background, it actually starts when I was a magician as a kid because magic is a very different way of looking at people. I mean I did because I was a shy kid looking for easier ways to connect with people and having an excuse to talk.

Tristan Harris: But I was always astonished when I look in the mirror and you're doing like a coin trick or something like that, and you're doing something incredibly simple like you think that you're passing the coin from one hand to the other, but you're not. It's just so, so, so simple. And I would watch how something that I was for sure thinking would not make it through the deception filters of the other person's brain, but it would work every time. And that led me to realize that we have this sort of overconfident view of how our minds make meaning and how they see the world and how cause and effect can be manipulated. And so that was my first entrée into really seeing that the mind works differently than we think. I did a magic show when I was I think in sixth grade for my elementary school.

Tristan Harris: And then, I studied at this lab later, jumping way ahead at Stanford called the Persuasive Technology Lab that is somewhat famous now maybe. Contrary to popular belief, it's not a lab where they diabolically train you in how to manipulate human nature. It's just a lab that studies everything that we know about the psychology of persuasion. It's like taking an advertisement class or taking a rhetoric class. I mean those are the conscious levels of language persuasion, but then really, you go up and down the stack, and you get clicker training for dogs, Pavlovian rewards, you get ...

Tristan Harris: So the founders of Instagram and I were both in that class, Mike Greger. There, I learned a lot more about the social psychological persuasion, that if you lined up all the tools of your persuasive weapons or arms, there would be nudging and subtle color changes and things like that one side, which are the very weak persuasive tools most like behavior change, nudging. But if you go deeper, you get social persuasion.

Tristan Harris: That's where social media gets really dangerous is that it taps into our social meaning-making and social validation and approval, social reciprocity. That's like email where I feel like I have to get back to all these people or I got tagged in a request, I have to answer that request. I think that that person invited me on LinkedIn, I have to get back to them. These are really persuasive in a whole new order of magnitude. It's like going from non-industrialized weapons to industrialized weapons.

Tristan Harris: So that was my foundation for thinking about how technology is working. It's persuading us at a deep level that's often invisible. In the same way that only magicians notice what other magicians are doing, seeing the world through a lens of persuasion is a very different way of looking at technology.

Aryel Cianflone: Partially, I feel like listening to your description of our kind of inherent human weaknesses, I feel like those are on the other side as well with technologists where we often overestimate individuals' behaviors or individuals' capabilities to kind of make a choice for themselves, even though we already have kind of stacked the deck against them.

Tristan Harris: Right. Yeah, it goes on all sides. I mean the fundamental thing that we're trying to do is say: human nature doesn't work in the way that we think it does. It's not that we're weak or that we're ... It's like weak little race or something like that. It's not that. It's just that, and this was in the beginning of our big April 23rd event that we held in San Francisco, and the focus of all of our work is this E.O. Wilson quote who's the father of sociobiology: that the fundamental problem of humanity, that we as a species have to figure out, is that we have paleolithic emotions, we have medieval institutions and we have God-like technology. So we have ancient paleolithic brains, we have 18th century governance, and we have nuclear weapons, Facebook, and narrative warfare.

Tristan Harris: So this ... And they operate at different clock rates. That's the important part about this quote is that your brain is fixed. It's like running Windows 95 and never getting an upgrade, right? So it happens to work at its base level in a certain way. And then the medieval institutions get an update every four years when you get some new people elected or something. But then our techy is ramping up at an exponential rate. And so whatever issue we're worried about now with social media are small compared to the speed and acceleration of things that are coming.

Tristan Harris: So when you zoom out and say, "Okay, this is really the problem we have to solve," which is our paleolithic brains, which were built for gathering berries and being with tribes in the savanna or whatever, are not built for climate change. Imagine, we used to make this joke, Aza and I, that looking at climate change as a human being is almost like a deer looking in the headlights. It's just too big. It's too big for our brains. And when you look at as technology designer and say, "Okay, with this design choice, I'm going to affect two billion people." Show me the part of your brain that was evolved to give you the capacity to imagine what would happen to two billion different people with 200 different languages and different cultures if I make the newsfeed work this way or that way. We just don't have that function in our brain.

Tristan Harris: And so ultimately, this is about recognizing how to realign technology with our own minds and limits. And the good news is that we're the only species that even has the capacity to study and understand our own limits — the ways our minds get deceived. But it's not really just about deception. When I say it this way, people often think — and when the first TED talk came out, the first title they gave it was "the manipulative tricks that companies play on your brain" — which makes it sound like it's this kind of lightweight coin trick, that maybe LinkedIn or Facebook or Snapchat are just fooling you here and there. And that is just so far from the truth.

Tristan Harris: It's really more like a civilizational mental-influence machine, which might sound too aggressive until we go into the details of why that's actually true. But think of it this way: two billion people wake up in the morning, and from the moment you wake up and your phone's buzzing at you from your alarm and you have to turn it off, to the 80 times during the day you check your phone, to the moment you set your alarm at night and then still end up checking social media anyway before you go to bed — because we all do it — we have people completely jacked in.

Tristan Harris: I mean more people are jacked into Facebook than the size of Christianity. Facebook has 2.3 billion users. That's more than the size of Christianity, just as a comparable psychological footprint. YouTube has more than two billion. And when we say it's more than the size of Christianity, more than the size of Islam, we're saying nothing about the content of those religions. But people don't even have an empathy for what that really entails. Everything from "I'm late to my morning meetings" — that thought doesn't just show up in your mind. It shows up because email and calendaring together make up the psychological construct that gets pulled over your eyes like the Matrix.

Tristan Harris: Once you see it that way, you have a very different view of what has to be done, which is to say: in 2019, with impending threats of climate change and inequality and other things, what ought the sense-making and choice-making layer that we pull over our eyes to be, and what is our obligation as designers in building it?

Aryel Cianflone: I feel like there's this question for me of what is the individual's responsibility and what is the responsibility of the individuals that make up these organizations, because these same biases exist in both. Is it the individual's responsibility to turn off notifications, or is that so small it's just a drop of water in the ocean of distractions we have today? I wonder how you think about the responsibility on either side.

Tristan Harris: Yeah, well, once we understood behavioral science and behavioral economics, we started realizing that people don't just choose their way through life — that would take a lot of energy and research and being informed and all that. Instead, most people operate by the default setting, so they don't even know that there's a different option, right? That's true at a very deep level, even a spiritual level. You just sort of wake up in your identity instead of saying, "Oh, what would be a different item I could choose from the shelf space of my mental identity to try on today?"

Tristan Harris: But it's just true at a very deep level. The simplest example of this came from driver's license studies. If you give people the option to become an organ donor, and you look at the chart across, I think, about 100 countries, in the majority of countries people do not become organ donors, and there's a small number of countries, mostly in Scandinavia I think, where they do. And you ask, what's the difference? Are they just more generous? Are they better? Are they a more charitable, other-centered, empathic, compassionate country with a different culture? Or is it actually just that those are the countries where, on the driver's license registration form, the default setting is that you give up your organs in case you're in a car accident.

Tristan Harris: This just shows how much the world is really run by defaults. And so when it comes to technology, most people don't really question the technology that they're given. Their phone just shows up in their hand, and they hand it to their kid, and they hand YouTube to their kids, and they don't know how or why it's designed the way it is — I mean surely, it's not designed by evil people. But they're not going to change the default settings. And so one of the simplest standards that we can apply to technology is: what is the default setting that I would happily design and give to my own children to use?

Tristan Harris: One of the famous lines that we use is: the CEO of Lunchables didn't give his own children Lunchables, right? It's a billion-dollar-a-year food product line for kids, and he didn't let his own kids eat it. So it's a very simple moral standard to say: what would we want our own children to use frequently? Designing products to that standard would eliminate half of the problems we probably see in technology today.

Aryel Cianflone: And what do you think are the challenges that are preventing us from doing this now? Why haven't we just naturally made that the default?

Tristan Harris: Well, I think the real question you're asking is: why aren't we just designing what's best for people?

Aryel Cianflone: Yeah.

Tristan Harris: Why aren't we even thinking that way on a daily basis?

Tristan Harris: So the first thing is the advertising business model. When I say advertising business model, it's probably better to reframe it as the engagement business model, because it's not the rectangle of the ad that's the problem. It's the incentive to say, "I cannot allow your brain to be free from its relationship with me. I, like a drug dealer, must create an obligatory loyalty relationship where I have to crawl down your brainstem and create that loyalty so that you have to come back every day."

Tristan Harris: Now critics of our work would say, "Oh, come on. Aren't there times when people consciously want to come back to something every day?" In my worldview, everyone is manipulated all the time by everything. The point is that people don't realize the extent to which this manipulation exists and how long it's existed. In the attention economy, it used to be that we had to get your attention, so everyone's competing with nudges and notifications and infinite scroll — light tricks to keep you there. But that wasn't enough. It's much cheaper if, instead of trying to get your attention, I can get you addicted to getting attention from other people.

Tristan Harris: And so that means I add, for example on Instagram, the number of followers you have. Now you have a reason to come back every day to ask, "Well, did I get some more followers?" More importantly, it creates the social dynamic where people are following each other all the time, there are always new followers, you're always curious, and you're always following other people. And so suddenly, you create a whole culture of narcissism where everyone is addicted to being an influencer and having attention from other people.

Tristan Harris: That's what this race for attention is about. That's why the engagement or advertising economy is so problematic. It's not because of evil designers wanting to diabolically manipulate your brainstem. It's not that at all. The banality, I think, makes it even more sinister: it's good people who are caught in a game-theoretic race to the bottom. We call it the race to the bottom of the brainstem — to light up more and more parts of your nervous system, because if I light up more parts of your nervous system than the other guy, I'm going to get more of your attention.

Tristan Harris: The problem is that this creates a connected system of harms that we have to recognize as one whole system, like climate change. Imagine a world where climate change is happening and people just don't see it. They only see polar bears. They're like, "Oh, my God. We lost all these bears. We lost another polar bear." That's what I see when people talk about screen time. Talking about all of these issues of the attention economy in terms of hours on screen is like talking about the number of polar bears with climate change, instead of talking about a billion climate refugees, permafrost melting, methane bombs — a whole bunch of dark stuff that's really a serious risk.

Tristan Harris: So with technology, those interconnected harms are shortened attention spans — downgrading our attention spans, downgrading civility — because outrage works better. Why does outrage work better, and why is polarization happening? Because in the attention economy, short bursts use up your attention, so quick, brief bits are going to be better at getting your attention than asking, "Do you want to sign up for this next two-hour-long chunk?" That's harder. So that means we're in a race to the shorter, briefer thing, which is why Twitter won that race to the bottom in terms of brevity.

Tristan Harris: But the problem is the world is getting increasingly complex. So to talk about anything that matters at any level of richness or productivity or constructiveness would take a long chunk of discussion to get to that complexity. But instead, you have the presidential debates where you say, "Iran, North Korea, and nuclear weapons. What is your answer? 30 seconds," right?

Tristan Harris: And what that intrinsically creates is polarization, because if I can only say simple things about complex topics, I'm only going to get some small percentage of people agreeing with the simple thing I said, because it won't map to the full complexity of the underlying territory. And so there's this whole interconnected system of harms that we call human downgrading. But just think of it like social climate change. It's an interconnected system: shorter attention spans equal more polarization, more outrage, more fear, more isolation — because it's better for the attention economy to have you by yourself on a screen, addicted, spending more time with your esophagus compressed at 45 degrees, not breathing — and more isolation means you're more vulnerable to conspiracy theories. Conspiracy theories work better in the attention economy anyway, but now that you're isolated, they work even better. Saying the media is lying — which YouTube, by the way, does, not intentionally, but it discovers there's this pattern that saying the media is lying is actually really good for YouTube.

Tristan Harris: Think about the perfectly omniscient, brilliant AI of YouTube. Imagine some future 10 years down the road. It doesn't know why the phrase "the media is lying" is good for watch time, but if you say over and over again that the media is lying, intrinsically, that means people don't go to regular media channels, and they're more likely to spend more time on YouTube. So if you zoom way out, distrust in institutions is actually also good for these AIs that are calculating what's good at keeping our attention — not what's good for us. Critical distinction.

Tristan Harris: So zooming out, what's going on here is that the problem really emerges from a race to capture human attention. And because there's only so much — the joke is that it takes nine months to plug a new human being in and grow the size of the attention economy — you have to get more aggressive. You have to frack for attention, so it's better off having you split your attention into four different streams: now you're partially paying attention to your tablet, your TV, your email and your Facebook at the same time. Now I'm selling slimmer slices of your attention to more advertisers, so I can quadruple the size of the attention economy. But it's kind of like the subprime market, where I'm selling fake clicks, fake users and fake attention to advertisers, and this just isn't good for anybody.

Tristan Harris: And the attention economy is beneath all other economies. It's beneath all cultures, it's beneath the regular economy, because where we spend our attention is what makes up our politics, our elections, our mental health. Even when you're not looking at the screen, your attention span has still been affected by the attention economy. I don't know about you, but most people I know can't even get through books anymore, because we can barely focus for long periods of time.

Tristan Harris: So this is a huge problem — again, like social climate change, it's the climate change of culture. The bad news is that, like climate change, it can be catastrophic. The good news, and this is why we're here, is that only about 1,000 people in Silicon Valley have to change what they're doing to prevent all this from happening.

Aryel Cianflone: Yeah, and I do feel like obviously, as you go through this, it's overwhelming, it's so problematic, it's scary. It's all of these things, but-

Tristan Harris: And notice that reaction, too, right? So there we are — our brains, on the savanna 20,000 years ago, were they evolved to hear the sentences that I just-

Aryel Cianflone: Yeah, the negativity bias that we have.

Tristan Harris: Yeah. Right. Well [inaudible 00:18:56] negativity-wise, but also, we just laid out a whole lot of complexity. Were our brains evolved to see huge amounts of complexity and say, "Yeah, let's go do something about that"? Or were our brains evolved to say, "There's a whole bunch of complexity that's negative. I'm going to shy away, put my head in the sand, and go back to watching Netflix, because that was way too scary"?

Aryel Cianflone: Yeah, take the ostrich approach.

Tristan Harris: Yeah. And recognizing that, though — we're the only species that could notice that that's what happens in the face of such complexity and negativity, and then ask: what tends to be the kind of thing that makes people feel solidarity, that converts that learned helplessness into agency? That's what we need to get really good at.

Aryel Cianflone: So my next question: so often when I hear you speaking, you really focus on designers and the design process. As someone in design research, I think researchers have got to be such an important part of this, because we're the ones who bring humans into this whole equation and expose these product teams and these different types of organizations to them. But yeah, what do we do?

Tristan Harris: Well, notice — because of the business model — I mean, even my story, right? I guess the other chapter of my life is that I landed at Google as a design ethicist, actually through them acquiring a company that I was a part of called Apture. I landed on the Gmail team as a product manager. And even there, Gmail's business model is not "Let's get people hooked on email so they can't stop checking, and maximize screen time." They don't do that at all, right? But there are some innocuous metrics that say, "Well, we do care about how engaged people are, and we certainly want more Gmail users." And the main reason for that is this: how much more money do you think Google makes off of a signed-in user versus a non-signed-in user?

Aryel Cianflone: On Gmail?

Tristan Harris: Sorry. On search. Google search.

Aryel Cianflone: Oh, I've never thought about it. Thousands?

Tristan Harris: So the point is that Google makes more money off a personalized search than a non-personalized search. And guess what the number one reason you're signed into Google for a personalized search might be?

Aryel Cianflone: Because of your Gmail.

Tristan Harris: Because of Gmail. So that sets up a business reason, a business rationale, for saying, "We need you to be jacked in." Now it doesn't mean — there are, again, no addiction-minded designers at Gmail trying to get people to do this. But let's say, per your question, there we are. We're UX researchers on the Gmail team. We're like, "Okay. Let's minimize how much time people spend on this thing. Let's give people the most peace of mind. Let's make transparent within an organization people's level of email overload compared to usual, so that when you type in someone's name and it auto-completes the pill of their contact into the address field, it shows a little color saying how overloaded they are and what their average response time is." That would cool off some of the intensity of expecting immediate responses and all that stuff.
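A minimal sketch of the kind of overload indicator Tristan is imagining here — the field names, thresholds, and color mapping below are illustrative assumptions, not anything Gmail actually exposes:

```python
# Hypothetical sketch: map a person's current email overload to a color
# that could be shown next to their auto-completed contact "pill".
# All fields and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class InboxStats:
    unread_count: int            # messages currently awaiting a reply
    typical_unread_count: int    # that person's usual backlog
    avg_response_hours: float    # their recent average response time

def overload_color(stats: InboxStats) -> str:
    """Return a traffic-light color describing how overloaded someone is
    relative to their own baseline, so senders can calibrate expectations."""
    ratio = stats.unread_count / max(stats.typical_unread_count, 1)
    if ratio < 1.5 and stats.avg_response_hours <= 24:
        return "green"    # roughly normal load, replies within a day
    if ratio < 3 or stats.avg_response_hours <= 72:
        return "yellow"   # busier than usual, expect slower replies
    return "red"          # heavily overloaded, don't expect a quick answer

# Example: someone with triple their usual backlog shows as "red".
print(overload_color(InboxStats(unread_count=300, typical_unread_count=80,
                                avg_response_hours=96)))
```

The design choice worth noticing is that the signal is relative to each person's own baseline rather than an absolute unread count, which is what would let it "cool off" expectations without shaming people who simply get a lot of email.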

Tristan Harris: Well, there are a bunch of things they could do. But ultimately, I think, especially if you're a designer of one of these social media engagement products, you don't really get that choice to minimize time spent. I mean, imagine: Facebook could just be about helping you spend time with your friends. It could just be that. That's it. It could help people who want to go on dates find salsa classes. It could help people who want to be at dinner tables with rich conversations be at the next dinner table. It could help people organize climate resilience, urban farms and gardens in their cities. It could help people do all these things — totally empowering, strengthening the social fabric outside the screen. But what's the reason why Facebook doesn't do that?

Tristan Harris: Their profits come from ...

Aryel Cianflone: Advertisements and time spent on, yeah.

Tristan Harris: Advertisements — which means keeping you on the screen. So this is where it's so invisible. If you talk to the Facebook policy team or Nick Clegg or Zuckerberg, they'll say, "Well, we tested it without the ads, but people like the ads." The point is it's not the ads. It's that the incentive to keep you on the screen at all is what is exacerbating this whole problem.

Tristan Harris: Now, it's important we also say that even if you're a designer or a UX researcher at Netflix, your business model is not keeping people on the screen — it's the subscription. You just have to pay that $8 a month. But they still maximize for watch time, because that tends to be correlated with whether or not you'll cancel. Overall, we just don't want a system where each company and product is maximizing for engagement. Maximizing in general is a bad optimization.

Tristan Harris: And the thing that could give people hope here is that companies like Apple, who make iOS, and specifically the Android team at Google, or Siempo, the alternative Android launcher, and other alternative launchers for Android, are in a position to redesign the incentives of the attention economy. Imagine they just kick everyone who's trying to maximize screen time out of their app store, saying that's just like a fossil fuel company: we don't want those in our attention economy. Those are the extractive attention economy, and we want to include only the ones whose business models are helping people get to the places that make life worth living.

Tristan Harris: Now that sounds normative or judgmental on my part, but only until you unravel all of the incentives and show how much every single app's design is basically self-dealing and extractive for its own interest. A quick way to get through this is just to ask people, for say LinkedIn or Twitter or Facebook, what are the most lasting, "I would be proud of that on my deathbed" sort of choices that they tend to enable, right? Maybe with LinkedIn, it's "I found that job that I was really looking to get," and with Twitter, it's "I was in the same city during that conference, and I ended up meeting up for drinks with my intellectual hero." That's happened to me once, right? Or with Facebook, who knows — someone introduces you to someone for a date or something like that. And these things are great. And it would be great if the products were designed just for wrapping around and empowering and strengthening those experiences, but that's just not what they're designed for.

Aryel Cianflone: Even when you are in the field, you can't be with your participants 24/7. But there's one thing that can be: their smartphones. Dscout is a remote research platform leveraging just that, which saves you from missing the moments that matter. Set up a diary study and see your participants' daily lives in context. Use Dscout Live and conduct interviews on a platform actually built for research. Bring your own participants on board or handpick from their 100,000-person scout pool. To start connecting with more people more impactfully, head to Dscout.com/mm.

Aryel Cianflone: Well, I think my question, again, is how do we get to that point? Because sitting here and listening to this, I feel like I'm seeing this amazing future that could be — there's so much potential and so much promise in these technologies — but it also feels a little bit upside down from where we're at today, right? It's hard to imagine how we get there.

Tristan Harris: And how does that feel to notice that?

Aryel Cianflone: How does it feel? It feels uncomfortable, a little bit. It feels sad. I wish that we could be better, but humans are so prone to short-term thinking, right? And all the humans that make up these organizations are the same way. They all have profit targets to hit. They all have stakeholders. They all have OKRs and different things they want to hit. How do you get people to so fundamentally change?

Tristan Harris: Well, you said something interesting there, which is that humans have short-term thinking. It's true. We, in our own nervous system, are optimized for short-term, immediate rewards. But what you really said is that the incentives are for that, and when you have publicly-traded companies especially, the pressures for short-term growth et cetera make it impossible to make the kinds of deep structural changes we're talking about. I feel uncomfortable raising this conversation sometimes, because it is a possible world. There's nothing that is technically infeasible about what we're describing. And by the way, the more you lean into it, it's kind of amazing. It's a world where you can trust-fall into technology, and its sole purpose and design is just to be-

Aryel Cianflone: Is to help you.

Tristan Harris: ... helping people. They're literally like bumping each other's elbows out of the way, being like, "No, no, no. I have an even better idea about how to help Aryel." I want people to really imagine what that would feel like, because that's at least the north star I think we could be aiming for.

Tristan Harris: Now, between now and then, a lot of things have to happen, and the uncomfortable thing is that we can't just snap our fingers and switch to doing that other thing. But corporate boards could be pressured by their shareholders — which happens — to say, "We have to get off this business model because we see it as a long-term source of investor and shareholder risk." Which it is, by the way, because all these companies, if they're incentivized by attention and data, that creates the long-term risk of privacy scandals and of people becoming aware that these companies' interests are not aligned with ours — the public perception issues. If that starts to spread, as it's already doing with Facebook and Cambridge Analytica and all these kinds of things that are tying themselves into knots, it's only a matter of time before any company that's in that business model is going to have a problem.

Tristan Harris: YouTube, maximizing watch time, is going to have a problem. Twitter, Facebook, Snapchat, et cetera. So you could imagine a world where, through corporate board resolutions, through shareholder activism, through policymaking — Paul Romer, the Nobel Prize-winning economist, recently proposed what I think he called a progressive advertising tax, where you basically tax companies for having the advertising-based business model. The whole point is, just like fossil fuels transitioning to regenerative or renewable energy, you want to increase the cost of the thing that's causing the harm and lower the cost of the thing that we all want to live in — the thing that will not destroy civilization.

Tristan Harris: That is the game that has to get played. And I think the thing that people on the inside of these companies need to understand is that you all have a voice and a role in that. I mean, I was inside of Google, and I didn't know how these things would ever change. I saw some of these things, and I made this famous presentation — sort of an ethics [inaudible 00:29:44] that led to me becoming a design ethicist. It basically was a call to arms to the whole company saying we have a moral responsibility in managing a billion people's attention, and especially this race to the bottom for attention.

Tristan Harris: I was super nervous in making it, and then about releasing it. I worked on it for about a month. It was like a 136-page slide deck without any bullet points or anything, just big images. It's actually available now on minimizedistraction.com — someone found it and posted it there, so you can see it. It was an artifact that I sent around to the whole company. Actually, that's not true. I sent it to just 10 people, and then it ended up spreading virally.

Tristan Harris: I really want to say I was incredibly nervous about what would happen, and I was nervous about the quality of my ideas, because I thought I could be wrong — surely, if this was true, then-

Aryel Cianflone: Other people would've brought it up.

Tristan Harris: ... other people would've brought it up. This is when you realize there are no adults in the room. The obvious thing here is that it's a kind of limbic capitalism, where the paperclip maximizer is pointed at people's brainstems, and no one talks about it because it goes against the incentives. You know the famous Upton Sinclair quote: you can't get someone to question something when their salary depends on them not seeing it.

Tristan Harris: Anyway. This thing spread, and lo and behold, to my surprise, it spread to something like 10,000 people at Google. It took off on an internal thing called Memegen, which is the culture tracker — people post memes and vote them up. It was the number one most-voted meme, saying we want to talk about this at TGIF.

Tristan Harris: It led to me becoming a design ethicist, and then it led to me trying to — I was hosting lunches and meetups and conversations at the company and organizing round table discussions. Honestly, it didn't go further than that, because the business model was still entrenched. It wasn't that anybody said, "Hey, we still need to make money. We can't do that." It was just, that's not a priority. Sure, we can make people use Android a little bit less or check their phone a little bit less, but it's just not our priority.

Tristan Harris: And I had to leave to create that outside pressure for what we now see is an unprecedented level of change — and not just because of our work. There have been so many people out there: Roger McNamee, Sandy Parakilas, and [Zeynep 00:32:00] and all these great organizations and people who've been raising the alarms. But it's led to Apple, Google and Facebook now adopting, in our case, Time Well Spent. The reason everybody has the chart or graph of where their time goes on their phone is because of this pressure.

Tristan Harris: The point isn't that it just happens from the outside. It's that it creates a conversation. So you show up on Monday at a product design meeting at Twitter or Facebook, and people say, "Hey, have you heard of the attention economy and the race to the bottom of the brainstem, and what are we doing about this? What about Time Well Spent? What about this TED talk?"

Tristan Harris: I think that if everybody can raise this conversation and you hear it three times in one day, it becomes pressure. Because if you think about it, where does pressure exist? I mean, memetically, where's the pressure for Time Well Spent? Where are the atoms? Does a big pickup truck show up and start pushing on the walls and putting pressure on the buildings?

Tristan Harris: The pressure exists through people repeating something and feeling like it's really hard to go to work if we're not doing this other thing that I know is possible. It turns out to be a lot more expensive for companies to replace demoralized employees than it is to start doing more of the right thing, because people need to feel incentivized to do something different. I mean, they started launching these meaningful social interaction metrics at Facebook, which are basically the Time Well Spent metrics: how are we measuring something other than just engagement?

Tristan Harris: And if there's enough pressure — everybody who's listening to this podcast can actually raise this conversation themselves. And the reason we focus so much on language is that we have to have language that describes this problem statement, this interconnected set of harms that we call human downgrading — you can call it the social climate change of technology — so that we're not just solving addiction. We're not just solving screen time, which is just not the right way to think about it. We don't want to downgrade our civilization's capacity at the time we most need that capacity.

Aryel Cianflone: Well, it sounds like you have seen some large successes in terms of Apple and Google and Facebook implementing these Time Well Spent features. I wonder if you've heard stories from individual designers who have heard your message and have gone out and done something different with their product teams. I'm really thinking of the UX researcher who listens to this and then goes into work and wants to do something about it. What is the best way to start that conversation or find success in this?

Tristan Harris: I think, to answer your question, people get blinded by wanting to do the absolute good, when they could just ask: what's the smallest step I could take in this direction tomorrow? What's a small step? What's the smallest way that the product can be nudged and designed towards these outcomes? What's the metric that I can introduce? What's the way I could have this conversation? What's the set of design principles that might eliminate the problem?

Tristan Harris: I don't have all the answers. I mean, we really want to encourage all of the people listening to this to think for themselves: in what way would you nudge the company you're working for in this direction?

Tristan Harris: But do so forcefully. I think this change can happen only with 1,000 people together realizing that no one actually wants this. That's the thing — once you see this, nobody's excited about the mass downgrading of attention spans, the mass dumbification, stupification of society. And if that's not enough to motivate you — by the way, not to make it about the dark side, but China will choose not to downgrade their population. So it's a competition between the West and China about which country will downgrade their population the least, given these dynamics.

Tristan Harris: I think that can motivate all of us, much like climate change can, in asking: what are we going to do to upgrade our capacity? That's something I actually regret, by the way — we haven't yet sufficiently introduced the positive frame for the opposite of human downgrading. With Time Well Spent, we saw that people were repeating this positive phrase. The beauty of that, as an example to answer your question, is that we had, let's see, [inaudible 00:36:09] ClassPass and Pinterest and all these companies walking into work and saying, "Hey, guys. We want to be part of Time Well Spent, and we want to ask our engineers and designers: what's the metric that you're going to invent? What should that be? Let's have a conversation about what design features we're going to do differently. What are the things people find most valuable and lasting and time well spent for their lives?" It was a positive message and frame that created a lot of interest and implementation.

Tristan Harris: We don't yet have that frame and phrase for the opposite of human downgrading. We hesitate to call it upgrading humanity, because it sounds techno-utopian, which is the same mess that got us here. We've been talking about upgrading humans and human capacity for a long time, and it got us exactly to this place where things have been going pretty badly. So whatever moral framework we need, it certainly has to cover a blind spot we didn't take into account before.

Aryel Cianflone: Well, Tristan, I want to go back to something we were talking about at the beginning, which is this difference between individual responsibility and the responsibility of individuals within the organization, because I feel like it's such an important point, especially for researchers who are interacting with both of these groups for so much of their time. Thinking about somebody going into a research session with a user, what is the question you can ask that individual to understand, or to help your product team understand and see, that this is time well spent or this isn't time well spent? Just spitballing ideas on that, because I feel like researchers have this amazing opportunity to make these things really apparent to large groups of people within their organization.

Tristan Harris: Yeah. Joe Edelman, who is one of my collaborators — he was the CTO of Couch Surfing — co-invented this phrase Time Well Spent and a lot of the design methodology from his work at Couch Surfing, where he pioneered a bunch of surveys asking people what would make their time spent with someone hosting them on Couch Surfing really meaningful. They actually did a retrospective survey. So after you stayed with someone in Paris for four days, they would ask you, "How was that?"

Tristan Harris: But then they would also do this six-months-later retrospective. They would bring that person back and say, "How do you feel about it now?" They used that long-term, six-month signal to rank search results. So if you typed in Paris, and you were a 20-something-year-old woman from San Francisco, then going by your click patterns alone, you'd probably end up clicking on people who were in their 20s in Paris. That would be your default set of choices.

Tristan Harris: But because Couch Surfing was ranking by what people said in the long run — what they said six months later was the most valuable — they would end up finding these patterns. Joe's example was something like a 50-year-old Iraqi immigrant in Paris who was the person everybody loved staying with. This Iraqi guy was just super jovial and really heartwarming and charming and giving, and loved cooking Iraqi food and taking you out to the local pub in Paris. That's awesome. That's the thing we're trying to figure out here.
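To make the mechanism concrete, here is a minimal sketch of ranking by a delayed, retrospective signal rather than by immediate clicks. The field names, weights, and example data are illustrative assumptions, not Couch Surfing's actual implementation:

```python
# Hypothetical sketch: rank hosts by a long-term retrospective rating
# instead of by immediate click-through. All fields, weights, and data
# below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Host:
    name: str
    click_through_rate: float   # how often searchers click this profile
    immediate_rating: float     # survey right after the stay (0-5)
    six_month_rating: float     # retrospective survey months later (0-5)

def long_term_score(host: Host) -> float:
    # Weight the retrospective answer most heavily: what people still
    # value six months later, not what they clicked on in the moment.
    return 0.7 * host.six_month_rating + 0.3 * host.immediate_rating

hosts = [
    Host("20-something near the Eiffel Tower", 0.30, 4.2, 3.5),
    Host("50-year-old Iraqi immigrant in Paris", 0.08, 4.5, 4.9),
]

# Ranking by clicks would surface the first host; ranking by the
# long-term signal surfaces the host people were still grateful for.
for host in sorted(hosts, key=long_term_score, reverse=True):
    print(f"{long_term_score(host):.2f}  {host.name}")
```

The point of the sketch is the choice of optimization target: the same search page, fed a months-later reflection instead of a click log, surfaces a very different set of "best" options.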

Tristan Harris: I think as engineers, we have to watch out for the tendency to try to organize information into these simplistic buckets, because life is so beautiful and complex. It just isn't reducible to these things. We need to find ways, even in our own lives, of reflecting. Like right now: when you think about the things that have been most transformative or growth-oriented in your life, what metric would have revealed that?

Aryel Cianflone: Oh, my gosh.

Tristan Harris: And yet, it's hard. Actually, Joe and I worked on a project in 2016 — I was in Berlin — and we mapped out the components of transformative growth experiences. I haven't really talked about this, but I was just talking with another friend this last weekend about these transformative experiences. Oftentimes it's things like meditation retreats or Burning Man or psychedelics or the death of a family member or falling in love or having kids. These are the classic growth experiences.

Tristan Harris: But what they have in common is holding up a mirror and letting you see yourself in a new way. If you spend seven or 10 days in silence on a meditation retreat, that's a really big mirror you're holding up to your own psyche. And that's what makes for the growth and transformation.

Tristan Harris: I think technology isn't really showing us menus that reflect those kinds of deep, meaningful choices. It's reinforcing this microcosm, this tiny subset. We say it's like a magician who says, "Pick a card, any card." And then of course, you pick whatever card, and now you've picked a card — it's totally free, totally your choice — but you don't realize that the deck only had a limited set of options to begin with, and as the magician, you stacked the deck.

Tristan Harris: Technology's kind of like that. On a day-to-day basis, the choices it offers are the ones that conveniently fit into an engineering mindset. So Yelp can map the restaurants — there's this mappable set of indexed restaurants. So when you think, "What should I do?" you're shown a limited index of restaurants, as opposed to: I could do cartwheels in the street. I could grab my friend, and we could start singing on a corner and put out our backpack and ask for money. There are a billion different creative, even creation-oriented options, and we're actually training our minds to think more passively about picking existing items from a tech-limited, searchable index menu, instead of asking, what could we create together?

Tristan Harris: If you said to your phone, "Hey, I want to go on a date," it just shows you faces you can swipe through on Tinder. Imagine it said, "Oh, you could go buy face paint from this store right over there and then walk to this bar and set up a face paint booth — people love getting face paint." These totally crazy, creative options show up nowhere on the menu of technology.

Tristan Harris: So part of this is a challenge to get people to ask, in a much bigger ecological sense, for the social fabric, for the things that people find most meaningful in their entire life: how could technology be helping us grow? Because the crisis of meaning is the thing that is actually showing up and having repercussions everywhere. That's why Jordan Peterson is so popular. He's giving people an answer, something to do, that is an answer to the crisis of meaning — even if it's as simple as, for young white men, just make your bed. People love that kind of simplistic patriarchal advice. People need to feel that there's something to live for. As our menus get confined to more and more consumerist, blasé, vanilla options and we're just choosing between things to click on on screens, we're so far away from that other world.

Tristan Harris: And so my biggest fear is that we forget, as human beings, how to recognize this richer, beautiful menu of choices — especially the next generation, who won't have known anything different and will think this is the menu. So all that's to say, I think there's a different way we can do this, but we really have to ask ourselves, hold up this mirror, and say, "What does it mean to be human? And how do we make ourselves not superhuman, but extra human?"

Aryel Cianflone: Thanks for listening today. If you want to continue the conversation, join us in the Slack group. If you aren't already a member, you can request an invite under the community tab on our website, Mixed-Methods.org. Follow us on Medium and Twitter to stay up to date with the latest UX research trends. Special thanks to Denny Fuller, our audio engineer and composer, and Laura Leavitt, our designer. See you next time.
