Technology Ethicist Tristan Harris on The Diary Of A CEO Podcast (Transcript)

Here is the full transcript of former Google design ethicist Tristan Harris’ interview on The Diary Of A CEO Podcast with host Steven Bartlett, on “We Have 2 Years Before Everything Changes. We Need To Start Protesting!”, November 27, 2025.

Ex–Google design ethicist Tristan Harris joins Steven Bartlett to deliver his starkest warning yet: the global race to AGI is being driven by a tiny group of CEOs who privately accept serious extinction risks while publicly selling only abundance and hype. From AI systems that can already find software vulnerabilities and blackmail humans, to the prospect of mass job loss and “runaway” AI research by 2027, Harris explains why he believes the next two years are critical—and why citizens must start pushing back now if they want a humane future.

Who Is Tristan Harris?

STEVEN BARTLETT: Tristan, I think my first question, and maybe the most important question is we’re going to talk about artificial intelligence and technology broadly today, but who are you in relation to this subject matter?

TRISTAN HARRIS: So I did a program at Stanford called the Mayfield Fellows Program that took engineering students and taught them entrepreneurship. I, as a computer scientist, didn’t know anything about entrepreneurship, but they pair you up with venture capitalists, they give you mentorship, and there are a lot of powerful alumni who were part of that program. The co-founder of Asana and the co-founders of Instagram were all part of that program.

And that put us in kind of a cohort of people who were basically ending up at the center of what was going to colonize the whole world’s psychological environment, which was the social media situation.

As part of that, I started my own tech company called Apture. We basically made this tiny widget that would help people find more contextual information without leaving the website they were on. It was a really cool product that was about deepening people’s understanding. And I got into the tech industry because I thought that technology could be a force for good in the world. That’s why I started my company.

And then I kind of realized through that experience that at the end of the day, these news publishers who used our product, they only cared about one thing, which is: is this increasing the amount of time and eyeballs and attention on our website? Because eyeballs meant more revenue.

And I was in sort of this conflict of, I think I’m doing this to help the world, but really I’m measured by this metric of what keeps people’s attention. That’s the only thing that I’m measured by.

The Instagram Story and Perverse Incentives

And I saw that conflict play out among my friends who started Instagram, because they got into it because they wanted people to share little bite-sized moments of your life. Here’s a photo of my bike ride down to the bakery in San Francisco. That’s what Kevin Systrom used to post when he was just starting it. I was probably one of the first hundred users of the app.

And later you see how these sort of simple products that had a simple good, positive intention got sort of sucked into these perverse incentives.

And so Google acquired my company, Apture. I landed there and I joined the Gmail team. And I’m with these engineers who are designing the email interface that people spend hours a day in. And then one day one of the engineers comes over and he says, “Well, why don’t we make it buzz your phone every time you get an email?”

And he just asked the question nonchalantly, like it wasn’t a big deal. And in my experience, I was like, oh my God, you’re about to change billions of people’s psychological experiences with their families, with their friends, at dinner, on date nights, in romantic relationships, where suddenly people’s phones are going to be buzzing, showing notifications of their email, and you’re just asking this as if it’s a throwaway question.

The Slide Deck That Changed Everything

And I became concerned.

STEVEN BARTLETT: I see you have a slide deck there.

TRISTAN HARRIS: I do, yeah. It was about how Google and Apple and the social media companies were hosting this psychological environment that was going to corrupt and fracture the collective attention of humanity.

And I basically said I needed to make a slide deck. It’s a 130-something-page slide deck that was basically a message to the whole company at Google, saying we have to be very careful and we have a moral responsibility in how we shape the global attention of humanity.

STEVEN BARTLETT: The slide deck I have printed off, which my research team found, is called “A Call to Minimize Distraction and Respect Users’ Attention by a Concerned PM and Entrepreneur.” PM meaning Project Manager.

TRISTAN HARRIS: Project Manager, yeah.

STEVEN BARTLETT: How was that received at Google?

TRISTAN HARRIS: I was very nervous actually, because I felt like I wasn’t coming from some place where I wanted to stick it to them or be controversial. I just felt like there was this conversation that wasn’t happening and I sent it to about 50 people that were friends of mine just for feedback.

And when I came to work the next day, I looked at the counter in the top right on Google Slides, which shows you the number of simultaneous viewers, and it had 130-something simultaneous viewers. And later that day it was like 500 simultaneous viewers.

And so obviously it had been spreading virally throughout the whole company. And people from all around the company emailed me saying, “This is a massive problem. I totally agree, we have to do something.”

Becoming Google’s Design Ethicist

And so instead of getting fired, I was invited to stay and became a design ethicist, studying how you design in an ethical way, and how you design for the collective attention spans and information flows of humanity in a way that does not cause all these problems.

Because what was sort of obvious to me then, and that was in 2013, is that if the incentive is to maximize eyeballs and attention and engagement, then you’re incentivizing a more addicted, distracted, lonely, polarized, sexualized society, and a breakdown of shared reality.

Because all of those outcomes are success cases of maximizing for engagement for an individual human on a screen.