Transcript of The AI Revolution Is Underhyped: Eric Schmidt

The following is the full transcript of a conversation between former Google CEO and chairman Eric Schmidt and technologist Bilawal Sidhu at TED2025 on April 11, 2025.

The Moment AI Surpassed Human Creativity

BILAWAL SIDHU: Eric Schmidt, thank you for joining us. Let’s go back. You said the arrival of non-human intelligence is a very big deal. And this photo taken in 2016 feels like one of those quiet moments where the earth shifted beneath us, but not everyone noticed. What did you see back then that the rest of us might have missed?

ERIC SCHMIDT: In 2016, we didn’t yet understand what was going to happen, but we understood that these algorithms were new and powerful. What happened in this particular set of games was that, in roughly the second game, the AI invented a new move that no one had ever seen, in a game that had been around for 2,500 years. Technically, the way this occurred was that the AlphaGo system was essentially organized to always maintain a greater than 50% chance of winning. And so it correctly calculated this move, which was a great mystery among all of the Go players, who are obviously insanely brilliant mathematical and intuitive players.

The question that Henry, Craig Mundie, and I started to discuss is: what does this mean? How is it that our computers could come up with something that humans had never thought about? I mean, this is a game played by billions of people. And that began the process that led to two books. And that, I think, frankly, is the point at which the revolution really started.

Why AI Is Actually Under-hyped

BILAWAL SIDHU: If you fast forward to today, it seems that all anyone can talk about is AI, especially here at TED. But you’ve taken a contrarian stance. You actually think AI is under-hyped. Why is that?

ERIC SCHMIDT: And I’ll tell you why. Most of you think of AI as, I’ll just use the general term, ChatGPT. For most of you, ChatGPT was the moment where you said, oh my God, this thing writes and it makes mistakes, but it’s so brilliantly verbal. Right? That was certainly my reaction. Most people that I knew did that.

BILAWAL SIDHU: It was visceral.

ERIC SCHMIDT: Yeah. This was two years ago. Since then, the gains in what is called reinforcement learning, which is what AlphaGo helped invent and so forth, allow us to do planning. And a good example is to look at OpenAI o3 or DeepSeek R1, and you can see how it goes forward and back, forward and back, forward and back. It’s extraordinary. In my case, I bought a rocket company because it was, like, interesting.

BILAWAL SIDHU: And I know, as one does.

ERIC SCHMIDT: As one does. And it’s an area that I’m not an expert in, and I want to be an expert. So I’m using deep research. And these systems are spending 15 minutes writing these deep papers. That’s true for most of them. Do you have any idea how much computation 15 minutes of these supercomputers is? It’s extraordinary.

So you’re seeing the arrival of a shift: first from language to language, then from language to sequence, which is how biology is done. Now you’re doing essentially planning and strategy. The eventual state of this is computers running all business processes. So you have an agent to do this, an agent to do this, an agent to do this, an agent to do that. And you concatenate them together and they speak language among each other. They typically speak English.

The Resource Challenges of AI

BILAWAL SIDHU: I mean, speaking of just the sheer compute requirements of these systems, let’s talk about scale briefly. You know, I kind of think of these AI systems as hungry hungry hippos. They seemingly soak up all the data and compute that we throw at them. They’ve already digested all the tokens on the public Internet, and it seems we can’t build data centers fast enough. What do you think the real limits are and how do we get ahead of them before they start throttling AI progress?

ERIC SCHMIDT: So there’s a real limit in energy. I’ll give an example. There’s one calculation, and I testified on this in Congress this week, that we need another 90 gigawatts of power in America. My answer, by the way, is think Canada, right? Nice people, full of hydroelectric power. But that’s apparently not the political mood right now. Sorry.

So 90 gigawatts is 90 nuclear power plants in America. Not happening. We’re building zero. How are we going to get all that power? This is a major, major national issue. You can look at the Arab world, which is busy building 5 to 10 gigawatts of data centers. India is considering a 10-gigawatt data center. To understand how big a gigawatt is, think cities per data center. That’s how much power these things need.

And people look at it and they say, well, there are lots of algorithmic improvements and you will need less power. There’s an old rule I’m old enough to remember, right? Grove giveth, Gates taketh away. Okay? The hardware just gets faster and faster. The physicists are amazing, just incredible what they’ve been able to do. And us software people, we just use it and use it and use it.

And when you look at planning, at least in today’s algorithms, it’s back and forth and try this, and just watch it yourself. There are estimates, and you know this from the Horowitz reports, it’s been well studied, that there’s an increase of at least a factor of 100, maybe a factor of 1,000, in computation required just to do this kind of planning. The technology goes from essentially deep learning to reinforcement learning, to something called test-time compute, where not only are you doing planning, but you’re also learning while you’re doing planning. That is, if you will, the zenith, or what have you, of computation needs.

That’s problem number one: electricity and hardware.