Transcript of Can AI Match the Human Brain? – Surya Ganguli

Read the full transcript of neuroscientist and Stanford professor Surya Ganguli's talk titled "Can AI Match the Human Brain?" at TEDAI San Francisco on October 22, 2024.

TRANSCRIPT:

Understanding the Gap Between AI and Human Intelligence

SURYA GANGULI: So what the heck happened in the field of AI in the last decade? It’s like a strange new type of intelligence appeared on our planet, but it’s not like human intelligence. It has remarkable capabilities, but it also makes egregious errors that we never make. And it doesn’t yet do the deep logical reasoning that we can do. It has a very mysterious surface of both capabilities and fragilities, and we understand almost nothing about how it works.

I would like a deeper scientific understanding of intelligence. But to understand AI, it’s useful to place it in the historical context of biological intelligence.

The story of human intelligence might as well have started with this little critter. It’s the last common ancestor of all vertebrates. We are all descended from it. It lived about 500 million years ago. Then evolution went on to build the brain, which in turn, in the space of 500 years, from Newton to Einstein, developed the deep math and physics required to understand the universe from quarks to cosmology. And it did this all without consulting ChatGPT.

And then, of course, there’s the advances of the last decade. To really understand what just happened in AI, we need to combine physics, math, neuroscience, psychology, computer science, and more to develop a new science of intelligence. This science of intelligence can simultaneously help us understand biological intelligence and create better artificial intelligence. And we need the science now, because the engineering of intelligence has vastly outstripped our ability to understand it.

I want to take you on a tour of our work in the science of intelligence that addresses five critical areas in which AI can improve:

  • Data efficiency
  • Energy efficiency
  • Going beyond evolution
  • Explainability
  • Melding minds and machines

Let’s address these critical gaps one by one.

Data Efficiency: AI’s Massive Appetite

AI is vastly more data-hungry than humans. For example, we now train our language models on the order of one trillion words. Well, how many words do we get? Just 100 million. It's that tiny little red dot at the center. You might not be able to see it. It would take us 24,000 years to read the rest of the one trillion words.
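As a sanity check on that 24,000-year figure, here is a back-of-the-envelope calculation. The reading rate and hours per day below are my own illustrative assumptions, not numbers from the talk:

```python
# Rough check of how long it would take a human to read a trillion words.
TRILLION_WORDS = 1_000_000_000_000
WORDS_PER_MIN = 240   # brisk adult reading speed (assumption)
HOURS_PER_DAY = 8     # reading as a full-time job (assumption)

words_per_year = WORDS_PER_MIN * 60 * HOURS_PER_DAY * 365
years = TRILLION_WORDS / words_per_year
print(f"{years:,.0f} years")  # on the order of tens of thousands of years
```

With these assumptions the result lands right around the 24,000 years quoted in the talk.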

Okay, now you might say that’s unfair. Sure, AI read for 24,000 human equivalent years, but humans got 500 million years of vertebrate brain evolution. But there’s a catch. Your entire legacy of evolution is given to you through your DNA, and your DNA is only about 700 megabytes, or equivalently 600 million words. So the combined information we get from learning and evolution is minuscule compared to what AI gets. You are all incredibly efficient learning machines.
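The conversion from 700 megabytes of DNA to roughly 600 million word-equivalents can be checked with a quick calculation. The bits-per-word figure below is an assumed information-content estimate for text, not a number given in the talk:

```python
# Converting the genome's ~700 MB into "word equivalents".
GENOME_BYTES = 700_000_000
BITS_PER_WORD = 9   # assumed information content of one word of text

genome_bits = GENOME_BYTES * 8
word_equivalents = genome_bits / BITS_PER_WORD
print(f"{word_equivalents / 1e6:,.0f} million words")
```

Under that assumption the genome comes out at roughly 600 million words, consistent with the talk's figure.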

So how do we bridge the gap between AI and humans?

We started to tackle this problem by revisiting the famous scaling laws. Here's an example of a scaling law, where error falls off as a power law with the amount of training data. These scaling laws have captured the imagination of industry and motivated significant societal investments in energy, compute, and data collection.

But there’s a problem. The exponents of these scaling laws are small. So to reduce the error by a little bit, you might need to 10x your amount of training data. This is unsustainable in the long run, and even if it leads to improvements in the short run, there must be a better way.
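To see why small exponents bite, here is a quick calculation with an illustrative exponent of 0.1; real scaling exponents vary by model and task:

```python
# If test error scales as D^(-alpha) in dataset size D,
# then multiplying the data by 10 multiplies error by 10^(-alpha).
alpha = 0.1  # illustrative exponent (assumption)

error_ratio = 10 ** (-alpha)
print(f"10x data -> error x {error_ratio:.2f}")  # ~0.79, only ~21% better

# Data needed to HALVE the error: solve m^(-alpha) = 1/2 for m.
data_multiplier = 2 ** (1 / alpha)
print(f"halving error needs {data_multiplier:,.0f}x more data")
```

With an exponent of 0.1, halving the error requires about a thousand times more data, which is why these curves are so costly to ride.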

We developed a theory that explains why these scaling laws are so bad. The basic idea is that large random data sets are incredibly redundant. If you already have billions of data points, the next data point doesn’t tell you much that’s new. But what if you could create a non-redundant data set where each data point is chosen carefully to tell you something new compared to all the other data points?

We developed theory and algorithms to do just this. We theoretically predicted and experimentally verified that we could bend these bad power laws down to much better exponentials, where adding just a few more carefully chosen data points could cut your error, rather than having to 10x the amount of data.
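The non-redundancy idea can be sketched with a generic greedy rule: repeatedly keep the candidate point that is farthest from everything chosen so far, so each selected point tells you something new. This is a simple stand-in for illustration, not the actual pruning metric from the research, which relies on learned, task-specific measures of redundancy:

```python
import random

def farthest_point_subset(points, k):
    """Greedily pick k points, each maximally far from those already chosen.
    An illustrative stand-in for principled data pruning."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    chosen = [points[0]]
    # Track each candidate's distance to its nearest chosen point.
    d = [dist2(p, chosen[0]) for p in points]
    for _ in range(k - 1):
        i = max(range(len(points)), key=lambda j: d[j])
        chosen.append(points[i])
        for j, p in enumerate(points):
            d[j] = min(d[j], dist2(p, points[i]))
    return chosen

random.seed(0)
data = [(random.random(), random.random()) for _ in range(1000)]
subset = farthest_point_subset(data, 10)
print(len(subset))  # 10 well-spread points instead of 1000 redundant ones
```

The point of the sketch is the selection principle: every new point is chosen to be maximally informative relative to what you already have, which is exactly what a large random dataset fails to do.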

So what theory did we use to get this result? We used ideas from statistical physics, and these are the equations. Now, for the rest of this entire talk, I'm going to go through these equations one by one and explain them to you. You think I'm joking?

Okay, you’re right. I’m joking. I’m not that mean, but you should have seen the faces of the TED organizers when I said I was going to do that.

Reimagining Machine Learning

Let’s zoom out a little bit and think more generally about what it takes to make AI less data-hungry. Imagine if we trained our kids the same way we pre-train our large language models, by next-word prediction. So I’d give my kid a random chunk of the internet and say, by the way, this is the next word. I’d give them another random chunk of the internet and say, yeah, this is the next word. If that’s all we did, it would take our kids 24,000 years to learn anything useful.

But we do so much more than that. For example, when I teach my son math, I teach him the algorithm required to solve the problem. Then he can immediately solve new problems and generalize using far less training data than any AI system would do. I don’t just throw millions of math problems at him.

To really make AI more data-efficient, we have to go far beyond our current training algorithms and turn machine learning into a new science of machine teaching. And neuroscience, psychology, and math can really help here.

Energy Efficiency: The 20-Watt Brain vs. AI

Let’s go on to the next big gap, energy efficiency. Our brains are incredibly efficient. We only consume 20 watts of power. For reference, our old light bulbs were 100 watts.