
TRANSCRIPT: Chip War 2.0: The Global Battle for Semiconductor Supremacy: Chris Miller

Read the full transcript of author Chris Miller’s lecture titled “Chip War 2.0: The Global Battle for Semiconductor Supremacy” at World Knowledge Forum 2024.

TRANSCRIPT:

CHRIS MILLER: Well, welcome everyone. What I’d like to do over the next 40 minutes is explain to you why it is that semiconductors today are at the core, not just of advances in technology, which we’re used to, but also at the center of global debate in economics and in international politics. And to argue that you can’t understand the world around you without putting analysis of semiconductors at the center.

Behind all artificial intelligence systems today is computing power, more and more compute used to train the most successful AI systems. And this is a trend that has lasted not just for years, but for decades. Most of the advances in artificial intelligence have stemmed from using more data to train artificial intelligence systems, which in turn requires more computing power to use in the training of that data.

This chart shows you the most advanced AI systems of their day over the last 70 years. And what you find is that there’s been a consistent, extraordinarily rapid rate of increase in the amount of data used for training AI systems.

The amount of data used for AI training doubles on a regular basis. And what’s more, over the past 15 years, the rate of increase has only accelerated. This is what AI researchers, people at AI labs like OpenAI or Anthropic, refer to as the scaling laws. The intuition is that if you want a better AI system, you need a bigger AI system, a system trained on larger and larger quantities of data, which is why the AI systems that today undergird chatbots like ChatGPT are trained on almost all of the text that exists on the internet.
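The compounding growth described above can be made concrete with a short sketch. The doubling time used here (roughly six months for frontier training compute) is an illustrative assumption, not a figure stated in the lecture:

```python
# A minimal sketch of the exponential growth behind the scaling laws.
# The ~6-month doubling time is an illustrative assumption, not a
# figure from the lecture.

def compute_after(years: float, doubling_time_years: float = 0.5) -> float:
    """Return the multiplier on training compute after `years`,
    assuming compute doubles every `doubling_time_years`."""
    return 2 ** (years / doubling_time_years)

# Under a 6-month doubling time, one decade of growth implies
# 2**20, roughly a million times more training compute.
print(f"{compute_after(10):,.0f}x")
```

Even modest changes to the assumed doubling time swing the decade-scale multiplier by orders of magnitude, which is why the trend, if it holds, puts such weight on access to cutting-edge chips.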

And it turns out we’re just getting started. We’re finding more and more sources of data to use in training AI systems. And most of the optimism about improvements in AI over the coming decades stems from the belief that if we find yet more data and train yet bigger AI systems, we’ll have better artificial intelligence as a result. And if you want bigger AI systems trained on more and more data, you need better semiconductors.

Because it’s in vast data centers full of the most advanced servers with the most advanced chips inside of them that all AI systems are trained. This is why the world’s biggest tech companies are more focused than ever before on getting access to the cutting edge AI chips that are used in training these extraordinarily capable systems. And it isn’t only tech companies that are focused on computing power as a core metric of success in the world of artificial intelligence. Governments are too.

China, for example, has articulated a goal of increasing the computing power in China by 50% over the coming years. Countries from India to the United Kingdom have set out similar goals. Because they believe, just like all the world’s large tech companies believe, that better AI systems require more computing power. And therefore, access to computing power, access to the cutting edge chips that enable the training of big AI systems will be critical not just for technological advances, but for their economic and their political future.

Distribution of Computing Power

If you look today at the distribution of computing power around the world, what you find is that there are two countries far ahead of the rest. Here’s a chart put together by the Blair Institute of the server installation base by country — how many servers each country has installed. And what you find is that there are two leaders by far.

First, the United States, far out in front of the pack. Second, China, clearly in second place, but also clearly ahead of the rest of the competitors. And it’s no surprise, looking at charts like this, that there are also two countries that are setting the terms of the race to build AI systems. These are the countries that have access to the extraordinary volumes of computing power that are needed to train AI systems.

And these are the countries that are competing right now to gain access to the cutting edge chips that advanced AI training requires. And it’s data centers like this that they’re focused above all on building. Data centers, vast warehouses full of thousands and thousands of chips and servers, through which data surges as AI systems are trained. Data centers like this are rewriting the wiring of the global economy.

They’re reshaping investment in many different countries. They’re attracting billions and billions of dollars from some of the world’s largest investors who are betting that access to cutting edge data centers will be economically critical in the future, because data centers like this are not only used for training AI systems, they’re also used for deploying them. That’s why AI data centers are being built not only in the United States and in China, but in many other regions of the world too: in Malaysia, the center of the data center boom in Southeast Asia, and in the Middle East, where Saudi Arabia and the United Arab Emirates are trying to become AI hubs in the developing world.

Many governments are focused on data centers like this as a tool of power in the modern world. And it’s forcing changes in how big tech companies operate. Because if you want a cutting edge data center like this, you need extraordinary volumes of energy to power it. More energy than data centers have ever consumed before.

Tech Companies as Energy Investors

And this is why some of the world’s biggest technology companies are now also becoming some of the world’s biggest energy investors. Whether it’s Microsoft or Meta, Alphabet or Amazon, they’re spending billions, and in some cases, tens of billions of dollars, building and buying power directly. Because in certain regions, like in the United States, we’re seeing the first increase in demand for electricity that we’ve seen in decades.