On top of that are all of the artificial intelligence networks that I talked about. There's the autopilot AI: perceiving the world, reasoning about where you are, reasoning about where everybody else is, driving the car, continuously mapping the surroundings and exchanging that information with the HD maps in the cloud, making sure that all the data is coherent and updating it if necessary. And there's the copilot, a deep neural net, so that there's an AI watching out for you all the time. In the near future it might even say something like, 'Jen-Hsun, you're driving up the hill. You'll be home in five seconds. Would you like me to go ahead and open up the gate?' So that by the time I actually get there, just a second prior, the gate opens up and I pull right into the garage. The AI copilot.
We also need the ability to converse and interact with our computer in a very natural way. So natural language processing has to be done inside the car, so that if the connection is not very good you can still communicate with the car, and so that the reaction time, the latency between your spoken words and the car's recognition of your speech, is as fast as possible. And yet it's also connected to an AI assistant in the cloud. On top of this architecture is an API called MapWorks. This is one of the most important things we do. MapWorks interacts with all of the mapping companies in the world, and it basically does four things.
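The hybrid pipeline described here, local recognition for low latency with a cloud assistant when the connection allows, can be sketched roughly as below. This is only an illustration of the design idea; every function name is hypothetical and not an Nvidia API.

```python
# Sketch of the hybrid speech pipeline: recognize locally so latency stays
# low even with a poor connection, and use the cloud AI assistant when it
# is reachable. All names here are illustrative, not real APIs.

def local_recognize(audio):
    # In-car speech model: fast and always available.
    return {"text": audio.lower(), "source": "local"}

def cloud_assistant(text, connected):
    # Cloud AI assistant: richer responses, but needs a connection.
    if not connected:
        raise ConnectionError("no connection to the cloud")
    return {"reply": f"assistant heard: {text}", "source": "cloud"}

def handle_utterance(audio, connected=True):
    # Step 1: recognition always happens inside the car.
    result = local_recognize(audio)
    # Step 2: prefer the cloud assistant, fall back to an in-car answer.
    try:
        return cloud_assistant(result["text"], connected)
    except ConnectionError:
        return {"reply": f"car heard: {result['text']}", "source": "local"}
```

With a good connection the cloud answers; with no connection the car still responds, which is the point of doing the recognition locally.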
We basically do four things with mapping companies. The first, of course, is the survey car. Some survey cars record all the data and do the processing on GPU supercomputers in the cloud to extract the three-dimensional data from the video. Other survey cars do the processing using a computer like Drive PX, either version with a discrete Nvidia GPU, so that the car can essentially do the mapping in real time. So that's the survey car.
Number two is the GPU supercomputer in the cloud for mapping. Number three is the interface, the exchange of data, so that our car can always see the HD map of its nearby surroundings. And number four, as we continuously map the world and notice changes, we update the live map in the cloud. These four functionalities are vital if self-driving cars are to be realized with very high confidence. And so we've been working with the world's leading mapping companies.
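The four functionalities above form a loop: survey, process, serve, update. A minimal sketch of that loop follows; the class and method names are invented for illustration and are not the actual MapWorks API.

```python
# Sketch of the four mapping functionalities. Hypothetical names throughout.

class HDMapCloud:
    """Functionalities 2-4: cloud processing, map serving, live updates."""
    def __init__(self):
        self.tiles = {}  # tile_id -> HD map data extracted from survey video

    def process_survey(self, tile_id, video_frames):
        # Functionality 2: a GPU supercomputer in the cloud extracts
        # three-dimensional data from the survey video.
        self.tiles[tile_id] = {"features": len(video_frames), "version": 1}

    def fetch_nearby(self, tile_id):
        # Functionality 3: the car can always see the HD map of its
        # nearby surroundings.
        return self.tiles.get(tile_id)

    def apply_change(self, tile_id, change):
        # Functionality 4: changes noticed on the road update the live map.
        tile = self.tiles[tile_id]
        tile["version"] += 1
        tile["change"] = change

class SurveyCar:
    """Functionality 1: record raw data, or map in real time in the car."""
    def __init__(self, cloud, onboard_gpu=False):
        self.cloud = cloud
        self.onboard_gpu = onboard_gpu

    def drive_tile(self, tile_id, video_frames):
        if self.onboard_gpu:
            # Drive PX-style in-car processing: upload the finished tile.
            self.cloud.tiles[tile_id] = {"features": len(video_frames),
                                         "version": 1}
        else:
            # Otherwise ship the raw video for cloud-side processing.
            self.cloud.process_survey(tile_id, video_frames)

cloud = HDMapCloud()
car = SurveyCar(cloud)
car.drive_tile("tile-42", ["frame"] * 3)
cloud.apply_change("tile-42", "lane closed")
```

Either kind of survey car (functionality 1) feeds the same cloud map, and the version bump on `apply_change` is what lets every other car know its local copy is stale.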
A few months ago we announced that the leading mapping company in China, Baidu, an incredible partner of ours, is working with us across all four functionalities that I described, from surveying all the way to map processing, all the way to synchronization with the local computer inside the car. The reason Baidu is so important is that every car company in the world, and any car you build, should be able to drive anywhere in the world. And China is now the world's largest car market. It is too large to ignore, and yet only a Chinese company can map China. And Baidu has mapped more of China than any other company. So partnering with Baidu was a very logical first step. We then partnered with TomTom, the leading mapping company in Europe. Today we're really super excited to announce that we're partnering with ZENRIN, the leading mapping company in Japan. Japan's roads are incredibly complicated, and the population is so dense that mapping Japan is quite an extraordinary task. ZENRIN is an amazing company and we're working with them to map Japan.
And also today we're super excited to announce that we're working with HERE to integrate Nvidia technology into their data centers for mapping, working with them on map algorithms, and connecting to all of the Nvidia AI car computers inside cars as we synchronize the live maps. Let's thank all of our partners for helping create this incredible dream.
Well, building this ecosystem, building this computer, is a gigantic effort. As I mentioned, this is the most complex computing problem we've ever tackled. This is high-performance computing done in real time, and the AI algorithms we're developing are all first of their kind. And yet the endeavor is of such great importance that there are so many companies who could really help us realize this dream. I'm super excited to announce today that ZF is now a partner in helping us turn this computer into a production computer for the automotive industry. ZF is the leading truck and commercial vehicle supplier in Europe. They're also one of the world's top five suppliers to the automotive industry. This is an extraordinary company, and they are the first to announce a production Drive AI computer to the market. It's available commercially for sampling and will ship into production this year. Ladies and gentlemen, ZF.
I have another partner I'd like to announce. There are a lot of cars, as we mentioned: a billion on the road, a hundred million sold each year, several hundred million trucks. It's going to take an enormous, enormous amount of engineering to transform the entire automotive industry into an autonomous industry. And so today we're announcing that the number-one automotive technology supplier in the world, Bosch, is going to adopt the Nvidia Drive computer. The largest and the fifth-largest automotive suppliers in the world have now adopted the Nvidia computing platform so that we can bring AI computers to the autonomous industry. Bosch is a 130-year-old company with 375,000 employees and over €70 billion in revenue, an enormous company. They serve every single car company in the world, with unbelievable reach, and it's just such a great pleasure to partner with them. I will be at Bosch ConnectedWorld in March, and with a little bit of luck we're going to give you guys a major update on the work we're doing together. So, Bosch.
Well, I just have one more announcement. The momentum behind the work that we're doing after all of these years is clearly accelerating, and I think people are realizing that creating an AI car computer is a really enormous undertaking. We've worked at it for quite a few years, and as you know, Nvidia is one of the few companies in the world that specializes in building the most advanced computers, from the largest supercomputers in the world to the most advanced gaming PCs to AI car computers. These computers are an enormous undertaking, and without a great car company to partner with, it is hard to realize this vision. And so today we're announcing that Audi and Nvidia will partner to build the next generation of AI cars.
Audi and Nvidia are building an AI car, the world's most advanced autonomous vehicle, powered by Nvidia's AI car computer, and we will have cars on the road by 2020. Let's welcome Scott Keogh, the President of Audi of America, to celebrate this moment with us.
Scott, it’s great to have you. Happy New Year!
Scott Keogh – President of Audi of America
Thank you. I am glad you saved the best for last, so I appreciate that.
Jen-Hsun Huang – Founder and CEO, Nvidia
You should always go last because you're always the best. Gosh, we've been working together for 10 years, and we've been building these advanced cars together for quite a long time. You know, frankly, when we first started working together the car had no internet. When we started working together there were no internet-connected maps, and there were no rich graphics inside the cars that we enjoy today. And because of the efforts we worked on together, there are now millions of cars, millions of Audis, driving all over the world, and we have really led in information technology. Now the next phase of information technology is artificial intelligence. When you think about artificial intelligence from the perspective of someone who's been in the automotive industry so long, what is the implication of AI for the automotive world? And what is the implication of AI for Audi?