And where it’s most critical of all – we’re talking about life and death – is in medicine. Computers are better than we are, as human beings, in several areas already today. We’re talking about here and now; this is not science fiction.
I’m speaking next week at the Universitätsklinikum in Essen, and their radiologists, who are supposed to be among the best radiologists in Germany, say that a computer can recognize a tumor on an MRI or a CT scan faster, better, and more precisely than a human being can. It’s image analysis, and it’s done very well by computers, especially in medicine, where it saves lives.
Now, the robots are getting better and better; they’re looking cute. They have these big baby eyes, a sweet way of looking at you. They can examine your facial expression and adjust theirs.
But don’t be fooled by robots, even when they get warm skin and perfume, and they start smelling like us and getting really interesting. They are still machines. They have no warm blood in them. There’s no sexuality in them. They have no mortality. They’re cold lines of code, and they shouldn’t be misunderstood.
Now, I want you to understand what the power of artificial intelligence is, and I have two examples: one is surveillance cameras. Everybody knows that we’re being watched by cameras everywhere, and most people think surveillance is a camera there, and it’s me down here, and it’s watching me: one person, one camera.
Well, that’s because we’re stupid. That’s the way we comprehend surveillance: one camera, one person. We can’t comprehend it when it goes beyond that.
(Video clip) Narrator: This image was taken 17,500 feet above Quantico, Virginia, and covers 15 square miles.
Yiannis Antoniades: This whole image is at a very, very fine resolution. So if we wanted to know what’s going on in any spot along this image, let’s say near this building at this intersection, everything that is a moving object is being automatically tracked. The colored boxes indicate that the computer has recognized the moving objects. You can see individuals crossing the street. You can see individuals walking in parking lots. There’s actually enough resolution to be able to see people waving their arms or walking around, and what kind of clothes they wear.
Narrator: Unlike the Predator’s camera, which has a limited field of view, ARGUS-IS fuses together the video from each of its 368 chips to create a 1.8-billion-pixel video stream. This makes it possible to zoom in and still see tremendous detail. (Video concludes)
And it produces a million terabytes every day. That’s a lot of data. I’m telling you this not because the sensors are modern, and not because the photography is modern, but because behind it there is a brain, a cognitive intelligence, and that brain is in a position to analyze everybody down there.
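To get a feel for the scale, here is a rough back-of-the-envelope calculation. The chip count and total pixel figure come from the clip above; the frame rate and bytes-per-pixel are assumptions made purely for illustration, so the daily total is only an order-of-magnitude sketch:

```python
# Rough scale check for the ARGUS-IS figures quoted above.
# From the clip: 368 image chips, 1.8 billion pixels total.
# Assumed for illustration: 12 frames per second, 1 byte per pixel.

chips = 368
total_pixels = 1_800_000_000

pixels_per_chip = total_pixels / chips
print(f"~{pixels_per_chip / 1e6:.1f} megapixels per chip")

fps = 12             # assumed frame rate
bytes_per_pixel = 1  # assumed 8-bit grayscale, uncompressed
bytes_per_day = total_pixels * bytes_per_pixel * fps * 60 * 60 * 24
print(f"~{bytes_per_day / 1e15:.1f} petabytes of raw video per day")
```

Even these conservative assumptions put the raw stream in the petabytes-per-day range, roughly a 5-megapixel sensor per chip; the exact daily total depends entirely on frame rate, bit depth, and compression.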
At the same time, in real time, they see where everyone is going. We can understand that when it’s reduced to a single person, but we can’t comprehend it when we’re talking about a hundred thousand people in a city, plus the vehicles, which are all recognized as well.
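The tracking idea behind “every moving object is automatically tracked” can be sketched as a toy nearest-centroid matcher. This is an invented, minimal illustration, not the actual classified software; real systems add detection, motion prediction, and re-identification:

```python
# Toy multi-object tracker: match each known track to the nearest new
# detection, and give unmatched detections fresh IDs. All numbers and
# the distance threshold are invented for illustration.

def track(prev_tracks, detections, max_dist=5.0):
    """prev_tracks: {id: (x, y)}; detections: list of (x, y) positions."""
    tracks = {}
    next_id = max(prev_tracks, default=0) + 1
    unmatched = list(detections)
    for tid, (px, py) in prev_tracks.items():
        if not unmatched:
            break
        # Nearest new detection to this existing track.
        best = min(unmatched, key=lambda d: (d[0] - px) ** 2 + (d[1] - py) ** 2)
        if (best[0] - px) ** 2 + (best[1] - py) ** 2 <= max_dist ** 2:
            tracks[tid] = best        # same object, updated position
            unmatched.remove(best)
    for d in unmatched:               # brand-new objects get new IDs
        tracks[next_id] = d
        next_id += 1
    return tracks

frame1 = track({}, [(0, 0), (10, 10)])      # two new objects appear
frame2 = track(frame1, [(1, 0), (10, 11)])  # both keep their identity
print(frame2)
```

The point of the sketch is the scaling problem the talk describes: the per-frame matching is trivial for two objects, but doing it continuously for a hundred thousand people and vehicles is what requires the “brain” behind the camera.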
Because of such systems, they have also re-engineered facial recognition. You probably think facial recognition works from the front, but they’ve redesigned it to work from above, because that’s where the drones are. They look at your ears, the way you walk, the top of your head – that’s modern facial recognition.
So, that’s one idea: as a human being, we think of one camera and one person.
Here’s another of its capabilities: it takes all the details, stores them, and can then play them back, so they can tell where a person was two weeks ago, two months ago, what stores he visited, what his whole behavioral pattern is. That’s all part of the Argus analysis. The second example is what are called “tennis balls” in military and intelligence circles. It’s a new, secretive sensor technology.
A cruise missile would fly into a valley in Afghanistan – this is especially important because the troops have left many of these areas – and would drop literally thousands of these sensor packages, or tennis balls, all packed in foam rubber. They record with cameras. They record with microphones. They record with seismic sensors.
They record with Geiger counters. They record with chemical sensors that can detect chemical agents. That’s not the amazing part, and neither is the fact that their signal goes to a transmitter and then up to a satellite: old technology, nothing special.
The special part is that behind that system there is fusion software that can combine the audio, the visual, the seismic, and the chemical signals, make sense of them, and analyze what kind of troop movements there are on the ground, what kinds of vehicles they’re using, what they’re transporting, and whether there’s radioactivity.
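The fusion idea can be sketched in a toy example. Every sensor name, threshold, and the confidence formula here is invented for illustration; the actual software is not public. The point is only the principle: independent sensor types agreeing is what turns raw readings into an assessment:

```python
# Toy multi-sensor fusion: combine normalized readings (0..1) from
# several sensor types into alerts plus a confidence score.
# All thresholds and the scoring rule are invented for illustration.

def fuse(readings):
    """readings: dict mapping sensor type to a normalized 0..1 signal."""
    alerts = []
    if readings.get("seismic", 0) > 0.6 and readings.get("audio", 0) > 0.5:
        alerts.append("vehicle movement")   # ground vibration + engine noise
    if readings.get("geiger", 0) > 0.3:
        alerts.append("radioactivity")
    if readings.get("chemical", 0) > 0.4:
        alerts.append("chemical agent")
    # Confidence grows when independent sensor types agree.
    confidence = min(1.0, sum(readings.values()) / len(readings) + 0.1 * len(alerts))
    return alerts, round(confidence, 2)

alerts, conf = fuse({"seismic": 0.8, "audio": 0.7, "geiger": 0.1, "chemical": 0.0})
print(alerts, conf)
```

Here the seismic and audio channels corroborate each other, so the sketch reports vehicle movement while the quiet Geiger and chemical channels stay silent; that cross-checking of unlike signals is what the talk means by making sense of them together.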