YUVAL NOAH HARARI: Yeah, I think that, I don’t know about the present, but looking to the future, it’s not the Mexicans or Chinese who will take the jobs from the people in Pennsylvania, it’s the robots and algorithms. So unless you plan to build a big wall on the border of California — the wall on the border with Mexico is going to be very ineffective.
And I was struck, when I watched the debates before the election, that Trump did not even attempt to frighten people by saying the robots will take your jobs. Now even if it’s not true, it doesn’t matter. It could have been an extremely effective way of frightening people and galvanizing people: “The robots will take your jobs!” And nobody used that line. And it made me afraid, because it meant that even though in universities and laboratories there is already an intense debate about it, in the mainstream political system and among the general public, people are just unaware that there could be an immense technological disruption — not in 200 years, but in 10, 20, 30 years.
And we have to do something about it now, partly because most of what we teach children today in school or in college is going to be completely irrelevant to the job market of 2040, 2050. So it’s not something we’ll need to think about in 2040. We need to think today what to teach the young people.
CHRIS ANDERSON: Yeah, no, absolutely. You’ve often written about moments in history where humankind has entered a new era, unintentionally. Decisions have been made, technologies have been developed, and suddenly the world has changed, possibly in a way that’s worse for everyone. So one of the examples you give in “Sapiens” is the whole agricultural revolution, which, for the actual person tilling the fields, meant picking up a 12-hour backbreaking workday instead of six hours in the jungle and a much more interesting lifestyle. So are we at another possible phase change here, where we kind of sleepwalk into a future that none of us actually wants?
YUVAL NOAH HARARI: Yes, very much so. During the agricultural revolution, what happened is that this immense technological and economic revolution empowered the human collective, but when you look at actual individual lives, the life of a tiny elite became much better, and the lives of the majority of people became considerably worse. And this can happen again in the 21st century.
No doubt the new technologies will empower the human collective. But we may end up again with a tiny elite reaping all the benefits, taking all the fruits, and the masses of the population finding themselves worse than they were before, certainly much worse than this tiny elite.
CHRIS ANDERSON: And those elites might not even be human elites. They might be cyborgs or –
YUVAL NOAH HARARI: Yeah, they could be enhanced super humans. They could be cyborgs. They could be completely nonorganic elites. They could even be non-conscious algorithms. What we see now in the world is authority shifting away from humans to algorithms. More and more decisions — about personal lives, about economic matters, about political matters — are actually being taken by algorithms.
If you ask the bank for a loan, chances are your fate is decided by an algorithm, not by a human being. And the general impression is that maybe Homo sapiens just lost it. The world is so complicated, there is so much data, things are changing so fast, that this thing that evolved on the African savanna tens of thousands of years ago — to cope with a particular environment, a particular volume of information and data — it just can’t handle the realities of the 21st century, and the only thing that may be able to handle it is big-data algorithms. So no wonder more and more authority is shifting from us to the algorithms.
CHRIS ANDERSON: So we’re in New York City for the first of a series of TED Dialogues with Yuval Harari, and there’s a Facebook Live audience out there. We’re excited to have you with us. We’ll start coming to some of your questions and questions of people in the room in just a few minutes, so have those coming. Yuval, if you’re going to make the argument that we need to get past nationalism because of the coming technological danger presented, in a way, by so much of what’s happening, then we’ve got to have a global conversation about this.
Trouble is, it’s hard to get people really believing that, I don’t know, AI really is an imminent threat, and so forth. The things that people, some people at least, care about much more immediately are, perhaps, climate change, or other issues like refugees, nuclear weapons, and so forth. Would you argue that, where we are right now, those issues somehow need to be dialed up? You’ve talked about climate change, but Trump has said he doesn’t believe in that. So in a way, your most powerful argument, you can’t actually use to make this case.