
Iyad Rahwan: What Moral Decisions Should Driverless Cars Make? (Transcript)


Iyad Rahwan – Australian-Syrian scientist
Today I’m going to talk about technology and society. The Department of Transport estimated that last year 35,000 people died from traffic crashes in the US alone. Worldwide, 1.2 million people die every year in traffic accidents. If there was a way we could eliminate 90% of those accidents, would you support it? Of course you would. This is what driverless car technology promises to achieve by eliminating the main source of accidents — human error.

Now picture yourself in a driverless car in the year 2030, sitting back and watching this vintage TEDxCambridge video. All of a sudden, the car experiences mechanical failure and is unable to stop. If the car continues, it will crash into a bunch of pedestrians crossing the street, but the car may swerve, hitting one bystander, killing them to save the pedestrians. What should the car do, and who should decide? What if instead the car could swerve into a wall, crashing and killing you, the passenger, in order to save those pedestrians? This scenario is inspired by the trolley problem, which was invented by philosophers a few decades ago to think about ethics.

Now, the way we think about this problem matters. We may, for example, not think about it at all. We may say this scenario is unrealistic, incredibly unlikely, or just silly. But I think this criticism misses the point, because it takes the scenario too literally. Of course, no accident is going to look like this; no accident has two or three options where everybody dies somehow. Instead, the car is going to calculate something like the probability of hitting a certain group of people; if you swerve in one direction versus another, you might slightly increase the risk to passengers or other drivers versus pedestrians. It’s going to be a more complex calculation, but it’s still going to involve trade-offs, and trade-offs often require ethics.
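
To make that kind of calculation concrete, here is a minimal, purely illustrative sketch in Python. It is not from the talk and not how any real vehicle is programmed: each candidate maneuver carries hypothetical estimates of who might be hit and how likely, and a controller compares the expected harm of each option. All names, probabilities, and group sizes below are assumptions for illustration only.

```python
# Toy illustration of a probability-weighted trade-off between maneuvers.
# Every number here is hypothetical; choosing what to minimize is itself
# the ethical question the talk is about.

from dataclasses import dataclass

@dataclass
class Outcome:
    group: str          # who is at risk (e.g. "pedestrians", "passenger")
    probability: float  # estimated chance this group is hit
    people: int         # how many people are in the group

def expected_harm(outcomes):
    """Sum of probability-weighted casualties for one maneuver."""
    return sum(o.probability * o.people for o in outcomes)

# Hypothetical estimates for the maneuvers in the scenario above.
maneuvers = {
    "continue straight": [Outcome("pedestrians", 0.9, 10)],
    "swerve toward bystander": [
        Outcome("bystander", 0.8, 1),
        Outcome("pedestrians", 0.1, 10),
    ],
    "swerve into wall": [Outcome("passenger", 0.95, 1)],
}

for name, outcomes in maneuvers.items():
    print(f"{name}: expected harm = {expected_harm(outcomes):.2f}")
```

Minimizing this single number is only one possible policy; weighting passengers differently from pedestrians, or refusing to swerve at all, are others, and that choice of objective is where the ethics comes in.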
