Zeynep Tufekci: We’re Building a Dystopia Just to Make People Click on Ads (Transcript)

Here is the full transcript of Techno-sociologist Zeynep Tufekci’s TED Talk: We’re Building a Dystopia Just to Make People Click on Ads…

Zeynep Tufekci – Techno-sociologist

So when people voice fears of artificial intelligence, very often, they invoke images of humanoid robots run amok. You know? Terminator? You know, that might be something to consider, but that’s a distant threat.

Or we fret about digital surveillance with metaphors from the past. “1984,” George Orwell’s “1984,” is hitting the bestseller lists again. It’s a great book, but it’s not the correct dystopia for the 21st century.

What we need to fear most is not what artificial intelligence will do to us on its own, but how the people in power will use artificial intelligence to control us and to manipulate us in novel, sometimes hidden, subtle and unexpected ways. Much of the technology that threatens our freedom and our dignity in the near-term future is being developed by companies in the business of capturing and selling our data and our attention to advertisers and others: Facebook, Google, Amazon, Alibaba, Tencent.

Now, artificial intelligence has started bolstering their business as well. And it may seem like artificial intelligence is just the next thing after online ads. It’s not. It’s a jump in category. It’s a whole different world, and it has great potential.

It could accelerate our understanding of many areas of study and research. But to paraphrase a famous Hollywood philosopher, “With prodigious potential comes prodigious risk.” Now let’s look at a basic fact of our digital lives: online ads. Right? We kind of dismiss them. They seem crude, ineffective.

We’ve all had the experience of being followed on the web by an ad based on something we searched or read. You know, you look up a pair of boots and for a week, those boots are following you around everywhere you go. Even after you succumb and buy them, they’re still following you around. We’re kind of inured to that kind of basic, cheap manipulation. We roll our eyes and we think, “You know what? These things don’t work.” Except, online, the digital technologies are not just ads.

Now, to understand that, let’s think of a physical world example. You know how, at the checkout counters at supermarkets, near the cashier, there’s candy and gum at the eye level of kids? That’s designed to make them whine at their parents just as the parents are about to sort of check out. Now, that’s a persuasion architecture. It’s not nice, but it kind of works. That’s why you see it in every supermarket.

Now, in the physical world, such persuasion architectures are kind of limited, because you can only put so many things by the cashier. Right? And the candy and gum, it’s the same for everyone, even though it mostly works only for people who have whiny little humans beside them.

In the physical world, we live with those limitations. In the digital world, though, persuasion architectures can be built at the scale of billions, and they can target, infer, understand and be deployed at individuals one by one by figuring out your weaknesses, and they can be sent to the private screen of everyone’s phone, so it’s not visible to us. And that’s different. And that’s just one of the basic things that artificial intelligence can do.

Now, let’s take an example. Let’s say you want to sell plane tickets to Vegas. Right? So in the old world, you could think of some demographics to target based on experience and what you can guess. You might try to advertise to, oh, men between the ages of 25 and 35, or people who have a high limit on their credit card, or retired couples. Right? That’s what you would do in the past.

With big data and machine learning, that’s not how it works anymore. So to imagine that, think of all the data that Facebook has on you: every status update you ever typed, every Messenger conversation, every place you logged in from, all your photographs that you uploaded there. If you start typing something and change your mind and delete it, Facebook keeps those and analyzes them, too.

Increasingly, it tries to match you with your offline data. It also purchases a lot of data from data brokers. It could be everything from your financial records to a good chunk of your browsing history. Right? In the US, such data is routinely collected, collated and sold. In Europe, they have tougher rules.

So what happens then is, by churning through all that data, these machine-learning algorithms — that’s why they’re called learning algorithms — they learn to understand the characteristics of people who purchased tickets to Vegas before. When they learn this from existing data, they also learn how to apply this to new people.
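To make that learn-then-apply step concrete, here is a deliberately tiny sketch, not anything resembling Facebook’s actual system: a toy nearest-neighbour classifier that “learns” the characteristics of people who bought Vegas tickets before and then classifies a new person. The features (age, credit limit) and all the data are invented for illustration.

```python
# Toy illustration of "learn from past purchasers, apply to new people".
# All names, features and data here are hypothetical.
from math import dist

# (age, credit-card limit in $1000s) -> bought a Vegas ticket before?
past_people = [
    ((28, 12), True), ((31, 15), True), ((27, 9), True),
    ((62, 5), False), ((70, 4), False), ((45, 6), False),
]

def likely_to_buy(person, k=3):
    """Classify a new person by majority vote of the k most similar past people."""
    neighbours = sorted(past_people, key=lambda p: dist(p[0], person))[:k]
    votes = sum(1 for _, bought in neighbours if bought)
    return votes > k // 2

print(likely_to_buy((29, 11)))  # resembles the past buyers -> True
print(likely_to_buy((68, 5)))   # resembles the past non-buyers -> False
```

The point of the sketch is only the shape of the pipeline: nothing here was programmed to know anything about Vegas; the classification falls out of whatever patterns sit in the historical data, which is exactly why the system can pick up on signals nobody intended.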

So if they’re presented with a new person, they can classify whether that person is likely to buy a ticket to Vegas or not. Fine. You’re thinking, an offer to buy tickets to Vegas. I can ignore that. But the problem isn’t that.

The problem is, we no longer really understand how these complex algorithms work. We don’t understand how they’re doing this categorization. It’s giant matrices, thousands of rows and columns, maybe millions of rows and columns. Not the programmers, not anybody who looks at it, even with all the data, understands anymore how exactly it’s operating, any more than you’d know what I was thinking right now if you were shown a cross section of my brain.

It’s like we’re not programming anymore, we’re growing intelligence that we don’t truly understand. And these things only work if there’s an enormous amount of data, so they also encourage deep surveillance on all of us so that the machine learning algorithms can work. That’s why Facebook wants to collect all the data it can about you. The algorithms work better. So let’s push that Vegas example a bit.

What if the system that we do not understand was picking up that it’s easier to sell Vegas tickets to people who are bipolar and about to enter the manic phase? Such people tend to become overspenders, compulsive gamblers. They could do this, and you’d have no clue that’s what they were picking up on. I gave this example to a bunch of computer scientists once and afterwards, one of them came up to me. He was troubled and he said, “That’s why I couldn’t publish it.”
