Joy Buolamwini – TRANSCRIPT
Hello, I’m Joy, a poet of code, on a mission to stop an unseen force that’s rising, a force that I call “the coded gaze,” my term for algorithmic bias. Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices. Let me show you what I mean.
(Video) Joy Buolamwini: Hi, camera. I’ve got a face. Can you see my face? No-glasses face? You can see her face. What about my face? I’ve got a mask. Can you see my mask?

Joy Buolamwini: So how did this happen? Why am I sitting in front of a computer in a white mask, trying to be detected by a cheap webcam? Well, when I’m not fighting the coded gaze as a poet of code, I’m a graduate student at the MIT Media Lab, and there I have the opportunity to work on all sorts of whimsical projects, including the Aspire Mirror, a project I did so I could project digital masks onto my reflection. So in the morning, if I wanted to feel powerful, I could put on a lion. If I wanted to be uplifted, I might have a quote.
So I used generic facial recognition software to build the system, but found it was really hard to test it unless I wore a white mask. Unfortunately, I’ve run into this issue before. When I was an undergraduate at Georgia Tech studying computer science, I used to work on social robots, and one of my tasks was to get a robot to play peek-a-boo, a simple turn-taking game where partners cover their face and then uncover it saying, “Peek-a-boo!” The problem is, peek-a-boo doesn’t really work if I can’t see you, and my robot couldn’t see me. But I borrowed my roommate’s face to get the project done, submitted the assignment, and figured, you know what, somebody else will solve this problem. Not too long after, I was in Hong Kong for an entrepreneurship competition.
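The "generic facial recognition software" described here is, at its core, a face detector: code that takes a webcam frame and returns bounding boxes for any faces it finds, which the Aspire Mirror would then use to place a digital mask. The talk does not name the actual library, so the following is only a minimal sketch of what such a pipeline commonly looks like, assuming OpenCV's stock Haar-cascade detector and a default webcam:

```python
# Minimal sketch of a generic webcam face-detection loop.
# Assumption for illustration only: OpenCV's bundled Haar cascade,
# which was trained on a fixed (not necessarily diverse) face dataset.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # the "cheap webcam"
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Returns a list of (x, y, w, h) boxes; an undetected face
    # simply yields an empty list, with no error raised.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Each detected face is where a digital mask overlay would go.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("detector", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

Note the failure mode this implies: when a detector's training data underrepresents certain faces, the code runs without error and the camera works fine, but detectMultiScale returns nothing, so no mask can be drawn. That is exactly the silent failure described above.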