Here is the transcript of Sasha Luccioni’s talk titled “AI Is Dangerous, But Not For The Reasons You Think” at TED conference.
In her TED talk, “AI Is Dangerous, But Not For The Reasons You Think,” Sasha Luccioni, an AI researcher, discusses the immediate, tangible impacts of AI, rather than hypothetical future risks. She highlights the environmental cost of AI models, emphasizing their substantial energy consumption and carbon emissions.
Luccioni also addresses issues of copyright infringement and the unauthorized use of artists’ and authors’ works in AI training. She delves into the biases inherent in AI systems, such as racial and gender biases, and their real-world consequences. Lastly, Luccioni advocates for the development of tools to understand, measure, and mitigate these impacts, urging a more responsible and sustainable approach to AI development.
So, I’ve been an AI researcher for over a decade. A couple of months ago, I got the weirdest email of my career. A random stranger wrote to me saying that my work in AI is going to end humanity. Now, I get it, AI is so hot right now.
The Impact of AI on Society
It’s in the headlines pretty much every day, sometimes because of really cool things like discovering new molecules for medicine or that dope Pope in the white puffer coat. But other times, the headlines have been really dark, like that chatbot telling a guy he should divorce his wife, or that AI meal-planner app proposing a recipe featuring chlorine gas. And in the background, we’ve heard a lot about doomsday scenarios, existential risks, and the singularity, with letters being written and events being organized to make sure that doesn’t happen.
Now, I’m a researcher who studies AI’s impacts on society, and I don’t know what’s going to happen in 10 or 20 years, and nobody really does. But what I do know is that there are some pretty nasty things going on right now, because AI doesn’t exist in a vacuum. It is part of society, and it has impacts on people and the planet.