
TRANSCRIPT: Ex-Google Officer Finally Speaks Out On The Dangers Of AI! – Mo Gawdat

This is the full transcript of Diary of a CEO podcast titled ‘Ex-Google Officer Finally Speaks Out On The Dangers Of AI! – Mo Gawdat’.

TRANSCRIPT:

Steven Bartlett: I don’t normally do this but I feel like I have to start this podcast with a bit of a disclaimer. Point number one, this is probably the most important podcast episode I have ever recorded. Point number two, there’s some information in this podcast that might make you feel a little bit uncomfortable. It might make you feel upset, it might make you feel sad. So I wanted to tell you why we’ve chosen to publish this podcast nonetheless and that is because I have a sincere belief that in order for us to avoid the future that we might be heading towards, we need to start a conversation.

And as is often the case in life, that initial conversation before change happens is often very uncomfortable but it is important nonetheless.

Before this episode starts, I have a small favor to ask from you. Two months ago, 74% of people that watch this channel didn’t subscribe. We’re now down to 69%. My goal is 50%. So if you’ve ever liked any of the videos we’ve posted, if you like this channel, can you do me a quick favor and hit the subscribe button? It helps this channel more than you know, and the bigger the channel gets, as you’ve seen, the bigger the guests get. Thank you and enjoy this episode.

Mo, why does the subject matter that we’re about to talk about matter to the person that’s just clicked on this podcast to listen?

Mo Gawdat: It’s the most existential debate and challenge humanity will ever face. It’s bigger than climate change, way bigger than COVID. This will redefine the way the world is in unprecedented shapes and forms within the next few years. This is imminent. We’re not talking 2040. We’re talking 2025, 2026.
