Sundar Pichai at Google I/O 2019 Keynote (Full Transcript)

Following is the full transcript of the Google I/O 2019 developer keynote, at which Google CEO Sundar Pichai and his team announced the company's latest products and services. The event took place on May 7, 2019, at Shoreline Amphitheatre in Mountain View, California, United States.

Speakers at the event:

Sundar Pichai – CEO, Google

Aparna Chennapragada – VP of Product for AR and VR, Google

Scott Huffman – Vice President, Google Assistant

Stephanie Cuthbertson – Senior Director for Android

Rick Osterloh – SVP of Hardware

Sabrina Ellis – VP of Product Management

Jeff Dean – Lead of Google AI

Lily Peng – Product Manager, Google AI Healthcare Team


Sundar Pichai – CEO, Google

Good morning. Good morning. Wonderful to be back here at Shoreline with all of you.

It’s been a really busy few months for us at Google. We just wrapped up Cloud Next in San Francisco with over 30,000 attendees, as well as YouTube Brandcast last week in New York.

Of course, today’s about you all, our developer community. And thank you all for joining us in person, and to the millions around the world watching on livestream.

I would love to say welcome in all our languages our viewers speak, but we are going to keep the keynote under two hours, especially since Barcelona kicks off against Liverpool at noon for you. That should be an amazing game.

Every year at I/O, we learn and try to make things a little bit better. That’s why we have lots of sunscreen — hope the sun comes out — plenty of water and shade. But this year, we want to make it easier for you to get around. So we are using AR to help.

To get started, open your I/O app and choose Explore I/O. And then you can just point your phone where you want to go. We really hope this helps you get around and answers the number one question people have: where the sessions are. Actually, it’s not that. They want to know where the food is. And we have plenty of it around.

We also have a couple of Easter eggs, and we hope you enjoy them as well. This is a pretty compelling use case. And we actually want to generalize this approach so that you can explore and navigate the whole world that way. There’s a lot of hard work ahead. And it’s a hard computer science problem. But it’s the type of challenge we love.

Tackling these kinds of problems is what has kept us going for the past 21 years. And it all begins with our mission to organize the world’s information and make it universally accessible and useful.

And today, our mission feels as relevant as ever. But the way we approach it is constantly evolving. We are moving from a company that helps you find answers to a company that helps you get things done.

This morning, we’ll introduce you to many products built on a foundation of user trust and privacy. And I’ll talk more about that later.

We want our products to work harder for you, in the context of your job, your home, and your life. And they all share a single goal: to be helpful, so we can be there for you in moments big and small over the course of your day. For example, helping you write your emails faster with automatic suggestions from Smart Reply, and giving you the chance to take them back if you didn’t get it right the first time; helping you find the fastest route home at the end of a long day; and when you get there, removing distractions so that you can spend time with the people most important to you.

And when you capture those perfect moments, backing them up automatically so you never lose them.

Simply put, our goal is to build a more helpful Google for everyone. And when we say “helpful,” we mean giving you the tools to increase your knowledge, success, health, and happiness. We feel so privileged to be developing products for billions of users. And with that scale comes a deep sense of responsibility to create things that improve people’s lives.

By focusing on these fundamental attributes, we can empower individuals and benefit society as a whole. Of course, building a more helpful Google always starts with Search and the billions of questions users trust Google with every day. But there is so much more we can do to help our users.

Last year, we launched a new feature in Google News called Full Coverage. And we have gotten great feedback on it from our users. We’ll be bringing Full Coverage directly to search to better organize results for news-related topics. Let’s take an example.

If you search for “black hole,” we’ll surface the relevant top news. It was in the news recently. We use machine learning to identify different types of stories and give you a complete picture of how a story is being reported from a wide variety of sources. You can click into Full Coverage. It surfaces a breadth of content, but allows you to drill down into what interests you.

You can check out different aspects of the story, like how the black hole got its name. You can even now see a timeline of events. And we’ll be bringing this to search later this year.

Podcasts are another important source of information. And we’ll be bringing them directly to search as well. By indexing podcasts, we can surface relevant episodes based on their content, not just the title. And you can tap to listen right in search results, or you can save an episode for listening later on your commute or your Google Home.

These are all examples of how we are making search even more helpful for our users, surfacing the right information in the right context. And sometimes, what’s most helpful in understanding the world is being able to see it visually.

To show you how we are bringing you visual information directly in search, here’s Aparna.

Aparna Chennapragada – VP of Product for AR and VR, Google

Whether you’re learning about the solar system or trying to choose a color scheme for your home, seeing is often understanding.

With computer vision and augmented reality, the camera in our hands is turning into a powerful visual tool to help you understand the world around you.

So today, we are excited to bring the camera to Google search, adding a new dimension to your search results — well, actually three dimensions, to be precise. So let’s take a look.

Say you’re a student studying human anatomy. Now, when you search for something like muscle flexion, you can view a 3D model built by Visible Body right from the search results. Pretty cool.

Not only that, you can also place it in your own space. Look, it’s one thing to read about flexion or extension, but seeing it in action right in front of you while you’re studying the concept, very handy.

OK, let’s take another example. Say, instead of studying, you’re shopping for a new pair of shoes. That happens. With New Balance, you can look at shoes up close from different angles, again, directly from Search. That way, you get a much better sense for things like what the grip looks like on the sole, or how the shoes match with the rest of your clothes.

OK, this last example is a really fun one. So you may have all seen a great white shark in the movies. “Jaws,” anyone? But what does it actually look like up close? Let’s find out, shall we?

I have Archana here with me to help with the demo. So let’s go ahead and search for “great white shark” on Google. As you scroll through, you get information on the knowledge panel facts, but also see the shark in 3D directly from the knowledge panel.
