Google I/O 2016 Keynote Full Transcript

Here is the full transcript of the Google I/O 2016 conference keynote – the company’s annual developer conference held at the Shoreline Amphitheatre in Mountain View on May 18, 2016.

 

Speakers:

Sundar Pichai – CEO, Google

Mario Queiroz – Vice President of Product Management, Google

Erik Kay – Engineering Director at Google

Rebecca Michael – Head of Marketing, Communication Products at Google

Dave Burke – VP of Engineering, Android

Clay Bavor – VP, Virtual Reality at Google

David Singleton – Director, Android Wear

Jason Titus – VP, Developer Products Group at Google

Stephanie Cuthbertson – Group Product Manager, Android Studio

Ellie Powers – Product Manager of Google Play

 

 


Sundar Pichai – CEO, Google

Welcome! Welcome to Google I/O and welcome to Shoreline. It feels really nice and different up here. We’ve been doing it for many, many years in Moscone, and in fact, we’ve been doing I/O for 10 years, but I feel we are at a pivotal moment in terms of where we are going as a company and felt it appropriate to change the venue.

Doing it here also allows us to include a lot more of you. There are over 7,000 of you joining in person today. And later today, after the keynote, you’ll be joined by several Googlers, product managers, engineers, and designers, so hopefully you’ll engage in many, many conversations over the three days.

As always, I/O is being live streamed around the world. This year we have the largest-ever audience. We are live streaming this to 530 external events in over a hundred countries around the world, including Dublin, which is a major tech hub in Europe; Istanbul, which has our oldest Google Developer Group; and even Colombo, Sri Lanka, which has the largest attendance outside of the US, with 2,000 people.

Our largest developer audience on the live stream today is from China, with over 1 million people tuning in live, so welcome to those users as well.

We live in very, very exciting times. Computing has had an amazing evolution. Stepping back, Larry and Sergey founded Google 17 years ago with the goal of helping users find the information they need. At the time, there were only 300 million people online. Most of them were on big physical computers, on slow internet connections.

Fast-forward to today: thanks to the rate at which processors and sensors have evolved, it is truly the moment of mobile. There are over 3 billion people connected, and they are using the internet in ways we have never seen before. They live on their phones. They use them to communicate, learn new things, gain knowledge, and entertain themselves. They tap an icon and expect a car to show up. They talk to their phones and even expect music to play in the living room, or sometimes groceries to show up at the front door.

So we are pushing ourselves really hard so that Google is evolving and staying a step ahead of our users. All the queries you see behind me are live queries coming in from mobile. In fact, today, over 50% of our queries come from mobile phones. And the queries in color you see behind me are voice queries. In the US, on our mobile app and on Android, one in five queries — 20% of our queries — are voice queries, and that share is growing.

Given how differently users are engaging with us, we want to push ourselves and deliver them information, rich information, in the context of mobile. This is why, if you come to Google today and search for Beyoncé, you don’t just get ten blue links. You get a rich information card with music. You can listen to her songs, find information about her upcoming shows, and book tickets right there.

You can come and ask us different kinds of queries, like the presidential elections or the Champions League, and we again give you rich, in-depth information. And we do this across thousands and thousands of categories, globally, at scale. You can come to Google looking for news as well. For example, if you’re interested in Hyperloop, an exciting technology, we give you information with AMP pages right there in the search results; they load instantly and you can scroll through them.

Amazing to see how people engage differently with Google. It’s not enough just to give them links. We really need to help them get things done in the real world. This is why we are evolving search to be much more assistive. We’ve been laying the foundation for this for many, many years through investments in deep areas of computer science. We built the Knowledge Graph. Today we have an understanding of 1 billion entities: people, places, and things, and the relationships between them in the real world.

We have dramatically improved the quality of our voice recognition. We recently started training on data sets with noisy backgrounds added deliberately, so that we can hear people more accurately. The quality has recently improved by 25%.

Image recognition and computer vision. We can do things we never thought we could do before. If you’re in Google Photos today and you search for “hugs”, we actually pull up all the pictures of people hugging in your personal collection. We have recently extended this to videos, so you can say “show me my dog videos” and we actually go through your videos and pull out the ones with your dog.
