
Missed Google’s ‘Made by Google’ keynote event? Well, here is the full transcript of Google’s Pixel launch – the ‘Made by Google’ hardware event where the company unveiled its Pixel and Pixel XL smartphones. The event took place on October 4, 2016, in San Francisco.
Speaker(s):
Dinesh – HBO sitcom Silicon Valley
Gilfoyle – HBO sitcom Silicon Valley
Sundar Pichai – CEO, Google
Rick Osterloh – SVP, Hardware Group
Brian Rakowski – VP, Product Management
Sabrina Ellis – Director of Product Management at Google
Clay Bavor – Head of VR at Google
Adrienne McCallister – Google Director of Global Partnerships for VR/AR
Mario Queiroz – Google Vice President of Product Management
Rishi Chandra – Google group product manager
Scott Huffman – Engineering Director at Google
TRANSCRIPT:
Dinesh: The Google keynote is about to start.
Gilfoyle: Okay.
Dinesh: You’re gonna watch?
Gilfoyle: No.
Dinesh: Why? Don’t you want to know what it’s about?
Gilfoyle: I know what it’s about.
Dinesh: You don’t know what it’s about, they’re going to tell everyone what it’s about during the keynote.
Gilfoyle: No, they’re going to tell everybody who wasn’t on the beta what it’s about.
Dinesh: There is no beta. Was there a beta? Did you get on the beta? How did you get on the beta? Why wasn’t I on the beta?
Gilfoyle: They only give the beta to qualified industry professionals.
Dinesh: Oh yeah, well then what am I? Don’t answer that…
Gilfoyle: A virgin.
Dinesh: Duh…
Gilfoyle: Next question.
Dinesh: You can be a qualified industry professional and a virgin, in fact it helps, but I’m not a virgin.
Gilfoyle: Okay. I’ll take your word for it.
Dinesh: Wait, what’s in the box?
Gilfoyle: There was no box.
Dinesh: Is that the keynote thing, the Google thing is in the box? Is it a thing?
Gilfoyle: Is a box a thing? Hmmm…
Dinesh: No if — Is the thing in the box a thing?
Gilfoyle: Do you have any other questions?
Dinesh: Is it multiple things….?
Gilfoyle: (No response)
Dinesh: I don’t care, it’s fine, I’m cool.
Gilfoyle: Got to go.
Dinesh: Where you going?
Gilfoyle: Well, since you asked.
Dinesh: Google after party…
Gilfoyle: Hmmm
Dinesh: That sucks.
Gilfoyle: And I heard Larry is going to fire Sergey out of a cannon.
Dinesh: You know what? Maybe I will wait up for you. What are you going to do about it?
Gilfoyle: Probably have a really great night without you. If you want to know.
(Door closes)
Dinesh: Stupid Google keynote. Oh, here we go!
[Google: Hi Doug, how can I help?]
Sundar Pichai – CEO, Google
Good morning. Thank you for being here. We are being joined by many people at global launch events around the world, so welcome to all of them as well. We also have a live stream, and a lot of folks are joining on the live stream, including Dinesh, I’m told. So welcome to everyone.
This space used to be a power station for a chocolate factory in the 1900s. So we have transformed it pretty well, but I’m glad we have a historic setting as we talk about what we are building for the future.
At I/O earlier this year, we talked about our vision for that future. We are at a seminal moment in computing. If you step back and think about it, computing has always had big shifts every 10 years or so. It all started in the early ‘80s, when the personal computer reached the mainstream. It was the first time computing touched the lives of many people and revolutionized the way they worked.
Roughly a decade later, in the mid ‘90s, the Web arrived. It is arguably the biggest platform shift we have seen in our lifetimes. It brought the Internet to many more people, radically changed industries, fundamentally changed how people interact and connect with each other.
Mobile first to an AI first world
About 10 years later, in the mid 2000s, with the advent of the smartphone, we had the mobile revolution. That brought computing to probably now around half the world’s population. And it’s profoundly changing people’s lives. And the shift continues. In fact, when I look ahead at where computing is headed, it’s clear to me that we are evolving from a mobile first to an AI first world.
What do I mean by that? In this world, computing will be universally available. It will be there everywhere, in the context of a user’s daily life. People will be able to interact with it more naturally and seamlessly than ever before. And above all else, it will be intelligent. It will help users in more meaningful ways.
At Google, we are very, very excited about this shift, and we have been working towards it for a long time. At the heart of these efforts is our goal to build a Google Assistant. We spoke about the Assistant earlier at Google I/O this year. We envision the Assistant as a two-way conversation, a natural dialogue between our users and Google that helps them get things done in the real world. The Assistant will be universal; it will be available whenever users need it to help them. And our goal is to build a personal Google for each and every user. Just like we built a Google for everyone, we want to build each user his or her own individual Google.
To capture our vision for what the Google Assistant is, we put together a short video. So let’s take a look.
[Video Starts: When we started, we made this for everyone so that everyone could find anything they need among the millions of zillions of things in the world. Today, it sometimes feels like you need a little help with the stuff just in your own world: your photos, phone, videos, calendars, messages, friends, trips, reservations, and so on, and so on. Wouldn’t it be nice if you had some help with all of that? Wouldn’t it be nice if you had a Google for your world? That’s why we are building the Google Assistant. Hi, Amy, how can I help? You ask it what you need. Okay, Google, what do I have to do today? And your assistant understands and helps you out. You can even carry on a conversation with it. How long will it take to get to downtown Chicago from home? What restaurants are there? Book a table at Cortina restaurant. Sure. And the assistant is always there for you, so if you are on the road you can ask it where to fill up. And if you are at home you can ask it to play some music. Or if you are in a chat with a friend, it can show you what is playing tonight. It’s like your own personal Google. Naturally, anything you share with it is safe and secure. The more you use your Google Assistant, the more useful it becomes. Remember, my bike combo is 326. Got it. And soon you will be able to access it from all sorts of places, so it will be everywhere you are. We made this for everyone. And today we are making this just for you. Hi, how can I help? Meet your Google Assistant. – Video Ends]
So as you can see that’s our early vision for how we want to build the Google assistant. We are just getting started but in many ways we have been working hard at this problem ever since Google was founded 18 years ago. We have invested in deep areas of computer science. Today our Knowledge Graph has over 70 billion facts about people, places and things and we can answer questions based on that.
Our natural language processing is what helps us make Google truly conversational with our users. And we have built state-of-the-art machine translation, image recognition and voice recognition systems. And each of these areas is being turbo charged by the progress we are seeing with Machine Learning and AI.
A few months ago, we captured the world’s attention when DeepMind’s AlphaGo won the World Go championship against Lee Sedol, one of the finest players of our generation. It showed the external world that a moment for AI had arrived, but for us, the progress has been continuous and the strides are huge. In fact, in the three months since AlphaGo played that game, we have had meaningful launches that show how Machine Learning is impacting the products we build.
Image captioning
Let me talk about three examples, all of which we have talked about in the past three months since the AlphaGo moment. First, image captioning. Image captioning is how computers try to make sense of the images they look at. We first launched our Machine Learning system for this in 2014. It was a state-of-the-art system, and our quality was just over 89%. With our newer Machine Learning systems, the quality is close to 94%. 4% may not sound like much to you, but first, it’s really hard to increase quality at these levels because we are trying to approach human-level accuracy.
And, second, every single percentage point translates into meaningful difference for our users. So, for example, if you take a look at the picture behind me, about two years ago, we used to understand this picture as a train is sitting on the tracks. Now, we understand the colors so we describe it as a blue and yellow train is traveling down the tracks.
Or if you look at this picture, two years ago we understood it as a brown bear is swimming in the water. Now our computing systems can count, so we understand this is two brown bears sitting on top of rocks. Advances like this are what help us, when you are in Google Photos, find the exact pictures you are looking for and be a better assistant to you.
Machine translation
Another example: machine translation. We have been doing machine translation for a while. Historically, our systems have been statistics based, and we translate at a phrase-by-phrase level. So we translate individual phrases and combine them to form a translation. If you look at this Chinese-to-English translation, you can see it makes sense, but it’s not quite the way humans would translate it. Just recently we announced our first end-to-end, self-learning, deep learning machine translation systems. Rather than working at a phrase level, they take entire sentences as inputs and model entire sentences as outputs. That’s what you see in the middle, and you can see it approaching human-level translation. You can look at this quantitatively, and we have ways to measure these things quantitatively. If you look at our previous phrase-based system, it was quite far from the human translation, and we have closed a significant gap with our new Machine Learning systems.
In fact, the progress for Chinese to English is so significant that last week we rolled it out in production. So today, if you pick up the Google Translate mobile app and translate from Chinese to English, you are using our newer Machine Learning systems, and the progress has been amazing. We’ll literally translate billions of queries over the coming year. This is what will help us if a user in Indonesia is using the Google Assistant: we can find the right answer even if it doesn’t exist in their local language, translate it on the fly and get it to them.
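To make the phrase-level versus sentence-level distinction concrete, here is a minimal, illustrative sketch of an end-to-end encoder-decoder translator in PyTorch. This is not Google’s production system; every name and size below is an assumption chosen for brevity.

```python
# A minimal sketch (not Google's production system) of an end-to-end sequence-to-sequence
# translator: the whole source sentence is encoded before any target word is produced,
# unlike a phrase-based system that translates fragments independently.
import torch
import torch.nn as nn

class TinySeq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, hidden=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, hidden)
        self.tgt_emb = nn.Embedding(tgt_vocab, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # Encode the entire source sentence into a single state...
        _, state = self.encoder(self.src_emb(src_ids))
        # ...then decode the entire target sentence conditioned on that state.
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), state)
        return self.out(dec_out)              # vocabulary scores for each target position

model = TinySeq2Seq(src_vocab=8000, tgt_vocab=8000)
src = torch.randint(0, 8000, (1, 12))         # one 12-token source sentence (toy ids)
tgt = torch.randint(0, 8000, (1, 10))         # the target sentence so far
print(model(src, tgt).shape)                  # torch.Size([1, 10, 8000])
```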
Text to speech
Another example: text to speech. Text to speech is what we call it when computers read something aloud back to you. So when you ask Google a question like “Who is the Prime Minister of Canada?”, we take the answer text and try to make it sound as natural as possible for you.
[Google Assistant: Justin Trudeau is the prime minister of Canada]
So this is text to speech. The way we do it today is we get a speaker into our recording studios and record them for thousands of hours. We make them say short phrases and then combine those to sound as natural as possible. Again, deep learning is showing the way. DeepMind just published a paper on a new technology called WaveNet. It’s a deep learning model where, rather than modeling phrases, they actually model the underlying raw audio waves to generate a much more natural sound. You can again see the WaveNet model is getting much closer to human speech. To me, the reason this gets exciting is that today all we can do is a single voice for the assistant, for all contexts. Doing this is what will enable us to have multiple voices, multiple personalities, get the assistant to differentiate between German and Swiss German, and one day even capture emotions when speaking to you. This is key to our vision of building an individual Google for each user, and more importantly, the assistant will continuously get better as we make progress with Machine Learning and AI.
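As a rough illustration of what “modeling the underlying raw audio waves” means, here is a toy Python sketch of sample-by-sample autoregressive generation. The model function is a stand-in (a decaying sine wave), not DeepMind’s WaveNet; it only shows that each output sample is produced one at a time, conditioned on everything generated so far, instead of stitching together recorded phrases.

```python
# Toy sketch of sample-level (WaveNet-style) generation versus concatenating recordings.
# The "model" below is a placeholder, not a learned network.
import numpy as np

def next_sample(history: np.ndarray) -> float:
    """Stand-in for a learned model p(x_t | x_1..x_{t-1}); here just a decaying sine."""
    t = len(history)
    return 0.5 * np.sin(2 * np.pi * 220 * t / 16000) * np.exp(-t / 16000)

def generate(seconds: float = 0.01, rate: int = 16000) -> np.ndarray:
    # Audio is produced one raw sample at a time, each conditioned on everything before it.
    audio = np.zeros(0)
    for _ in range(int(seconds * rate)):
        audio = np.append(audio, next_sample(audio))
    return audio

print(generate().shape)  # (160,) samples at an assumed 16 kHz rate
```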
It is early days but we are committed to this vision and we are going to work on it for a long time. But it’s equally important to get the assistant in the hands of our users. And that’s what today is about.
In fact, we started doing that about two weeks ago with our new messaging app, Google Allo, in which users can invoke the Google Assistant in group conversations. And the early reception has been great. People are interacting with it very naturally, asking questions we expected, like “tell me a joke,” but also questions we didn’t expect, like “what is the meaning of life?” So it’s early days, but the assistant continuously learns from this experience and keeps getting better.
If you remember, our vision for the Google Assistant is to be universal, to be there everywhere the user needs it to be, which is why today we are going to bring the Assistant to two more surfaces: one in the context of the phone, which you always carry with you, and one in the context of your home.
To talk about the Google Assistant in new hardware products, let me invite Rick Osterloh, the head of our newly formed hardware group.
Rick Osterloh – SVP, Hardware Group
Thank you. Good morning. I am Rick. It’s an honor to be here today representing the hard work of so many of my colleagues. Well, I’ve been doing hardware for a long time and even I smile like a kid every time I get to unbox a new gadget.
Since I joined Google, one of the questions I get asked most often is: why should we build hardware? I mean, we often joke that building hardware is, well, hard. People have strong, emotional connections to the products they rely on every day. They are an important part of our users’ lives. But the rise in the volume and complexity of all of this information makes this the right time to be focused on hardware and software. Let’s think about that for a moment.
At the peak of film photography, 80 billion photos were captured every year. But last year thanks to smartphones, 1 trillion photos were taken. Communications has gotten similarly complex. 328 billion items were delivered by the Post Office last year. And that’s compared to 50 trillion emails and mobile messages. And today people want more than a thousand songs in their pocket. What they want is the entire world’s music collection with them at all times. These informational changes mean the technology needs to be smart, and just work for you. This is why we believe the next big innovation is going to take place at the intersection of hardware and software with AI at the center. That’s where we have the biggest opportunity to bring people the very best of Google as we intended it.
Building hardware and software together lets us take full advantage of capabilities like the Google assistant. It lets us harness years of expertise we have built up in Machine Learning and AI to deliver the simple, smart and fast experiences that our users expect from us. It allows Google to be helpful to people where they need us no matter what the context or form factor.
As you will see today, we are building hardware with the Google assistant at its core so you can get things done without worrying about the underlying technology. Our devices just work for you, whether you are at home, with family, commuting to work, out for a jog or spending time with friends. This is something that Google has always stood for.
Hardware isn’t a new area for Google, but now we are taking steps to showcase the very best of Google across a family of devices designed and built by us. This is a natural step and we are in it for the long run. You are going to hear much more from our team in the months and years to come, and we have lots in store for you today.
Pixel
So let’s get started, first with phones. Phones are the most important device we own. They rarely leave our side. They are literally most people’s lifeline to the Internet and to each other. So today I am very excited to introduce you to a new phone made by Google. We call it Pixel. For those of you who followed Google for a while, that name might sound familiar to you. For us, it’s always represented the best of hardware and software designed and built by Google.
Let me tell you a little bit more about Pixel. We designed everything about Pixel from the industrial design to the user experience. Everything is simple and easy to use — something Google has always stood for. I really love how this phone looks and feels. The rear glass creates a bold iconic element that gives Pixel personality and character. The polished aluminum case gives the phone a distinct look, and there’s a subtle wedge from the top to bottom that keeps it thin where your hand most naturally grips it. And there’s no unsightly camera bump.
And while Pixel is beautiful, what really makes it come to life is how the hardware and software work together. It’s the perfect example of how the best of Google smarts combines to make a great, simple user experience. So today we’re going to tell you about 5 things.
First, we’re excited to announce that Pixel is the first phone with the Google Assistant built in. Second is Pixel’s terrific photography experience. Third is how we use Google Cloud so you never run out of space for those great photos. Fourth is how we let people talk to each other much more easily, no matter what operating system or device they use. And finally, how Pixel is made for mobile Virtual Reality.
To tell you more about Pixel, I’d like to invite my colleague, Brian, to the stage.
Brian Rakowski – VP, Product Management
Hi, I’m Brian. I lead the software product management team for Pixel. There have been so many people who worked so hard, so I feel lucky to be the one to show you what we’ve built.
So this phone was designed inside and out to be simple and smart, and I think you’ll notice that right from the Home screen. Let’s switch to a demo. The first thing you’ll see with the new Pixel launcher is round icons, access to all of your apps just a swipe away, and a clean, polished look. And like Rick said, Pixel is the first phone with the Google Assistant built in. Having the Assistant with you all through your day makes so many tasks incredibly easy. You can just touch and hold the phone’s Home button or say the hotword, and the Assistant jumps into action. Whether you’re on your home screen or in any app, you can always ask your Assistant for information or help with tasks.
Let me show you how it works. Getting ready for today’s launch, I’ve been spending a little bit more time at the office than usual, so I’m looking to plan something with family. And fall is such a great time in the Bay Area, I can get some ideas for the weekend with a long press of the Home button and by saying: Show me my photos from last October?
[Google Assistant: Take a look at these pictures from your Google Photos.]
It’s pretty cool, right? With a quick voice command my assistant found just the right pictures from my Google Photos collection, and it doesn’t just work with dates, you can search for people, places and things too.
So this concert at the Greek Theater at UC Berkeley was pretty fun. So let’s see what shows are coming up. “Show me upcoming events at the Greek Theater in Berkeley?”
So let’s talk about what’s just happened there. Not only did the assistant recognize the venue I asked about, but it was able to serve up relevant information thanks to Google’s Knowledge Graph.
OK, looks like The Lumineers are playing. I like their music, but I haven’t heard their latest stuff. “Play me a song by the Lumineers”
Now the Assistant knows I like to listen to music on YouTube, so that’s the app it opens up.
[Music]
All right. That will be a pretty fun concert. But let’s see if my wife even remembers who I am after three weeks of being MIA. Text Lisa.
[Google Assistant: Text Lisa Rakowski. Sure, what’s the message?]
Brian Rakowski: Want to go see The Lumineers in Berkeley on Friday?
Google Assistant: Got it. Do you want to send it or change it?
Brian Rakowski: Send it.
Google Assistant: Okay. Message sent.
Brian Rakowski: Now I just sent that message as an SMS. But the Assistant also works with other messaging apps like WhatsApp and Viber. Or I could have asked the Assistant to call Lisa instead. It looks like she responded. Thankfully she wants to go and she is recommending we get dinner at Marzano first. I haven’t been there before, so let’s try to get some more info. Now, watch this. The assistant can also help get information based on what’s on my screen. So I will do the same long press gesture on home and then swipe up to get contextually relevant information.
So here is what just happened. The assistant recognized the restaurant in Lisa’s text message and assembled all this useful information into a single card from the apps I have installed on the phone. I didn’t have to say or type a single word.
So let’s get some more information from Google Maps. There is the location. You can swipe up, check out some pictures. Okay, it looks pretty good. But let’s see if it’s close enough to the concert. How far is it from the Greek Theater in Berkeley?
Google Assistant: If you drive Marzano is 6.4 miles away from the Greek Theater.
Brian Rakowski: Again, the assistant understood the context of my screen to answer my question. So it’s a pretty quick drive, it will totally work. Let’s get reservations. Make a reservation at Marzano for Friday.
Google Assistant: Sure, let’s make a reservation at Marzano with OpenTable. How many people in your party?
Brian Rakowski: Two people.
Google Assistant: And what time do you want to reserve?
Brian Rakowski: 6:00 pm.
Google Assistant: Okay. I have completed the reservation, so you are all set. Look for an email confirmation from OpenTable with all of the details.
Brian Rakowski: So that’s just a glimpse of how the Google Assistant works. It can help with the big things like weekend plans, but it’s also great for little things like getting through your day, finding information, getting around, setting reminders for yourself and so much more. The assistant is incredibly useful when it’s built right into your phone, and we’ll show you how it works in other contexts later too.
The Google Assistant on Pixel is a great example of how hardware and software come together beautifully. Another is the camera. There’s actually an industry group, DxOMark, that rates the camera quality of almost all popular DSLR and smartphone cameras on the market. These guys are just obsessed with cameras, lenses and image quality. We’re proud to report that Pixel received a rating of 89. That’s the highest rating ever for a smartphone.
So let me put that score into context. This isn’t only the best camera we’ve ever made, it’s the best smartphone camera anyone has ever made. So how did we achieve this? Our teams of photography gurus and image processing experts have spent the last year designing and tirelessly optimizing the entire camera stack. Pixel has a 12.3-megapixel rear-facing camera, featuring an f/2.0 aperture and big 1.55-micron pixels to capture significantly more light than other cameras.
The camera on Pixel is just excellent. Our amazing camera team has written some incredible on-device software algorithms that do things you just can’t do with great hardware alone. For instance, Smart Burst is a feature that lets you capture just the right moment. By holding down the shutter button, you can capture a continuous stream of images and let Google intelligence select the sharpest, clearest photo, of just the right moment. There is HDR+, it’s built to work in any light, so you get clear, vivid pictures even in challenging conditions. All the pictures you see here were taken on a Pixel.
Now traditional cameras use a single long exposure in low light, but HDR+ splits that into multiple short exposures, aligning them algorithmically, and then combining each pixel. This technique reduces noise, minimizes blur, and gives you extended dynamic range, as in this example preserving both the dimly lit people in the foreground and the beautiful sunset behind them. So HDR+ clearly improves image quality. And with Pixel the camera uses HDR+ by default because there is zero shutter lag. That’s a big deal, and it processes images twice as fast so you can keep shooting, rapid fire.
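A minimal sketch of the burst-merge idea described above: capture several short exposures, align them to a reference frame, and average per pixel so noise drops while each frame stays short enough to avoid blur. The real HDR+ pipeline is far more sophisticated (tile-based alignment, robust merging, tone mapping); everything below is an illustrative assumption.

```python
# Toy burst merge: align short exposures to a reference, then average per pixel.
import numpy as np

def align(reference: np.ndarray, frame: np.ndarray) -> np.ndarray:
    """Very crude global alignment: try small shifts and keep the best match."""
    best_err, best_shift = np.inf, (0, 0)
    for dy in range(-2, 3):
        for dx in range(-2, 3):
            err = np.mean((reference - np.roll(frame, (dy, dx), axis=(0, 1))) ** 2)
            if err < best_err:
                best_err, best_shift = err, (dy, dx)
    return np.roll(frame, best_shift, axis=(0, 1))

def merge_burst(frames: list) -> np.ndarray:
    reference = frames[0]
    aligned = [reference] + [align(reference, f) for f in frames[1:]]
    # Averaging N aligned short exposures reduces noise roughly by sqrt(N),
    # while each individual frame stays short enough to avoid motion blur.
    return np.mean(aligned, axis=0)

burst = [np.random.rand(64, 64) for _ in range(8)]   # stand-in for 8 short exposures
print(merge_burst(burst).shape)                       # (64, 64)
```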
Just as important as capturing great photos is the speed of the camera app. We know this is really important. We don’t want you to miss the moment while your camera gets itself ready. And I’m really proud to say that our camera has a shorter capture time than any smartphone camera we’ve tested, which makes action shots like this possible.
And finally, my favorite feature, our incredible video stabilization. It means that videos turn out smooth even if you’re not. So let me tell you how this video was captured. We mounted two cameras side by side, hit record on each and then started walking. The one on the left has video stabilization turned off, the one on the right has it on. Just look at the difference. This works by sampling the gyroscope at 200 times/second to figure out exactly how the camera is moving, even accounting for the rolling shutter and instantaneously compensating in each part of the image so you can avoid that “jello” effect you see with other forms of image stabilization.
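To make the gyroscope-based approach concrete, here is a small sketch with assumed parameters: integrate high-rate gyro readings into a camera orientation, smooth that path, and counter-rotate each frame by the difference. Per-row rolling-shutter correction, which the real system also handles, is omitted for brevity.

```python
# Toy gyro-based stabilization: smooth the camera's orientation path and compute
# the per-frame counter-rotation. Single-axis (yaw) only; rolling shutter ignored.
import numpy as np

GYRO_RATE = 200          # gyro samples per second, per the talk
FRAME_RATE = 30          # video frames per second (assumed)

def integrate_angle(gyro_yaw_rate: np.ndarray) -> np.ndarray:
    """Turn angular-velocity samples (rad/s) into a camera yaw angle over time."""
    return np.cumsum(gyro_yaw_rate) / GYRO_RATE

def stabilizing_corrections(angle: np.ndarray, window: int = 15) -> np.ndarray:
    # The smoothed path is how the video *should* move; the correction applied to each
    # frame is the smoothed orientation minus the actual orientation at that frame's time.
    kernel = np.ones(window) / window
    smooth = np.convolve(angle, kernel, mode="same")
    frame_idx = np.arange(0, len(angle), GYRO_RATE // FRAME_RATE)
    return (smooth - angle)[frame_idx]

gyro = 0.02 * np.random.randn(2 * GYRO_RATE)               # 2 seconds of shaky yaw readings
print(stabilizing_corrections(integrate_angle(gyro))[:5])  # per-frame counter-rotations (rad)
```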
Of course, Pixel comes with Google Photos built in. So once you’ve captured the moment with Pixel’s powerful camera, Google Photos helps you store, organize, and share all your photos no matter how goofy they might be. We think people are going to use this camera a lot. So as a special bonus for Pixel owners, we’re including free unlimited storage for photos and videos, at original quality. That’s all your photos and videos at full resolution including the rich 4K videos you can take with this phone. So, with Pixel, you’ll never run out of space for your memories. And you can say goodbye to those painful ‘storage full’ pop-ups.
So that was a quick intro to the Google Assistant on Pixel and our new best-in-class camera integrated with Google Photos. Now I’d like to invite Sabrina up to share some more great features of Pixel. But before that, let’s watch this new spot from our upcoming campaign.
[Video Clip]
Sabrina Ellis – Director of Product Management at Google
Hi, I’m Sabrina, and I work with Brian on the Pixel Product Management team. At Google, we’ve always taken pride in creating things that are both smart and simple. So, we’ve thought a lot about all the different features that make your phone work for you. I can’t wait to share just a few of them with you now.
First, let’s talk about communications. Whether your friends are on Android or iOS, Google Duo, our new video calling app, lets you jump into a call with just a single tap. And it comes pre-installed on Pixel. My favorite feature in Duo is called Knock Knock. It shows you a live video stream of the person calling you before you pick up, so you can see who’s calling and even what they’re doing. As you can see from this picture, my son likes to have fun with it.
Now, let’s talk about something I know you all really care about — the battery. We spent a lot of time optimizing Pixel to be smart about improving battery life. We made sure you can easily get through your day. But, for the times when you need a quick charge — like when you’re about to head out to dinner and you realize your phone has literally no power left — Pixel can get you up to 7 hours of power with just 15 minutes of charging, so you can be on your way.
And of course, Pixel ships with the newest Android operating system, Nougat. Pixel users will get system and security updates as soon as they’re available, directly from Google. We’ve also made the update process easier. When a new update is available, it will be downloaded and installed in the background. This means, the next time you restart your phone, you’ll instantly be using the new version. We just take care of it for you. Gone are the days of staring at a progress bar while waiting for an update to install.
We also really want to make sure you feel supported when you’re using Pixel. So, we’ve built 24/7 live customer care right into the phone. You can reach a Google support agent over phone or chat. And to make it easier for them to understand and help solve the problem, we’ve added a screenshare option so you can let the agent see what you’re seeing.
Now that we’ve given you all these reasons to switch to Pixel, we’ve also made it super simple to transfer over everything that you really care about. We know historically it’s been hard to switch from one OS to another so we built a new tool to let you quickly and easily transfer your contacts, photos, videos, music, texts, calendar appointments and your iMessages. We even put a Quick Switch Adapter in every box.
We’re also excited to introduce a brand new range of cases and accessories for Pixel. These include a new Artworks collection of our Live Cases with custom designs from top artists such as FAILE, and photographers, like Gray Malin and Chris Hadfield.
Finally, Pixel comes in two sizes: a 5-inch and a 5.5-inch display. All the great features you’ve heard about today work on both. And both of these sizes come with all of these amazing hardware specs.
Pixel is available in 3 colors, descriptively named: Quite Black, Very Silver, and a limited edition, Really Blue.
So, let’s recap the highlights. Pixel is the first phone to ship with the Google Assistant built in. It has the best smartphone camera. It includes unlimited storage for all your photos and videos. It comes with Duo pre-installed. Plus, it’s the first phone to be Daydream VR ready. You’ll be hearing more about VR shortly.
Now, where can you get this stuff? Here in the U.S. we’re teaming up exclusively with Verizon to bring Pixel to the market. We’re proud to work with them again. We’re also excited to be working with many international partners to bring Pixel to the world. Additionally, Pixel will be available, unlocked, on the Google Store. And for you Project Fi fans out there, you’ll be happy to know that Pixel is the latest device to work on the Project Fi network.
Pixel starts at $649 or $27 a month on the Google Store here in the US. It’s available for pre-order starting today in the US, Australia, Canada, Germany and the UK. And in India starting on October 13.
So that’s the new Pixel, the first phone made by Google, inside and out.
Next up is Clay who will talk to you about VR. But, before that, here’s a recap of all the newness of Pixel.
[Video Clip]
Clay Bavor – Head of VR at Google
Hi, I’m Clay. I lead the virtual reality team at Google and I am really excited to get to show you some of the things that we’ve been working on. Just to say it: we love Virtual Reality. For us it’s not just a technology, it’s not just another screen. It’s something that we believe is going to be important, because unlike anything else we’ve seen, it can put you some place else. It’s transporting. With VR, you can put on some goggles and feel, viscerally, like you’re in another world. It’s richer, it’s far more immersive, and we believe it will impact how we explore, how we work, how we play, how we learn.
Daydream View
But to create this sense of immersion takes some powerful technology, and my team’s goal is to make that technology simpler, friendlier, and more accessible. And we’ve been hard at work doing just that with Daydream, our platform for high quality mobile virtual reality. Daydream ties together a bunch of things that you need for great VR: Software optimizations, specs for phones, headsets and controllers and then the apps and experiences themselves — all of this in service of creating a healthy VR ecosystem with our partners, and with developers. And today, we’ve got news in all of these areas.
Let’s start with phones. So you just heard about the new Pixel phones, and they’re great phones. And, as Sabrina said, we’ve made them great for VR, too. We’ve tuned everything from their sensors to their displays. And the Pixel phones are the first which are Daydream-ready, as we call it.
Now, of course, you need some extra hardware to unlock the phone’s VR capabilities. So let’s talk about the VR headset. Now the headset is important to get right. I mean, after all it’s something that you wear on your head. But, we looked out there, and we saw some problems. We saw issues with comfort. We saw stuff that’s pretty hard to use, pretty complicated. And, everything just kind of looked the same. So we looked at all this, and we had some ideas. And, we have a bit of a different take on the VR headset. And so I’d like to introduce you to Daydream View. It’s the first Daydream-ready VR headset.
Now if you are into VR, I’ll just say that the specs are there: a nice field of view and, with a Daydream phone, low latency and really accurate head tracking. But we didn’t just look at the specs, we obsessed over the details of the design. We wanted to make something that’s comfortable, and really easy to use. And we also thought about how you could make it your own.
So first, let’s talk about comfort: how it feels, how it fits. You’ll notice immediately that it doesn’t quite look like other VR headsets. And that’s because, in designing it, we weren’t inspired by gadgets. We looked at what people actually wear. We wear stuff that’s soft, stuff that’s flexible, and breathable. So we crafted our headset out of that same comfortable stuff: fabrics, soft micro-fiber, and other materials that you’d find in clothing and athletic wear. In fact, we worked with clothing designers and makers to get the design just right. And the result is something that’s soft and cozy, and feels great to wear. So the materials are really nice.
It’s also lightweight – 30% lighter than similar devices. And one other thing. To see comfortably, some people need some help. People like me. So we made sure that the headset fits nicely over eye-glasses. So, that’s comfort.
Let’s talk about making it easy to use. To start, it’s got to be easy to get into VR. You don’t want to think about cables, and wires, and lots of clicky things and connectors that you have to get just right. All that stuff just kind of gets in the way. Really, you just want to pop your phone in and be in VR. And that’s the way Daydream View works. You open the latch, you drop your phone in, you close the latch and you’re ready to go. That’s it. The headset and phone say hello, wirelessly. So there’s nothing to connect. And the headset includes an auto alignment system that gets everything just right. So getting into VR is easy.
We wanted interacting with VR to be easy, too. And that’s where the Daydream controller comes in. Now the controller is really important, because when you go someplace, you want to be able to do things there. So let’s have a look. That’s really easy to use. At the top is a clickable touchpad, a couple of buttons but there is more to it than meets the eye. Hidden inside are a bunch of sensors that respond to how you move. So you can point, you can swing, you can aim. It’s so precise that you can draw with it. You can write your name. So it’s really powerful, and we’ll have a look at some of the things you can do with it in just a second.
Now, what do you do with a controller after you use it? You lose it. You lose it in the couch or in the bottom of your bag. We didn’t want that to happen. So the controller has a home inside the headset. When you’re done, it just sort of snuggles in there, like this. It’s very well built. And it’s details like these that we’ve worked hard to get right, little things that make the whole experience easier and more seamless.
So let’s talk about the last problem. Everything kind of being the same. We wanted you to be able to make it yours. And that starts with phones. So the Pixel phones are the first Daydream-ready phones again, and the headset obviously works great with them. But there are a lot of other Daydream-ready phones on the way from our partners, too, and the headset will work with them as well.
And one last thing. Things you wear don’t come in just one color. And we didn’t see why a VR headset should either. So with Daydream View, you’ll be able to choose your color. At launch we’ll have Slate. Later this year, we’ll add two other colors: Snow and Crimson. So that’s Daydream View – a comfortable easy to use VR headset that you can make your own.
Now, of course, what really matters here is what you can do. There’s a lot to show you, and for that, I’d like to invite up Adrienne McCallister, our Director of Partnerships. Adrienne?
Adrienne McCallister – Google Director of Global Partnerships for VR/AR
Thanks Clay. Hi everyone. My name is Adrienne and I lead VR partnerships for Daydream. The team has been working closely with our partners to bring some incredibly immersive experiences to Daydream. And, I’d like to share just a few of them with you today.
With Daydream you’re going to be able to explore some really magical places. And what’s more magical than the Wizarding World of JK Rowling? I’m excited to announce that we’ve been working with Warner Brothers to bring an exclusive Fantastic Beasts and Where to Find Them experience to Daydream.
[Video Clip]
In it, you are a wizard and the controller transforms into a wand that you can use to levitate objects and cast spells. We’re really stoked about this because, don’t we all just want to be wizards? We’re also really excited to bring some great educational experiences to Daydream. One out-of-this-world example: we have been working with the makers of Star Chart, an app that lets you explore the solar system and learn all about the constellations. With Star Chart on Daydream, it’s like having your own personal planetarium where you can fly through the stars and explore new galaxies.
The stars are also an epic place for space battles and Daydream is going to be a stellar place for games. We’re excited about Gunjack 2 from CCP, the makers of EVE. In it, you’re in the cockpit of a spaceship where you’re defending the fleet against alien ships. You can look all around you, and then use the controller to aim the ship’s lasers anywhere you want to blast those alien ships.
Back on earth, for when you want to kick back on the couch and watch something — we’ve been working with the likes of Netflix, HBO, and Hulu to bring their entire entertainment libraries to Daydream. In VR, you can see it all on a big screen, and bring that big screen with you wherever you go.
When you want to turn your attention to current events, The New York Times has been doing important work in VR documentary and news. They’ve told us stories that put us alongside soldiers, and refugees. In VR you see the world from their perspective. It’s powerful journalism brought into virtual reality. These are just a handful of the experiences that are going to be available. Over 50 partners are bringing apps and games to Daydream before the end of the year, and there are hundreds more on the way.
To complement our partners, we’re also bringing “The Best of Google” to Daydream. First, there’s Google Play Movies and their library of shows and films which you can watch on your own big-screen. Then there’s Google Photos, where you can relive your personal memories in a completely immersive way. And then there’s Street View, and YouTube. Let’s take a look.
With Street View in Daydream, you can visit thousands of places in 70 different countries. And we’ve built 150 curated tours of the world’s most amazing places, so you can feel what it’s like to tour the Taj Mahal. There are also some hidden gems in Street View. For instance, you can visit the Faroe Islands and see their beautiful rolling fields from the perspective of a sheep, or as our Faroe Island friends like to refer to it: Sheep View.
And finally there’s YouTube. YouTube on Daydream is amazing. To start, you can access the entire library of YouTube videos — regular videos — and watch them on a cinema-sized screen. Your favorites on YouTube have never looked better or bigger. But where the YouTube experience really shines is with 360 and VR videos, where you’re not just watching a film, you’re in the film. Here is one of my favorites from the London Museum of Natural History. You’re standing in the hall of the museum, looking all around and all of a sudden dinosaurs come back to life right in front of you. I don’t think he’s done yet. In virtual reality he looks even hungrier. It’s just one of the hundreds of thousands of immersive videos on YouTube, and we’re working with YouTube’s creators to bring even more original VR content in the months to come. So, that’s just a glimpse of what’s coming to Daydream from our partners and from Google.
With that I’d like to turn it back to Clay. Thank you.
Clay Bavor – Head of VR at Google
So, that’s the update on Daydream – the first Daydream-ready phones, with the Pixel. A headset that’s really comfortable. A controller that’s powerful and easy to use. And so much to experience. Now all of this will come together in November, when Daydream View and the controller go on sale, together, for $79. So there it is. Our next step in making high quality VR accessible to everyone. We’re really excited about it.
With that, I’d like to turn it over to Mario. Thank you so much.
Mario Queiroz – Google Vice President of Product Management
Hello. I’m Mario. And I lead our product management group within hardware. Let’s talk now about the home. All of us are connected in our homes, more and more through lots of technology, and there’s a lot of opportunity for improving those experiences. Today we are announcing some exciting products as part of our vision for the home, including connectivity, entertainment, and the Google Assistant.
Google Wi-Fi
A great online experience begins with great connectivity. Unfortunately traditional routers were not designed for the way we use Wi-Fi in the home today. We’re streaming, gaming, video-chatting, and more, all throughout the house. Last year, we introduced the OnHub router, with TP-Link and Asus as partners, to make connectivity in the home a lot better. We’re continuing to build on this technology for the benefit of the partner ecosystem, including ISPs. And we’ve integrated the latest advances into our own Wi-Fi product.
Today we are excited to announce Google Wi-Fi designed to support the new ways we use Wi-Fi. First, it’s an expandable system to give you much better coverage. Unlike a single central router, multiple Google Wi-Fi points work together to do a much better job of delivering fast throughput to every corner of the home. It’s modular, so you can get the right fit for your home’s shape and size. Simply add Wi-Fi points to expand coverage.
We’ve also designed the hardware to be visually subtle and fit in anywhere in the home, without compromising aesthetics. On top of great coverage we’ve built a feature set called Network Assist that will actively manage and optimize your network, behind the scenes, so you don’t have to figure out how to adjust your router. Network Assist keeps your signal strong, even as you roam throughout the house, by intelligently transitioning you to the best Wi-Fi point and placing you on the right Wi-Fi channel to avoid congestion.
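As a rough illustration of the kind of selection logic Network Assist is described as doing, here is a toy sketch that picks the strongest Wi-Fi point for a client and the least-congested channel. This is purely illustrative, not Google’s implementation; all names and numbers are assumptions.

```python
# Toy sketch of the two decisions described above: which Wi-Fi point a client should
# use (strongest signal) and which channel to run on (least congestion).
from dataclasses import dataclass

@dataclass
class WifiPoint:
    name: str
    rssi_dbm: int            # signal strength seen by the client (higher is better)

def best_point(points):
    # Steer the client toward the point with the strongest signal at its location.
    return max(points, key=lambda p: p.rssi_dbm)

def best_channel(utilization):
    # Pick the channel with the lowest observed airtime utilization.
    return min(utilization, key=utilization.get)

points = [WifiPoint("living-room", -48), WifiPoint("hallway", -63), WifiPoint("bedroom", -71)]
channels = {1: 0.62, 6: 0.18, 11: 0.35}   # fraction of airtime already in use (made up)
print(best_point(points).name)            # living-room
print(best_channel(channels))             # 6
```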
For those times when you do want control of your Wi-Fi network, we make it easy through a companion app. So at my house, for example, it’s sometimes difficult to get my teenage daughters to pause YouTube or Netflix on their computers and come to the dinner table. Well, Google Wi-Fi lets you manage the Wi-Fi access of your kids. If dinner is on the table, you don’t have to shut down the router. Just go to the app and simply press pause on their devices for some quality family time.
Google Wi-Fi is going to be available for pre-orders in November and it’s going to ship in early December. Google Wi-Fi is going to retail for $129 for a single-pack. And for larger homes, we are offering a 3-pack for $299. We believe that Google Wi-Fi is going to make a big and immediate improvement in internet connectivity in your home.
Chromecast Ultra
Our second product announcement today as part of our vision for the home is in the area of entertainment. We have some exciting news around Chromecast. Last year, we launched two new Chromecast products: the second-generation Chromecast and Chromecast Audio. The feedback was very very positive. Overall watch time has increased 160% since last year. We’ve now sold more than 30 million Chromecast devices, and we’re continuing to innovate on Chromecast to deliver the best streaming experience. So, today we are introducing Chromecast Ultra, our most premium TV streaming device to date. Chromecast Ultra brings everything you already love about Chromecast plus even crisper picture and better performance.
You will now be able to cast your favorite content in up to 4K Ultra HD resolution with HDR and Dolby Vision support. You can get the highest contrast ratio and deepest colors from your favorite TV shows on Netflix (mine happens to be Narcos), popular titles from Vudu, and YouTube’s extensive catalogue of 4K videos. And we’re happy to announce that Google Play Movies will be rolling out 4K content in November, supported on Chromecast Ultra. Your shows and movies are going to look really, really fantastic.
We’ve also significantly improved performance and reliability. Chromecast Ultra is 1.8 times faster to load content and includes major Wi-Fi improvements to support streams from full HD to Ultra HD without a hitch. Plus, Chromecast Ultra comes with an Ethernet port. It’s integrated into the power adapter so it’s simple to connect.
Just like Chromecast, Chromecast Ultra comes in a small, compact size, so it conveniently hides behind your TV. No extra cable, boxes, or clutter. And of course, Chromecast Ultra includes all of the great content already available on Chromecast. Just cast your favorite entertainment from your phone, tablet or laptop.
Google Home
Chromecast Ultra is going to be available in November for $69. It showcases the next generation of streaming technology. And that brings us to Google Home. At Google I/O, we gave a preview of where we plan on taking Google Home in the next few years. Today, we’re going to share with you the details of the product.
As we heard from Sundar, the Google Assistant works across many facets of your life, including at home. Our homes are special and different from other environments. It’s where we spend time with family and friends, it’s where we get a lot of stuff done, and it’s where we relax and unwind. Many times, you can’t or don’t want to reach for your phone — you might be cooking, doing homework with the kids, or simply sitting with friends at the dinner table. So, we’ve designed a product that works in a hands-free way and makes the Assistant available whenever you need it, simply by asking with your voice.
Google Home lets you enjoy music, get answers from Google, manage everyday tasks and control devices around your home. So here is Google Home. Our design was inspired by consumer products that are commonly found in homes, like wine glasses and candles. We went with a white top and a clean diagonal face for a neutral look that blends in. The top surface has LEDs that provide visual feedback when Google Home recognizes “OK Google”, so you know when it is listening. In those rare moments when voice won’t do, the top surface is also a capacitive touch panel. You can simply use your finger to pause the music, adjust the volume, or trigger the Google Assistant.
Google Home was designed with a microphone system for accurate far-field voice recognition. We’ve simulated hundreds of thousands of noisy environments and applied machine learning to recognize patterns that allow us to filter and separate speech from noise. This enables us to deliver best-in-class voice recognition and minimize error rates, even from across the room. The mic mute button gives you total control, so the product is only listening for the hotword when you want it to be. The speaker in Google Home sets the standard for audio quality in this new category. It contains a forward-facing high-excursion active driver with dual side-facing passive radiators. This design delivers a full-range, natural-sounding experience, with rich bass and clear highs.
Finally, we made it really simple to swap bases to customize your Google Home to match the style of any room in your home. All of these features add up to a beautiful and powerful product.
So now I’m going to invite Rishi Chandra on stage to show in more detail what Google Home can do.
Rishi Chandra – Google group product manager
All right. Thank you, Mario. I’m really excited to show you Google Home. It’s a little hard to replicate a home environment on the stage, so we’ve gone ahead and recorded real examples of Google Home in action. And afterwards, you will have a chance to experience it live upstairs in the demo stations.
OK, so let’s get going. First, Google Home helps you enjoy your music. When I get home, I love to kickback, relax and listen to some tunes and voice is a really simple way to get started. Let me show you. “OK Google, play Don’t Let Me Down by the Chainsmokers”
Google: Don’t Let Me Down by the Chainsmokers. Sure. Playing on YouTube.
Rishi Chandra: It’s really simple. And you’re going to love the sound quality in your own home. It’s just as easy to control the music with your voice: “OK Google, turn it down”. And the great thing is I never had to pull out a phone or deal with a clunky remote control. Just ask for your favorite song, album or artist, and it will play it. Now in this case, we’re using YouTube Music, which has a huge content catalogue, so you can play popular tracks, live concerts, cover songs and more. But Google Home supports many of the top music services, including Google Play Music, Spotify, Pandora, TuneIn, and soon iHeartRadio, so you have a lot of different content options.
For example, let’s say I’m in the mood for a little retro workout music. Well, with Google Play Music you can access their expert-curated playlists that fit any mood or activity. Let me show you. OK Google, play 80s workout music from Google Play Music.
Google: All right. Here is a Google Play Music station called Headbands and Leg Warmers: 80s Workout.
Rishi Chandra: This song might bring back a lot of memories for many of you. Now, it’s really easy to set a default music service as well. So if your favorite music service is Spotify, you don’t have to keep saying “on Spotify” over and over. We want Google Home to match your preferences. And what’s really unique about Google Home is that our music search is powered by Google. So if you don’t remember the exact name of that song, we can still help you. Check this out. OK Google, play that Shakira song from Zootopia.
Google: Try Everything by Shakira. Sure playing on YouTube.
Rishi Chandra: So that’s pretty cool. Even though I didn’t remember the name of the Shakira song, Google was able to infer it from the context of the movie Zootopia. This comes from years of investment in search. So now it is easier than ever to play the music you want to play on Google Home.
Now we want to make it easy to play all types of audio: music, news, radio stations, podcasts and more. So if you want to listen to your favorite sports podcast, like Pardon the Interruption, or news from NPR, all you need to do is ask for it. And of course, we also support the ability to cast audio to Google Home from your favorite Android or iOS devices, so your options for listening to music and more on the speaker are unlimited. Add it all up, and we think you’re going to love the combination of a great-sounding Wi-Fi speaker with voice control from the Google Assistant. OK. So that’s music.
Number two, Google Home helps you get answers from Google. Our mission has always been to organize the world’s information. And over the past few years, we have gone beyond the 10 blue links into providing direct answers to your questions. We do that through the Knowledge Graph, which as Sundar mentioned, has over 70 billion facts. And now with Google Home, we give you an easy way to access the knowledge of Google in a hands-free way. For example, let’s stay with the music theme for a second. OK Google, what is Adele’s real name?
Google: Adele’s full name is Adele Laurie Blue Adkins.
Rishi Chandra: OK Google, how many Grammys has she won?
Google: According to Wikipedia, with a total of 20 awards and 53 nominations from the Grammy awards for her music, she is the most nominated woman in Grammy history.
Rishi Chandra: That’s a lot of Grammys.
Now you may have noticed that the last answer didn’t come from Google itself, but instead came from Wikipedia. Our Knowledge Graph can cover a lot of information, but it can’t cover everything. So fortunately, we launched a feature several years ago called Featured Snippets. If we cannot find a direct answer, we try to find the website that has the best answer to your question, and we give you a short quote from it, so you can decide whether to read more and find out more later. It allows you to find answers to a much wider range of questions. So, for example, let’s say I want to confirm something before the kids spoil the dog with treats. OK Google, can dogs eat chocolate?
Google: According to Hill’s Pet Nutrition, humans easily metabolize theobromine, but dogs process it much more slowly, allowing it to build up to toxic levels in their system.
Rishi Chandra: So that sounds like a no.
Or let’s say I accidentally spilled some wine. OK Google, how do you get wine stains out of a rug?
Google: According to Patch, mix one-quarter teaspoon of liquid dishwashing detergent with four cups of lukewarm water and pour a small amount of the solution on the stained carpet, work the solution into the stain using a gentle blotting motion, then use water and a clean cloth to blot the spill, and it should remove the rest of the stain.
Rishi Chandra: Well, that was a very thorough answer. So this is really, really hard, and it’s only something Google can do, and we are constantly improving it. And as search gets better, Google Home automatically gets better.
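A minimal sketch of the answer flow Rishi describes: try a direct fact from a knowledge graph first, and if nothing is found, fall back to quoting a short snippet from the best-matching page. The data and helper names below are illustrative assumptions, not Google’s actual systems.

```python
# Illustrative answer flow: direct knowledge-graph fact first, featured-snippet fallback second.
# Both "databases" are toy stand-ins.
KNOWLEDGE_GRAPH = {
    "what is adele's real name": "Adele's full name is Adele Laurie Blue Adkins.",
}

SNIPPET_INDEX = {
    "can dogs eat chocolate": (
        "Hill's Pet Nutrition",
        "humans easily metabolize theobromine, but dogs process it much more slowly, "
        "allowing it to build up to toxic levels in their system.",
    ),
}

def answer(query: str) -> str:
    key = query.lower().rstrip("?")
    if key in KNOWLEDGE_GRAPH:                 # direct answer from structured facts
        return KNOWLEDGE_GRAPH[key]
    if key in SNIPPET_INDEX:                   # otherwise quote the best page we found
        source, quote = SNIPPET_INDEX[key]
        return f"According to {source}, {quote}"
    return "Sorry, I don't have an answer for that yet."

print(answer("Can dogs eat chocolate?"))
```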
And the great thing is we can also bring Google’s understanding of local and traffic information, so you can more easily plan your day and schedule. So let’s say I’m planning a backpacking trip to Costa Rica and I need to get some camping gear. Well, I can ask Google Home for some help. OK Google, what are some nearby camping stores?
Google: I found a few places within 8.6 miles. The first one is The North Face on Post Street in San Francisco. The second one is Marmot on Post Street in San Francisco. The third one is REI on Brannan Street in San Francisco.
Rishi Chandra: OK, let’s say I am going to go check out REI. OK, Google, how’s the traffic looking to REI?
Google: On your way to REI, there is the usual traffic. You might want to take the route via 10th Street since it’s slightly faster. It is 21 minutes by car.
Rishi Chandra: OK Google, how long will it take to bike there?
Google: Biking to REI will take you 24 minutes.
Rishi Chandra: I think I will go ahead and bike there.
This is all built on years of investment in Google Maps. And over time, we’re going to make it easy to send the directions right to your phone so you can pick up right where you left off. And of course, Google can teach me the most important thing as I prepare for my trip to Costa Rica. OK Google, how do I say “I would like a beer, please” in Spanish?
Google: [Spanish]
Rishi Chandra: I think I will be using that a lot and I’m not sure I can say it the same way. The Google Assistant is there to help you when you need it. Finally, you have an assistant that can bring the knowledge of Google, just by asking for it.
Third, Google Home helps you manage everyday tasks. Home is where we love to relax, but also where we want to get things done. Now, we want Google Home to be able to help you stay on top of your busy schedule. So we designed a feature called “My Day” that, with your permission, summarizes important topics and activities for you in a really simple way. It’s a great thing to try with your morning coffee. Let me show you. OK Google, good morning.
Google: Good morning, Rishi. It is 7:32 AM. The weather in San Francisco currently is 59 degrees and cloudy with a high of 65 degrees. Your commute to work is currently 59 minutes with moderate traffic if you take US 101 South. Today at 5:00 pm, you have Bollywood hip-hop dance class.
Rishi Chandra: Don’t judge me.
Google: Remember to cook dinner tonight for the kids. Have a good one.
Rishi Chandra: It's really great. If the President can get a daily briefing, why shouldn't you? And now you can plan your day without skipping a beat. Google Home also excels at assisting you with common tasks in the home. So for example, when planning your dinner, you can easily add things to your shopping list, get nutrition information on ingredients, and set up multiple timers to manage your cooking times, all with your voice. OK Google, can you add kids' vitamins to my shopping list?
Google: Okay. I have added kids’ vitamins to your shopping list.
Rishi Chandra: And now when I am at the grocery store, I can just pick up my Pixel and quickly access the shopping list. OK Google, show me my shopping list?
Google: Here you go.
Rishi Chandra: It's one assistant across your devices. Now of course, helping with everyday tasks goes beyond just the kitchen. We want to help you in all aspects of your home life. For example, Google Home can even be the ultimate arbitrator in all those family debates. OK Google, flip a coin.
Google: Okay. It’s tails.
Rishi Chandra: We want to go beyond only what Google can do and also allow you to use your voice to interact with your favorite apps and services. We'll have more to share about how partners can integrate with the Google Assistant a little later today.
Fourth, Google Home helps you control devices at home. As homes get more connected, we think voice is going to become a standard way to interact with your home devices. It's just really easy. So for those with smart home devices, we have partnered with Nest, SmartThings, Philips Hue, and IFTTT to voice-enable your lights, switches, and thermostats. You can control individual devices, rooms, or your entire house, and we will be constantly adding more partners over time.
We want to make sure we support as many smart home devices as possible. Fortunately, if you are one of the 30 million people who have purchased a Chromecast or Chromecast Audio, you can now use Google Home to voice control your TVs and speakers. It’s awesome. So let me show it to you. OK Google, play Katy Perry’s latest music video on my TV.
Google: All right, Rise from YouTube playing on Chromecast.
Rishi Chandra: It can completely change how you watch television. When you know what you want, just ask for it. And this is truly hands free voice control – there is no remote, no voice button that you have to go press, no phones needed. You just need to have a Chromecast and Google Home will automatically work with it. You can also use your voice to control playback. OK Google, pause. And if you need to, you can still pick up your Android or iOS device and continue to control it or update the playlist. It is all synchronized across your devices using Google Cast. And now you have a totally new way to experience YouTube on the TV. It’s a lot of fun. OK Google, watch videos of John Oliver on my TV.
Google: Sure, playing John Oliver from YouTube on Chromecast. Hello there, I'm John Oliver, host of that show you have heard some things about but haven't gotten around to watching yet. You think it's called Last Night This Week, or maybe Yesterday Right Now. Don't worry, you'll Google it later.
Rishi Chandra: I love that joke. Now we’re initially launching this feature with YouTube. It’s early days, and we’re going to continue to improve it over time as we add more partners to support voice casting. In fact, we are excited to announce that Netflix will soon support voice casting via Google Home. So you will soon be able to say: OK Google, watch Stranger Things on my TV.
Google: OK, Stranger Things playing from Netflix on Chromecast.
Rishi Chandra: It’s really really easy. It’s never been easier to binge watch your favorite Netflix shows. But why stop at just music and video? We are also working with Google Photos to make it easy to show your favorite photos on the TV. OK Google, show my photos from Sarah and Ajay’s Wedding on my TV.
Google: Showing your photos from Sarah and Ajay's wedding on your Chromecast.
Rishi Chandra: It just works. Hands-free voice control, combined with Google Photos' awesome image search capabilities, means you can literally throw any photo onto the TV in a really easy way. OK Google, show my photos of dancing on my TV.
Google: Showing your photos of dancing on your Chromecast.
Rishi Chandra: Now you know why I was training with that Bollywood dance class. It's really easy, and we think you're going to love using this. We also support voice casting to your speakers. So if you already have a high-end audio system, just plug in a $35 Chromecast Audio and Google Home will be able to voice control your speaker system without any wires or cables. And all of these features will also work with any Google Cast-enabled TV, like Android TVs, or Google Cast-enabled speakers from our OEM partners, so your options will only expand over time. Once you see how easy it is to use Google Home, we think you're going to want to put one in every room of your house. So we made sure to design Google Home devices so that they work better together. For example, we have enabled multi-room audio support across Google Home devices. This allows you to create a group and play the same song across the devices at the exact same time, so now you can fill your entire house with amazing audio. You can even add a Chromecast Audio or Cast speakers to the group.
We also need to be smarter about how we respond if you have multiple Google Homes in your house. You don't want all of the devices responding to you at the exact same time. So we designed the Google Assistant to be context aware: only the device that hears you best will respond. It will even respond intelligently across your Android devices and your Google Home devices, so you don't have to worry about it. Just ask your question and the right device will respond.
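To illustrate the idea of responding only from the device that hears you best, here is a minimal sketch. It is not Google's actual arbitration logic; the device names and scores are made up, and the "signal score" is a stand-in for whatever hotword-confidence measure a real system would use.

```python
# Illustrative sketch only, not Google's actual arbitration logic.
# When several devices hear the same "OK Google", pick the one with the
# strongest hotword score (a stand-in for "the device that hears you best")
# and have only that device respond.

def pick_responder(hotword_scores: dict) -> str:
    """Return the name of the device that should answer; the rest stay silent."""
    return max(hotword_scores, key=hotword_scores.get)

# Hypothetical scores from three devices that all heard the query.
scores = {"kitchen Home": 0.91, "living room Home": 0.74, "Pixel phone": 0.55}
print(pick_responder(scores))  # -> kitchen Home
```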
So that's a quick summary of Google Home: an amazing Wi-Fi speaker with the Google Assistant built right in. And as the Assistant gets continuously better, so does Google Home.
Now let's talk pricing and availability. Google Home will be available for $129. And to make sure you have a chance to experience Google Home at its full potential, we will be offering a free six-month trial of YouTube Red, so you get access to YouTube Music, Google Play Music, and ad-free YouTube. Google Home will be available for pre-order starting today in the US from the Google Store, Best Buy, Walmart, and Target, and it will ship and be available in retail stores on November 4.
For the customizable bases, we have six really exciting options to choose from. Our vibrant fabric bases come in three incredibly fun colors: Mango, Marine, and Violet. Our sleek metal bases come in Carbon, Snow, and Copper. We can't wait for you to try them out.
Next up is Scott Huffman, who is going to talk to you about how partners can integrate with the Google Assistant. Before that, I want to debut our new Google Home spot. Enjoy!
[Video Clip]
Scott Huffman – Engineering Director at Google
Hi everyone. I’m Scott Huffman. And I lead Engineering for the Google Assistant. Today you’ve heard how the Google Assistant lets you have a conversation with Google. It helps you get things done across all the devices and contexts in your life.
Google has always been about helping people, and people come to Google many times a day when they need assistance. From checking the weather to researching medical information, people come to Google for help with every imaginable task. So really, the Google Assistant is a continuation of the company’s focus on helping users. It’s what we’ve always been about, and it’s what people everywhere know us for.
But to do this really well, the Assistant needs to work with many partners, across many kinds of devices and contexts. Going back to Google’s earliest days, we’ve always worked hard to create healthy, open platforms. Search, of course, is the first example. Building on top of the web, Search helps users find great content and it helps publishers find their audience. We’ve also built vibrant platforms in the Ads space, in Maps, for Android and Play, YouTube and many other areas. These open platforms are useful for people everywhere and they provide a lot of value for partners. The Google Assistant will be our next thriving, open ecosystem.
Of course, many of Google’s existing partners will automatically be able to engage people through the Assistant. Right out of the gate, the Assistant will be able to find information about local businesses, surface great content from YouTube creators, and recommend web content from publishers. It can even offer help in the form of snippets from trusted websites with clear attribution, and deep links into the apps I have on my phone, like you saw in Brian’s demo of the Google Assistant on Pixel.
So think about that for a moment. On day one, the Google Assistant can bring users together with partners that can help them. But in many cases, I’m not just looking for information. I know what I want to get done, and I just want my Assistant to do it for me. Today I want to give you a sneak peek at the open developer platform that will let anyone build for the Google Assistant, and we’re planning to launch this in early December.
Actions on Google will let many kinds of partners complete tasks for users through the Google Assistant. And there will be two kinds of actions: Direct Actions and Conversation Actions. Sometimes a person’s request is so straightforward that the Assistant can just trigger the right partner action instantly. We call these Direct Actions and you saw some of these with Google Home earlier.
Direct Actions are great for things like home automation, media requests, and communications. When I say “Turn on the living room lights”, the Philips Hue or SmartThings lights should just come right on. When I say “Play my dinner party playlist on Spotify,” music should fill the room. Direct Actions expand on how partners have already created Voice Actions for Android.
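As a rough illustration of the Direct Actions idea, here is a minimal sketch. It is not the real Actions on Google API; the types and function names (DirectActionRequest, PartnerCommand, handle_direct_action) are hypothetical, but it shows the one-shot mapping from a parsed voice intent straight to a partner command, with no further dialog.

```python
# Illustrative sketch only, not the real Actions on Google API.
# A "Direct Action": one utterance, already parsed into an intent, is mapped
# straight to a partner command with no back and forth.

from dataclasses import dataclass
from typing import Optional

@dataclass
class DirectActionRequest:
    intent: str                 # e.g. "home.lights.on"
    room: Optional[str] = None  # e.g. "living room"

@dataclass
class PartnerCommand:
    partner: str                # e.g. "philips_hue" or "smartthings"
    device: str
    action: str

def handle_direct_action(req: DirectActionRequest) -> PartnerCommand:
    """Map a parsed voice intent directly to a single partner command."""
    if req.intent == "home.lights.on":
        return PartnerCommand(partner="philips_hue",
                              device=f"{req.room or 'all'} lights",
                              action="turn_on")
    raise ValueError(f"unsupported intent: {req.intent}")

# "OK Google, turn on the living room lights"
print(handle_direct_action(DirectActionRequest("home.lights.on", "living room")))
```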
But some things that I want to do take a little more discussion. Even something as simple as booking an Uber takes some back and forth. Conversation Actions will be for these kinds of tasks. When I say "I need an Uber," my Assistant will be able to bring Uber right into the conversation. Then Uber could say, "Where would you like to go?" and I can respond with my destination. Then Uber might ask, "Would you like an UberX again?" and maybe I'll say, "Well, this time we need an UberXL." Once the ride is confirmed, Uber can say, "Your driver is Betsy, and she'll arrive in 3 minutes in a black Chevy Suburban." We want to make it really easy so that everyone, from developers to local businesses, can create these kinds of conversations.
Partners will be able to just tell us what kinds of requests they can handle, from ordering groceries to playing a game to anything in between, and then build out their conversational experiences. Thousands of expert and novice developers have already built conversational interactions with API.AI, and these can become Conversation Actions for the Google Assistant. We'll support other conversation-building tools as well.
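To make the multi-turn Conversation Action flow concrete, here is a minimal sketch under the assumption that the dialog can be modeled as a sequence of prompts and replies. It is not API.AI's actual webhook format; the ride_conversation generator and its prompts are hypothetical, echoing the Uber example above.

```python
# Illustrative sketch only, not API.AI's actual webhook format.
# A "Conversation Action" keeps asking follow-up questions until it can
# fulfill the request, like the Uber example described in the talk.

def ride_conversation():
    """Generator-based dialog: each yield is a prompt, each send() is a user reply."""
    destination = yield "Where would you like to go?"
    ride_type = yield "Would you like an UberX again?"
    yield f"Booking an {ride_type} to {destination}. Your driver will arrive shortly."

# Simulate the back-and-forth.
dialog = ride_conversation()
print(next(dialog))                       # Where would you like to go?
print(dialog.send("the Ferry Building"))  # Would you like an UberX again?
print(dialog.send("UberXL"))              # Booking an UberXL to the Ferry Building. ...
```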
The real beauty of Actions on Google is how they'll scale. Further into the future, they'll work in purely voice-enabled experiences, in text-based experiences, and on hybrid interfaces. This will let partners reach people through the Google Assistant everywhere. Now of course, actions aren't very useful unless they're easy for users to find. People shouldn't need to pre-enable skills or install new apps. We think this is an important dimension. Just ask the Google Assistant for what you want, and Google will find the right kind of help for you.
We've been hard at work with many developers, publishers, and businesses that have helped us refine Actions on Google. So the partners shown behind me, across areas like home automation, music and entertainment, news, transportation, and many other categories, are building things for the Google Assistant. I'm really excited to share more about Actions on Google in December. But in the meantime, you can visit developers.google.com/actions to learn more or to sign up for news and updates.
Now, as we've been developing the Google Assistant, we've been asked by many device makers if they can incorporate the Google Assistant into their products. We imagine a future where the Assistant will be ready to help in any context, from any kind of device. So to make this a reality, we're developing the Embedded Google Assistant SDK. Whether you're tinkering with a Raspberry Pi in your basement or building a mass market consumer device, you'll be able to integrate the Google Assistant right into what you make. This SDK will launch next year, but we're already working to put the foundations in place, so I will have more to share on this in a few months.
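As a purely hypothetical sketch of what an embedded integration might look like, the loop below mirrors the hotword, query, and response cycle implied above. The SDK had not shipped at the time of this talk, so every function here (detect_hotword, record_query, assistant_respond, play) is an invented placeholder rather than a real API.

```python
# Purely hypothetical sketch; the Embedded Google Assistant SDK described
# above had not shipped at the time of this talk, so every function below is
# an invented placeholder. The loop shows the hotword -> query -> response
# cycle an embedded device (say, a Raspberry Pi build) might run.

def detect_hotword() -> bool:
    """Placeholder: a real build would run an on-device hotword detector."""
    return True

def record_query() -> bytes:
    """Placeholder: a real build would capture audio from the microphone."""
    return b"user audio"

def assistant_respond(audio: bytes) -> bytes:
    """Placeholder: a real build would stream the query to the Assistant service."""
    return b"spoken response audio"

def play(audio: bytes) -> None:
    """Placeholder: a real build would send audio to the speaker."""
    print(f"playing {len(audio)} bytes of audio")

def run_once() -> None:
    """One pass of the listen-ask-answer loop; a device would repeat this forever."""
    if detect_hotword():
        play(assistant_respond(record_query()))

run_once()
```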
Today marks an important moment for Google, it’s an inflection point created by incredible advances in machine learning, the power of the Knowledge Graph, our diverse ecosystems, and the magic that’s possible when the best software meets the best hardware. With that, let me invite Rick back to the stage.
Rick Osterloh – SVP, Hardware Group
Thanks, Scott. As you just heard, the Assistant will continue to get better over time. And as you saw today, the phone and Google Home are the first two devices that showcase the Google Assistant. We're deeply committed to building hardware that brings this vision to life. This is only the beginning. We can't wait to hear what you have to say about our newest devices, and if you want to know more, you can check it all out on our website.
And with that, I’d like to say thank you and good bye to everyone joining us on the live stream.