
Facebook CEO Mark Zuckerberg’s Keynote at F8 2017 Conference (Full Transcript)


Here is the full transcript of Facebook F8 2017 – the company’s annual developer conference keynote featuring Mark Zuckerberg, Mike Schroepfer, Deb Liu, Rachel Franklin, Ime Archibong, and David Marcus. This event occurred on April 18, 2017 at the McEnery Convention Center in San Jose, California.


Speakers at the Event:

Mark Zuckerberg – CEO, Facebook

Mike Schroepfer – CTO, Facebook

Deb Liu – Director of Platform Products, Facebook

Rachel Franklin – Head of Social VR, Facebook

Ime Archibong – VP of Product Partnerships, Facebook

David Marcus – VP, Messaging Products, Facebook

Mark Zuckerberg – CEO, Facebook

Hi everyone. Welcome to F8!

We are gathered here at the second biggest event called F8 this week, and we probably should’ve seen this one coming after Fast and Furious 7, and didn’t.

Now while we don’t have The Rock here today, we do have the tech equivalent: David “The Rock” Marcus! And while we may not live our lives a quarter mile at a time, I know at least some people here live their lives one quarterly earnings at a time.

All right. Bear with me, I’ve got one more for you. All right. While Fast and Furious’ tagline is “Never give up on family,” ours is similar: “Never give up on the family of apps.” All right. Not as catchy, not as catchy.

I could just keep going. I wrote like six more of these, but I understand that some of you are here to see a tech keynote. So let’s get to it.

So you may have noticed that we rolled out some cameras across our apps recently; that was Act 1. Photos and videos are becoming more central to how we share than text, so the camera needs to be more central than the text box in all of our apps.

So today we’re going to talk about Act 2: where we go from here. And it’s tied to this broader technological trend that we’ve talked about before: augmented reality. Now before we get into that, last month I wrote a letter on building community — I have it here. And it’s long, it’s like 6000 words and, you know, I’m not sure if all you guys got a chance to read every word of it, so I figured maybe we just start by reading it to you right now.

All right. In all seriousness, this is an important time to work on building community, and we live in a time when society is divided and we all have a lot of work to help bring people closer together. And when we talk about this divide, a lot of us talk about the economic issues. But I think a bigger part of the solution is social as well. We all get a lot of meaning from the communities we’re part of, and whether they’re companies or churches, sports teams or volunteer groups, they give us a sense of purpose. And this feeling that we’re a part of something bigger than ourselves, that we’re needed, and that we’re not alone. So these groups make up our social fabric, and that’s why it’s so striking that membership in all these groups has declined so much over the last few decades. Since the 1970s, membership in all kinds of different local groups has gone down by as much as a quarter; that’s a lot of people who now need to find a sense of purpose somewhere else.

For the past decade, Facebook has focused on connecting with friends and family. And now with that foundation, our next focus is building community. We’ve always done a lot of work to help people share and get a diversity of opinions out there. And we’re always going to do this. But now in addition, we’re also working on building common ground, not just getting more different opinions out there but also helping to bring people closer together. And there’s a lot to do here.

We have a full roadmap of products to help build groups and community, help build a more informed society, and help keep our community safe. And we have a lot more to do here; we were reminded of this this week by the tragedy in Cleveland. Our hearts go out to the family and friends of Robert Godwin Sr., and we will keep doing all we can to prevent tragedies like this from happening.

Now, since this is F8, our developer conference, today we’re going to focus on the technology that we’re building together for the long term. Because in the future, technology is going to keep making us more productive, and that’s going to change how we all work. It’s going to free us up to spend more time on the things we all care about, like enjoying and interacting with each other and expressing ourselves in new ways.

In the future, I think that more of us are going to contribute to culture and society in ways that are not measured by traditional economics and GDP. More of us are going to do what today is considered the arts, and that’s going to form the basis of a lot of our communities. So that’s why I’m so excited about augmented reality: it’s going to make it so that we can create all kinds of things that until today have only been possible in the digital world, and we’re going to be able to interact with them and explore them together.

So at last year’s F8, we talked about our 10-year roadmap to give everyone in the world the power to share anything they want with anyone. And one of the key long-term technologies that we talked about is augmented reality. Now, we all know where we want this to get eventually, right? We want glasses, or eventually contact lenses, that look and feel normal but that let us overlay all kinds of information and digital objects on top of the real world. So we can just be sitting here and we want to play chess; snap, here’s a chessboard and we can play together. Or you want to watch TV: we can put a digital TV on that wall, and instead of being a $500 piece of hardware, it’s a one-dollar app.

So think about how many of the things that we have in our lives actually don’t need to be physical, they can be digital, and think about how much better and more affordable and accessible they are going to be when they are. So think about going to Rome on vacation and having information about the Colosseum overlaid on the actual building or directions overlaid on the actual street. And think about if your daughter is a big Harry Potter fan, for her birthday, you can change your home into Hogwarts, although I bet some of you were hoping I had the toilet paper [money].

Now we’re all about extending the physical world online. When you become friends with someone on Facebook, your relationship gets stronger. When you join a community online, that physical community gets stronger. So augmented reality is going to help us mix the digital and the physical in all new ways and that’s going to make our physical reality better. So that’s why this is such an important trend.

Now when we talk about augmented reality, there are three important use cases that we think about: the ability to display information, like directions, or messages and notifications; the ability to add digital objects, like the chessboard or the TV screen I was talking about; and the ability to enhance existing objects, like your home or your face.

Now, I used to think that glasses were going to be the first mainstream augmented reality platform, and that we’d get them, you know, maybe five or ten years from now, when we get the form factor that we all want. But over the last couple of years, we’ve started to see primitive versions of each of these use cases on our phones and cameras. So for displaying information, we’ve all seen people take photos and write text on them, or circle things, or draw arrows to highlight information. For digital objects, we have games like Pokémon, where you can overlay a digital Pokémon on top of the real world in front of you. And for enhancements, we have things like face filters and style transfers to make our images and videos more fun.

Now, a lot of people look at this stuff and it seems so basic, right? And you ask, you know, maybe this is just what kids are into doing these days. But we look at this and we see something different: we see the beginning of a new platform. We’re not using primitive tools today because we prefer primitive tools. We’re using primitive tools because we’re still early in the journey of creating better ones. And in order to create better tools, first we need an open platform where any developer in the world can build for augmented reality without having to first build their own camera and get a lot of people to use it. But when you look around at all the different cameras that are out there today, no one has built a platform yet.

So today we’re going to start building this platform together, and we’re going to make the camera the first mainstream augmented reality platform. So if you take one thing away from today, this is it, right here: we’re making the camera the first augmented reality platform. So for those of you who saw us roll out cameras across all our apps and wondered what we might be doing, that was Act 1. This is Act 2: giving developers the power to build for augmented reality on the first augmented reality platform, the camera.

All right. Let’s take a look at what this is going to look like. You’re going to be able to swipe to the camera, and you’re going to start discovering effects that your friends are using and that are relevant to the place you’re at. And you’re going to be able to scroll through all the effects, and we have a lot of them.

Now, we’re going to start today with all the basic effects that you’re used to: face masks, art frames, style transfers. And since this is an open platform, you’re going to be able to create your own, and instead of having maybe ten or twenty options to choose from, you’re going to have thousands of options from creators all over the world, from all different kinds of cultures and backgrounds and styles. And this is launching in beta today. This is just the first step, though; we have a lot of crazier stuff coming soon that I want to show you.

Now, for real augmented reality, you don’t just want those tools. You also want realistic 3D objects, and in order to do that, you need a platform that gives them a precise location and a realistic relationship with the objects around them in their environment. So there’s an AI technique for doing this, called simultaneous localization and mapping, or SLAM, for those of you in the AI community.

And here’s how this works: you’re going to be able to easily create anything you want. You can write a fun message next to your breakfast and place it on the table. And since we understand the depth of the table, it shows up in the right place, and you’re going to be able to pan around and it’s going to maintain its position on the table exactly as if it were a real object in the world. To make this more fun, we’ll add some breakfast sharks swirling around my bowl, some clouds, and, you know, there you go. It’s got the depth right, so when they go behind the bowl they’re occluded, and it gets the depth of the table and all that.
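The “pinned to the table” behaviour described here comes down to tracking the camera’s pose and reprojecting a fixed world-space point every frame. Here is a rough, purely illustrative sketch of that idea (a toy pinhole camera in Python; the function names and parameters are invented for illustration and are not Facebook’s actual SLAM pipeline):

```python
import math

def project(point_world, cam_pos, cam_yaw, focal=500.0, cx=320.0, cy=240.0):
    """Project a fixed world-space point into the image plane of a camera
    at position cam_pos, rotated by cam_yaw about the vertical axis."""
    # Translate the point into the camera's frame, then undo the yaw.
    dx = point_world[0] - cam_pos[0]
    dy = point_world[1] - cam_pos[1]
    dz = point_world[2] - cam_pos[2]
    c, s = math.cos(-cam_yaw), math.sin(-cam_yaw)
    x = c * dx + s * dz
    z = -s * dx + c * dz
    if z <= 0:
        return None  # the point is behind the camera
    # Pinhole projection: pixel coordinates on the screen.
    return (cx + focal * x / z, cy + focal * dy / z)

# A virtual note "pinned" 2 metres in front of the starting camera pose.
note = (0.0, 0.0, 2.0)
print(project(note, (0.0, 0.0, 0.0), 0.0))  # centred in the frame
print(project(note, (0.5, 0.0, 0.0), 0.0))  # camera moved right: note shifts left
```

As the camera moves, the note’s world coordinates never change; only its reprojection does, which is exactly why it appears locked to the table.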

All right. So there’s some pretty serious AI work to make this all happen, and we can do it on a phone. But we’re just getting started; this is the first thing I wanted to show you. These are the technological foundations for advanced augmented reality. Let’s go to the next one.

Now, since we’re mapping out these scenes in 3D, right, since we have the depth of an image, we can go from taking a still photo to mapping out a whole 3D scene. This was actually taken from a 2D still image in our office in Seattle, and from the still photo we constructed a 3D scene. And now, because it’s a 3D scene, we can pan around. How crazy is that? Crazy. All right. We can change the lighting; we can turn the lighting down; we can move the lighting from the front of the room to the back. And you can add all kinds of effects: we can fill the room up with water if we want (again, it’s got the depth right), you can add a lot of bouncy balls (we’re fond of bouncy balls), and we can fill the room up with Skittles, because the future is delicious. I can’t help it.

All right. Now, look, we also have some of the best computer vision and object recognition work in the world, and that’s going to help identify different things in the scene and surface relevant effects that you want to check out. So you’re going to be able to tap on the coffee mug, and we’re going to surface effects that are relevant to coffee; you can add steam. You can add a second coffee mug, so it looks like you’re not having breakfast alone. And since it’s a digital object, you can make it bigger or smaller; you can make it any size you want. You can tap on a plant. You can add flowers that are blooming; you can water the plant with a rain cloud, or whatever else it is that you do with plants, I guess. You can tap on the wine bottle and add an information card that shows what the vintage of the wine is, what the rating is, and maybe where to get it, or maybe in the future a way to buy it. So some of these effects are going to be fun and others are going to be useful.
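The “tap an object, surface relevant effects” flow can be pictured as a simple lookup from a recognized label to candidate effects. This is a hypothetical sketch; the labels, effect names and fallback are all made up for illustration and do not reflect the platform’s real catalogue or APIs:

```python
# Hypothetical mapping from a recognized object to candidate effects,
# mirroring the coffee-mug, plant and wine-bottle examples above.
RELEVANT_EFFECTS = {
    "coffee mug": ["steam", "second mug"],
    "plant": ["blooming flowers", "rain cloud"],
    "wine bottle": ["info card: vintage, rating, where to buy"],
}

def effects_for(recognized_label):
    """Surface effects relevant to what the camera recognized,
    falling back to generic options for unknown objects."""
    return RELEVANT_EFFECTS.get(recognized_label, ["frames", "masks"])

print(effects_for("coffee mug"))   # effects relevant to coffee
print(effects_for("skateboard"))   # unknown object: generic fallback
```

The interesting work, of course, is in the recognition step that produces the label; the lookup itself is trivial once the scene is structured data.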

So in augmented reality, you’re going to be able to do all this different stuff. So these are three of the technological building blocks for building augmented reality: precise location, 3D effects and object recognition.

But now that we’ve gone through some of the technology, let’s take a look at some of the use cases that you’re going to be able to build. All right. So let’s start with the most basic. This is an AR tool that Nike created to help you visualize and share your runs in some fun new ways. Now, in augmented reality we’re going to face near-constant reminders that Ime is in better shape than us. Ime is probably out for a run right now, before he comes on stage in twenty minutes.

Now, look, we all do things like this every day: running, cleaning our home, doing laundry, changing diapers. And the reality is we’re proud of these things and we want to express them, but we often don’t get a chance to, because they feel mundane and don’t feel special enough to share. So too often we don’t share them unless we have something that makes the moment funny or feel like it’s going to be relevant to other people. So as silly as effects like this might seem, they’re actually really important and meaningful, because they give us the ability to share what matters to us on a daily basis.

Let’s talk about games. Now, more than a billion of us play games, and I think there’s going to be a whole new genre of augmented reality games coming. So here’s an example of a father playing an augmented reality game with his kids in the waiting room at the doctor’s office, where he’s using the table in the waiting room as the game board for a tower defense game and the kids can kind of slap the bad guys before they get to you. Now, there’s going to be a lot of awesome stuff like this coming, and I’m pretty excited about it; this part of the platform is going to come a bit later this year.

Let’s talk about art. Now, with augmented reality you’re going to be able to create and discover all kinds of new art around your city. This is actually a piece at Facebook headquarters, and without augmented reality it just looks like a blank wall. But when you’re in augmented reality, you get this beautiful piece of art. It’s not just a painting on the wall; it fills up the whole space, it’s 3D. And not only that, it would be impossible to make in reality, because you have this infinite waterfall of paint coming down, and it’s really quite something to look at.

Now, one thing that’s kind of a funny side effect of this is that, at Facebook, we’ve noticed there are just people gathering around looking at blank walls. So, you know, this is going to be a big thing in the future: people just kind of sitting around and staring at a blank wall. So we had to put a physical plaque on this wall to commemorate what we think is going to be one of the first pieces of augmented reality street art in the world. But there’s a lot more like this coming.

Now, one of the things that I’ve always wanted to do is leave notes for friends in different places. And in order to do this, you need to have a really precise sense of location. This isn’t just about finding a Pokémon within a one-block radius; you need a very exact location. So I’m talking about leaving a note telling your friend what the best special is, right next to the specials sign at a restaurant, or marking your table at the local dive bar that you go to with your friends, or leaving a note for your wife on the refrigerator. Some of this stuff, I think, is going to be really special.

All right. So those are just a few examples of what we’re doing with this augmented reality platform. And like I said, it starts in closed beta today. I want to be really clear and set expectations: it’s going to take a while for this to develop. There’s a lot in here that we’re going to roll out over time, and your experience isn’t going to change dramatically overnight. It’s going to take a while to roll some of these things out, and then even longer for developers to actually start building all these experiences. But over time, I do think that this is going to be a really important technology that changes how we use our phones and eventually all of technology. And this is the kind of technology that we love to build, because there’s a long roadmap of technology to build for years. Even if we were a little slow to add cameras to all of our apps, I’m confident that now we’re going to push this augmented reality platform forward. And long term, all the work that we’re doing here is going to go into the glasses that we all want; it’s all the same technology, and this is another step on the path there.

All right. Now we have a lot more to talk about over the next couple of days that relates to the ten-year roadmap to give everyone the power to share anything they want with anyone. We’re going to talk about AI. We’re going to show you some of the AI work that goes into making this augmented reality platform work, and a lot of the other stuff that we’re building. We’re going to talk about virtual reality, and we’re going to launch our first social virtual reality product.

Virtual reality and augmented reality go hand in hand. And this virtual reality experience is going to give you a taste of what it’s like to have a real sense of presence with your friends no matter where they are in the world, and to start interacting with all kinds of digital objects on the road to full augmented reality.

We’re also launching the next generation of the Messenger platform. We already have AI that helps businesses answer all kinds of questions from their customers, and today we’re going to launch a bunch of Discover tools that are going to help you find the businesses and bots that you want to interact with on the platform.

Tomorrow, we’re going to update you on all of our work around connectivity. We have our team in Arizona right now preparing for the second flight of Aquila, our solar powered plane, and that’s going to help beam down Internet connectivity to people all around the world. And we’re going to update you on a lot of the other technology that we’re building, too.

And you’re also going to hear from Regina Dugan about some of the work that we’re doing in building even further out, beyond augmented reality, and that includes work around direct brain interfaces that are eventually going to let you communicate using only your mind. Now, that stuff is really far out, but it’s some pretty interesting stuff that we’re going to talk about tomorrow.

All right. So that’s what we’re working on. We’re building tools to give everyone the power to share what they want, and I’m really excited about this augmented reality platform and all the stuff that we’re going to be able to create with it. And this is an important time for us to work on technology like this, because we all have a lot of work to do to build community and bring people closer together. And as always, it is an honor to be on this journey with you.

So thank you all for coming out, and now I’m going to hand it off to our Chief Technology Officer, Mike Schroepfer, to talk about AI and the augmented reality platform. Have a great F8!

Mike Schroepfer – CTO, Facebook

Hi everyone. All right. Mark showed you some really compelling examples of an AI-powered AR camera. What I want to do is geek out with you for a few minutes, pop the hood, and talk about the technology that powers these experiences. What enables all of these experiences on a phone today is a revolution in computer vision, a subfield of artificial intelligence. You can trace today’s changes in computer vision all the way back to the 1990s.

This is Yann LeCun, the director of the Facebook AI Research lab, who revolutionized computer vision in the 1990s. He replaced years of hand-tuned models and hand-tuned heuristics with a learning network, a convolutional neural net, that would learn the weights and figure out what to extract from an image simply by looking at the pixels in that image. Now, it took almost two decades for the entire community to figure out how to apply this technique, convolutional neural nets, to the task of image and object recognition. That was in 2012, when AlexNet won the competition called ImageNet at 55% accuracy in detecting objects. And over the next four years, we saw an astonishing rate of improvement, from 55% accuracy to 80%.
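The core operation these networks learn is just a small filter slid across the image. A minimal pure-Python sketch of that convolution step (illustrative only; real networks stack many learned filters with nonlinearities, and the kernel here is a classic hand-written edge filter rather than a learned one):

```python
def conv2d(image, kernel):
    """Valid 2D convolution (strictly, cross-correlation, as in most
    deep-learning frameworks): slide the kernel over the image and sum
    the element-wise products at each position."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A vertical-edge kernel responding to the dark-to-bright boundary.
img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 1, 1]]
sobel_x = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]
print(conv2d(img, sobel_x))  # prints [[4, 4]]
```

A convolutional net learns thousands of kernels like `sobel_x` from data instead of having engineers write them by hand, which is exactly the shift described above.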

To give you a sense of this qualitatively, let’s look at the same image run through state-of-the-art computer vision networks from 2012, 2015 and today. So on the left, AlexNet: this 2012 network would only be able to tell me some really basic information, like that there’s a person in this photo, which is not totally useful for real-world applications. By 2015, we started to get not just object identification but a sense of localization: where in this image are the different objects? But if you look closely, you can see it’s nowhere near perfect. That purple line there is supposed to be a refrigerator, and most of us know refrigerators are rectangles; that’s not really a rectangle, right?

Just last month, the Facebook AI Research team released a new state-of-the-art network called Mask R-CNN, which performs better on existing benchmarks than any other published system out there. You can see this qualitatively: look at the nice sharp masks around the different objects in the scene. But that’s not all it can do. We also get 17-point keypoint detection, detecting not just where a person is in the scene but exactly how they are posed and oriented. So with this sort of technology, you can start to enable new ideas, like being able to mask out the background even when someone is moving through it at high speed.
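Once a segmentation network produces a per-pixel mask, “masking out the background” is a per-pixel select between two images. A toy illustration of that compositing step (not the production pipeline; real masks are soft-edged and predicted per frame):

```python
def composite(foreground, background, mask):
    """Replace the background: keep foreground pixels where the
    predicted mask is 1, fall back to the new background elsewhere."""
    return [[fg if m else bg
             for fg, bg, m in zip(frow, brow, mrow)]
            for frow, brow, mrow in zip(foreground, background, mask)]

person = [["p", "p"], ["p", "p"]]  # pixels from the camera frame
beach  = [["b", "b"], ["b", "b"]]  # pixels from the replacement scene
mask   = [[1, 0], [0, 1]]          # 1 = pixel belongs to the person
print(composite(person, beach, mask))  # [['p', 'b'], ['b', 'p']]
```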

Or, let’s say I took an awesome vacation video and a windsurfer rudely interrupted my view. Not a problem, a little computer vision on my side. So that’s Step 1: our research teams are pushing the bounds of computer vision and AI, and we have some of the people who founded this field in the 1990s pushing the state of the art today.

But Step 2 in building this AR camera is taking these algorithms, which often require big beefy servers drawing hundreds of watts of power to run and train, and squishing them down to run on the phone in your pocket. Our first test case for this was a technique called style transfer. It’s a fairly cool but straightforward technique: you take the style of a painting, say Starry Night, and you paint it onto an image. To give you a sense of the progress we’ve made over the last year, our engineering teams took an algorithm that could barely run on a big desktop machine and got it running at up to 30 frames per second on the phone in your pocket. That’s several orders of magnitude of improvement.
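For context, classic (Gatys-style) style transfer works by matching feature statistics, such as Gram matrices, between the style image and the output; the fast mobile variants train a feed-forward network against a similar loss. A minimal sketch of that statistic (illustrative only, not Facebook’s implementation):

```python
def gram_matrix(features):
    """Channel-by-channel correlation of a feature map, averaged over
    spatial positions. Style-transfer losses match these statistics
    between the style image and the generated image."""
    # features: a list of channels, each a flat list of activations.
    n = len(features[0])
    return [[sum(a * b for a, b in zip(ci, cj)) / n for cj in features]
            for ci in features]

feats = [[1.0, 2.0], [0.0, 1.0]]  # 2 channels over 2 spatial positions
print(gram_matrix(feats))  # prints [[2.5, 1.0], [1.0, 0.5]]
```

Because the Gram matrix throws away *where* activations occur and keeps only how channels co-vary, it captures texture and brush style independent of the image’s layout.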

To give you a sense of this qualitatively, this is where we were last summer, on the left. This is style transfer; you can see it’s just a little 128×128 block that’s supposed to look like a Lego, and it actually looks pretty bad, with a lot of noise in the scene. On the right is what you have in your pocket today, on your phone, with Facebook and Messenger: beautiful 720p real-time style transfer.

To give you another sense of the pace of progress, I showed you earlier in this talk Mask R-CNN, the state-of-the-art computer vision algorithm our research teams released last month. We shot this video this morning of that same algorithm running on a phone at five frames per second, doing keypoint detection so you can see the exact pose of each of the individuals in the scene. So our teams have gotten really good at rapidly taking state-of-the-art technology and putting it in your pocket, on your cell phone.

You’ve seen other releases too, like last week when Instagram released sticker pinning, where you can drop a sticker in the world and create, for example, a virtual Easter egg hunt. As Mark talked about, there are many foundational technologies here; that Instagram example is region tracking. I’ve shown you style transfer, image recognition and localization, but one of the key techniques powering many of these augmented reality experiences is called SLAM, or simultaneous localization and mapping.

Now, to do this for real, to get a real 3D map of the world, is very hard. Luckily, our teams at Oculus have been working on this problem in the context of VR. Last fall, we demonstrated a prototype headset from the Oculus team called Santa Cruz. The idea of this headset is that it’s a standalone VR unit: it does not require a cell phone, it does not require a PC, and there’s no special hardware installed in your room. You just grab the unit, put it on your head, and you’re instantly in VR.

Now, the thing that makes this work is actually SLAM, because this device has four cameras on it, and as I’m walking through the real world, those cameras are tracking my motions and translating them into the virtual world, to make sure that the virtual world perfectly matches my movements. Now, doing this in a VR context is really hard and completely unforgiving: if we get it wrong, your virtual world doesn’t update and you break immersion immediately. So we’ve got this technology working rather well in that context, and we took the same techniques we used to optimize style transfer, region tracking and Mask R-CNN and applied them to SLAM. And this is what Mark showed you.

So when I’m moving my camera through this view, we’re getting a perfect track of where that camera is in relation to the room. We’re also reconstructing the geometry of the room, so we know that the table is a plane and we have a rough geometry of that bowl. So as I place virtual objects in the scene, they’re perfectly placed and stay pinned to it. Even if I move the camera out of the view and back, because we’re actually building a map of the scene, you’ll still see the objects placed in the right location.

Now, I could geek out with you for an hour longer, but we’ve got other people coming to talk to you today about lots of amazing technologies, so I’m going to have to stop here. On Day 2, we’re going to talk in a lot more detail about all the other technologies we’re bringing to bear in AI to put this power in your pocket. And as excited as I am about all of this technology, I’m even more excited to get it into the hands of you, our partners and developers. And to talk about how we’re going to do that, I want to welcome Deb Liu to the stage. Thank you everyone.

Deb Liu – Director of Platform Products, Facebook

Thanks so much, Schroep. Mark came out here this morning and talked about how the camera is the first augmented reality platform, Schroep went on to describe how AI is powering this new augmented reality, and this future is coming. But let’s take a look back: ten years ago, we launched Facebook as a platform.

We knew we could not connect the world alone and so we invited you, the developer community, to join us and work with us. And today I’m going to share the next platform we’re building and how you can be a part of this AR world.

We recently launched a new camera that made sharing photos on Facebook even more fun. Today, we’re widely releasing the Camera Effects platform. This gives people, artists and developers powerful tools to create frames and effects. The Camera Effects platform actually has two tools. The first is Frame Studio. Frame Studio gives anyone with creative skills the ability to make fun photo frames to share on Facebook, and it’s available globally today. To seed this, we put Frame Studio in the hands of artists all over the world, and they built frames for their local communities, so when you visit a new place you can actually see frames from the artists there.

Frames can make even everyday photos more engaging. I posted a photo of Bethany on her birthday recently; by adding a frame, this annual post becomes just a little bit more special. And now I can post a video with a personalized frame to make it even more engaging. And by adding an augmented effect, I can make her live birthday video even more meaningful.

This new content type brings AR to everyday life. It connects art and technology to create new immersive experiences and these are just a few of the effects we’ve created for camera so far. With the Camera Effects platform, artists and developers can create so much more.

And we’re excited to announce the second tool that makes this all possible: AR Studio. Imagine if you had to build all of these things on your own: imagine the kind of large engineering and design team you would have to bring together to make that possible. AR Studio simplifies all of this. It allows you to create animated masks and interactive effects that respond to facial motion and data.

So let’s take a look at a couple of my favorites. This isn’t widely known, but I’m a gamer, and yes, my gamer tag actually is [debanator]. Here’s an example from EA’s PC and console game Mass Effect: Andromeda, something I spend way too many hours on. It uses our camera to build an immersive experience around my gameplay. Let’s take a look behind the scenes at AR Studio. With real-time face tracking, you can layer 3D masks to fit any face, and you can have that mask actually respond to facial motion. So when my face moves, you can have it automatically respond, without writing a line of code. And when we flip the camera, you can see stats for my latest mission; we used the scripting API to create a leaderboard and dynamically place it in 3D space.

On top of that, you can pan the phone and, using sensor data, experience the game visuals in augmented reality. EA used AR Studio to make this engaging effect possible. For you soccer fans, or as the rest of the world says, football fans, I would love to show you these effects from Manchester United. They bring real-time data from an actual match and add it to your video. So when Manchester United scores, you see the goal effect come up, you hear the cheering, there’s confetti. So imagine having this effect available for your favorite sports team at the next game.

This even works on live video. Giphy is a way to express yourself through images. They say an image is worth a thousand words, so an animated one must be worth ten thousand. You can take this effect, and it does something really interesting: it actually responds to interactions from viewers. So when someone types a hashtag or votes in a poll, you can make the video itself respond. Giphy is making this available soon, so make sure you try it out.

This is what we’re releasing today with AR Studio, but it is the very first step in a long-term journey and we’re just getting started. So let’s take a look at the future together. Mark painted a picture of how AR works on the camera, and Schroep talked about taking the amazing power of AI technology and adding it to that. But I’m going to share with you a vision of how we’ll realize this with the developer community so that you can code against the real world.

Imagine you’re at a local coffee shop. When people see the world, they can intuitively understand exactly what’s happening in the scene. When a camera sees the world, it sees things in 2D; it takes input and turns it into pixels. That makes AR experiences really hard to conceptualize and bring to life. But AI changes that. It turns 2D imagery into three-dimensional structured data that developers can build against. AI actually creates rich context so you can understand the surroundings. So in this case, we’re using a deep neural network to infer that you’re indoors and in a dining establishment, so we tell you you’re inside a restaurant. Then, using AI, we can identify the surfaces in the scene and turn them into structured data. We do this using plane detection, where we look for flat surfaces, and we’re doing all of this in real time. So you can see we’re identifying the floor, the wall, maybe the front of the counter.

We can take this a step further by actually recognizing and identifying objects and people in the scene itself. As Schroep mentioned, we train our models on billions of pictures to help us match items to known objects. So we can tell you where the coffee grinder, the plants, and the people are, and we also share confidence levels so you can take that into account.
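To make this concrete, here is a sketch of what consuming that kind of structured scene data might look like as a developer. Everything here is hypothetical: the field names, the example scene, and the `usable_detections` helper are our own illustration of the idea (scene classification, plane detection, object detection with confidence levels), not part of any Facebook SDK.

```python
# Hypothetical structured scene data of the kind described above.
# Field names are illustrative, not a real API.

def usable_detections(scene, min_confidence=0.8):
    """Keep only the object detections the model is confident about."""
    return [d for d in scene["objects"] if d["confidence"] >= min_confidence]

scene = {
    "context": {"label": "restaurant", "confidence": 0.94},
    "planes": [  # flat surfaces found by plane detection
        {"label": "floor", "normal": [0, 1, 0]},
        {"label": "wall", "normal": [0, 0, 1]},
    ],
    "objects": [
        {"label": "coffee grinder", "confidence": 0.91},
        {"label": "plant", "confidence": 0.88},
        {"label": "person", "confidence": 0.63},
    ],
}

confident = usable_detections(scene)
# The low-confidence "person" detection is filtered out, which is exactly
# why exposing confidence levels matters: the developer decides the threshold.
print([d["label"] for d in confident])
```

An AR effect could then anchor content to the detected planes, or attach labels to the high-confidence objects, all driven by this data rather than raw pixels.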

Now imagine all the possibilities if you could take the structured data from any scene and combine it with your code and our tools. This will enable you to create augmented reality experiences that people can interact with in the real world, in real time. Here are some examples of what you could build. Imagine you could allow people to leave directions or notes for their friends. Or you could give people more information about where they are. Or let them drop digital objects that other people can find and interact with right in that spot. All of this is possible because the camera turns the scene into structured data; you add your code, and you can build this experience. We will enable this in our SDK and our tools in the months to come.

But there’s so much more, as Schroep said. We’re bringing hand tracking, body skeletal tracking, and so much new technology and we’re making it available to you, the developer community, through the camera platform. The journey to the future of augmented reality is just 1% finished and I can’t wait to see all the amazing and new experiences we build together. Thank you so much.

And now I bring up Rachel to talk about social VR.

Rachel Franklin – Head of Social VR, Facebook

Hi everybody. Earlier you heard Mark describe a future where virtual reality will transform the way we connect with our friends and our loved ones and our communities. VR is a technology that gives us something no other technology has before, a magical feeling of presence, the sense that we’re really there together even when we’re apart. Now because of this, VR is a naturally social platform and we’re building it with people at the center.

Today we’re going to show you the first Facebook product we’ve built to start bringing our vision to life. If you’ve tried VR, you know how realistic it really feels. And if you haven’t, I’m going to give you a little sense of it. Here’s one example we see all the time: when people take a photo of their avatar in VR with a selfie stick, they actually smile every single time. And one time during one of our early demos, we had this bug where the avatars’ hands stuck together, so two people could not stop holding hands. Incredibly, they both found themselves blushing, and one of them even joked that he needed to go home and tell his wife.

So even though you’re wearing a headset in VR, it’s still you in VR; you’re connecting with others in a more immersive way as an extension of who you are and the technology lets your humanity shine through. This is why VR has the promise to be the most powerful social platform and we’ve been exploring what this looks like.

Today we are releasing Facebook Spaces in beta. And starting now you can download it for Oculus Rift and Touch in the early access section of the Oculus store. Facebook Spaces is a place for you to be yourself in VR with your real friends no matter where you are in the physical world. And what’s really amazing in Facebook Spaces is that all the people in the photos and the videos that you interact with on Facebook are instantly brought into the virtual world. So the moment you sign in, you’re greeted with the friends that you see every day on Facebook and all of the things that are interesting to you.

But first, you need a virtual you. In order to have a meaningful social experience in VR, you need a lifelike avatar that represents you, with expressions that help you relay your feelings to other people. Now this has been one of the big challenges for us to build, and our avatars have come a very long way since we last demoed them at last year’s F8.

So one of the breakthroughs for us was to use machine learning to quickly suggest some options for your avatar’s identity based on your Facebook photos. So when you join Spaces, you’ll find your AI-generated avatar options waiting there for you to choose from. Now, of course, it’s most important that you feel comfortable with your avatar, and that’s a very individual feeling, which is why you can customize your avatar until it feels great to you. You can add glasses or facial hair; you can change your eye and hair color, and you can tweak some other aspects, too. This is the easiest it’s ever been to bring the real you into VR. So when your friends and family join your space, it’s just like really being together.

Now you need to invite friends into your space, and you can invite up to three people to join you. Once you’re all there, you’ll have interactive ways to have fun together. You can draw in the air with a virtual 3D marker to create anything you can dream of, from costumes to toys to handmade games. Now, as a kid, I wanted to be an astronaut; true story. So this is me living out my dream in VR and drawing my space helmet.

You’ll also have countless 360 videos based on your Facebook interests that you can experience in VR together. Now, I may not have actually made it to space, but I can tell you there is nothing like seeing the vastness of outer space all around you with your best friends. So whatever it is that captivates you, you’ll have the entire library of 360 photos and videos from Facebook at your fingertips, ready to just pull into your space and experience with your friends.

And, of course, there’s a selfie stick. So it’s really fun when your friends can be with you in VR. But what if they can’t be with you in VR? Well, we’ve integrated Facebook Spaces with Messenger video calls. So whether you want to show off your latest 3D drawing masterpiece, or play a video that your friend would love, you just call any friend who isn’t in VR and they can answer on their phone to instantly open a window into your virtual world. And I can tell you I’ve done this a ton of times, it is still amazing every single time.

Now we can’t wait to see what new things people will do with Spaces that were never possible before. My teammate Mike Booth has friends from college who are scattered all around the country, and with Spaces they can have a virtual game night every week, just like they used to. For me, it’s been amazing to give my friends at Facebook a 360 look at where I live near the ocean, and I even introduced them to my crazy menagerie of pets at home without ever leaving the office.

We’re sharing a very early version of Facebook Spaces with you, so you can come along with us as we explore the beginnings of social VR. And today we should all begin to ask ourselves: when we have a platform that’s based around people, what will people want to do with their friends and family? We see three primary building blocks that form the basis of the social computing platform that we’re building: The ability to be yourself in VR. With the Facebook social graph your friends and family can be there, too. And because of VR technology, you can spend time with the sense that you’re really there together. And you have a space where you can create or bring in anything you can imagine so that you can spend quality time together. It’s sort of like a magical canvas for shared experiences.

This opens up a whole new way to think about building experiences that are more natural and more seamless and more interactive than anything ever before. There are still so many more possibilities for what activities and experiences people can have in VR together. So imagine if you had a virtual boombox that you could just grab and bring into your space and it would instantly play your personalized playlist in Facebook Spaces.

Or maybe imagine if you had virtual card games or tabletop games, maybe even a pool cue that you just bring into your space and it instantly transforms into a pool table. So eventually we’re going to add new ways for developers to build more amazing things on the foundation that you’re seeing today. We’re already exploring this with a few early partners and I have to tell you their creative inspiration is really incredible. We’re having a great time with it.

So now I want to show you Facebook Spaces in action.

[Video clip]

So please, I encourage you to go try Facebook Spaces today. We have demo booths out on the show floor so you can experience it for yourself.

And our team has had so much fun building Facebook Spaces. We’ve only just scratched the surface of social VR technology and we see a future when it will transform the way people around the world will stay connected with their communities and those people that they love. So we can’t wait to get there together with you.

And now please welcome our VP of Product Partnerships Ime Archibong who will talk about our developer communities. Thank you.

Ime Archibong – VP of Product Partnerships, Facebook

We’ll start off by clearing the air: I did not run here today. I was right across the street, so I just walked. I want everyone to cast their minds back a decade.

It’s 2007! A fairly important year: the debut of the iPhone, and Rihanna’s “Umbrella” (ella, ella) being played on every single radio station. We were just a three-year-old start-up with a big, crazy ambition to connect the world. It was also the year of our first F8 conference, our first big investment in this community, our community of builders.

It’s ten years later now, and “Umbrella” is still being played on the radio. Close to 2 billion people are now using Facebook. And that small seed we planted with 800 developers at that first F8 has grown to almost 4,000 people in this room alone, and hundreds of thousands around the world, with 80% of you located outside the United States. And over a million people will be watching this live stream.

So the question I ask is: with all this community growth, why are we still here, a decade later, in this room together? My answer is that every single one of us realizes, as the days go by, that the opportunities and the complexities of the world are only becoming more challenging to solve, and that for us to ever build anything of consequence, we have to figure out ways to invest in one another and build together to make it meaningful.

Now, if I hadn’t studied computer science and electrical engineering, I probably would have studied economics. I’ve always loved the way that economists, in one very dense and concise article, can demonstrate how a macroeconomic concept has a profound impact on society. One concept I’ve always been fond of is the multiplier effect, the idea that when an input changes, like government spending, a larger output changes on the other side, like GDP. So, for example, if a government invests in building a road, that investment leads to easier transportation, which improves the distribution of goods, increases commerce and trade, and ultimately leads to more investment and more growth.

Well, based on our track record over the last decade with this community right here, I think the same concept applies to us in this room. When we find the right and appropriate ways to invest in one another and to build together, we end up touching more people, communities grow over time, and with that come changes, improvements, and connections.

Over the last 10 years, our approach to investing in you and in this community has been two-fold: one part through programs, and one part through building products. One program you should all be familiar with is FbStart; it remains our global program to identify, invest in, and support high-potential early-stage startups. But our vision was never limited to just early-stage startups. In fact, we want to invest in anyone out there who’s a potential multiplier.

So today I’m really pleased to announce a new program called Developer Circles, built to support local communities and the exchange of technical knowledge. Whether you are a student learning how to code for the very first time, or an experienced coder looking to exchange technical best practices, you will find a meaningful community for yourself there.

Now, Developer Circles are led by local volunteers and community members like Innocent here. Innocent grew up in Nigeria and became obsessed with video games and how they were built, which ultimately led him into programming. Just last May, Innocent launched Developer Circles Lagos, and in less than a year about 1,500 members have joined his circle. They’ve had meet-ups on things like Facebook’s open-source technology and building bots for Messenger; it’s been pretty profound and exciting. As the lead, Innocent has the opportunity to use Facebook Groups to keep his community connected, and now, through the use of new Facebook social learning units, he has the ability to tailor content plans to the specific interests and the technical needs and curiosities of his community.

We’ve been piloting this program for a little under a year now, and the response has been tremendous. So I’d encourage anyone who is excited about Developer Circles to head to our developer website and find a local community group to join right now.

So, as I mentioned, the other part of what we’ve been investing in this community has been through products. One long-term goal that we all have is to make sure that we are touching every consumer-facing business out there in the world, which has required us to rethink what the definition of a developer could be.

These days, more and more non-technical types, be it a musician, an NGO, an artist, a marketer, or a business owner, are finding value in our products and using them to fuel their ambition. These folks too are multipliers, and these folks too are part of this community. For example, Ryan Leslie is using our Messenger platform. As an independent, Grammy-nominated music artist, Ryan’s ambition is to create a more supportive community of his fans. I met Ryan in 2015 when he was building SuperPhone, an intelligence platform that surfaces the most valuable conversations he’s having with his fans. Since going independent and leveraging technology like the Messenger platform, his fans have donated over $3 million to him in direct support. Ryan is a multiplier making it happen.

There’s also Chisenga Muyoya, who is using our Free Basics platform. Chisenga’s ambition was to create a safer community for women across Zambia. In 2014, when I met her and her team at BongoHive Tech Hub in Lusaka, they were just launching WRAPP, an application to help Zambians understand women’s rights. That year, they launched on our Free Basics platform, making their application available to millions of people across the world who have historically been unconnected to the Internet. Chisenga too is a multiplier.

Then there’s Josemando Sobral, who is using, essentially, all of our platforms. As a co-founder of Colab, a Brazilian app dedicated to connecting people with their cities, his ambition was all about creating a better, more civically engaged community. When I first met their team in 2014, they were just getting started. Since then they’ve used Login, they’re building with React, they built a bot for Messenger, and they’ve even built on the Free Basics platform. And now their application is being used in over 100 cities. Josemando too is a multiplier.

And last but not least, twenty-one-year-old Saif Ahmed, who’s using our Messenger platform. Based in Egypt, Saif’s ambition was all about creating a more informed community of faith by sharing personalized daily devotions. He recently launched Azkarbot as a bot for Messenger, and already today he’s reaching over 650,000 people every single day with a daily devotion across the Messenger platform. Saif too is a multiplier.

Each of these stories demonstrates the multiplied impact that can happen when we invest in each other appropriately and build the right tools for each other. Let’s look at a few more examples of multipliers who are using our latest technology and our latest products.

Starting with identity. Now, as you all know, and I think Mark made clear, there’s been some tension over the last few weeks about who truly owns the F8 hashtag, right? So just for the purposes of these next few slides, I’m going to ask everyone to refer to the movie THE FATE OF THE FURIOUS as “the other F8.” In all seriousness, as you can see from the other F8’s fans on Facebook, it’s a very vibrant community of movie lovers across our family of applications. And Fandango, a business that lets movie lovers discover and buy movie tickets, has been using our identity platform for a number of years to connect with people across all their different experiences.

Just this past Sunday, I got the opportunity to see the other F8 with my brother and my sister. We talked about it on Messenger, I found the movie times via the Fandango bot on Messenger, I bought the tickets via the Fandango website, and I got the tickets delivered in my app. We want to help businesses like Fandango maintain those critical conversations and relationships across all of our different product experiences and channels. So beginning today, we’re making it easier for you to connect with people across apps and across bots on Messenger, so that you can have a single conversation with those critical, critical connections.

In addition to making sure that you can connect with people across your different experiences, we want to make sure that you can understand people across your experiences. This past February, when I was traveling in Ghana, I met a data-savvy start-up called Asoriba. Asoriba has built a CMS, a church management system, that helps churches better understand their congregations. And when I asked the team, given all the different opportunities that exist for you right now across the African continent, why are you spending time investing in churches? Nana, their CEO, quickly replied: “When you help the church, you help the community.” When he said that, I immediately knew that they were multipliers. It’s been incredible to see their journey over the last couple of months, as they have now built a stronger and more supportive community of faith spanning 1,000 churches and 90,000 church members across the African continent and the United States.

Two years ago at F8, we stepped up on this stage and gave you a powerful analytics tool for your apps. Over the years we’ve invested in making it a more robust omni-channel solution that now includes support for Facebook Pages and for offline conversions.

Today we’re pleased to bring all these tools and offerings together under one new name: Facebook Analytics. The newest feature in Facebook Analytics is Automated Insights, which is a simple way to have your daily data-driven decisions made for you. Try to say that four times fast. Combining sophisticated machine learning and artificial intelligence with our expertise in growth, Automated Insights will identify trends, anomalies, and opportunities, and surface them to you automatically in a ranked feed so you can make decisions on them.

And speaking of decision making, it was an F8 decision I made a couple of years ago that changed my approach to fitness. Now, despite what Mark may have said on stage a little earlier, it wasn’t until about six years ago that I actually identified as a runner, right? Until then I was strictly a basketball player, which meant I ran no more than 94 feet in one direction and 50 feet in the other, right? The decision I made to dogfood the social Nike+ Run Club app that year is what turned me, a non-runner, into a runner, and likely many others, because since that F8 in 2011, that app has been downloaded more than 58 million times across the world. Nike too is a multiplier that’s part of this community, and they’ve done a great job of building a more inclusive community of global runners. They recently integrated our newest product, the Places Graph, which gives them free access to the same location data that powers Facebook, Instagram, and Messenger. Our belief is as simple as that: when you can build a digital experience based on people’s real-world context, it’s only a matter of time before supportive offline communities get formed, like the folks having a great time in this photo. Big shout-out to the Road Warriors, a running community made up mainly of entrepreneurs and techies, who let us run with them in Nigeria one morning last year.

And as we move into the future, we continue to believe that building more payments products will spark the multiplier effect from this community. Whether it’s walking into a movie theater to watch the other F8, walking into a sports store, or walking into a church, we think everyone should have the ability to execute a payment transaction with a simple scan of a parametric Messenger code. David will come on here and talk a little bit more about this in a minute.

But that’s it. Over the past decade, F8 has grown a lot, right? We’ve gone from having just one venue, from having just one platform, and I’m pretty sure from having just one item on the food menu. But more important than what we’ve outgrown is who we’ve grown with, and that’s this community that I’m looking at right now. You know, when I look back on the last ten years and what we’ve done together, I can’t help but be optimistic about the future. You’ve entertained people, you’ve made fitness fun, you’ve made communities safer for women around the world, you’ve built more civically engaged communities, and on and on and on. All of these are great examples of the multiplier effect I spoke about earlier, of what happens when we find the right ways to invest in one another and to build together. That’s the only way we’re going to build things of consequence.

Before I leave: I was flipping through some old F8 photos and found the F8 packet from 2007, and there was a quote in there that stuck out to me, one I think is just as relevant today as we look toward the next ten years together. It simply said, “Now is the time to build and go beyond whatever has been possible.”

So with that, I want to say thank you, and I look forward to building the next ten years with you. Now, quickly introducing David “the Rock” Marcus!

David Marcus – VP, Messaging Products, Facebook

Good morning everyone. OK, let’s try this again — good morning everyone. Awesome! Awesome. And thank you, Mark for my new nickname. I have a feeling it’s going to stick with me for a little bit.

All right. It’s been an incredible, incredible ride since last F8. We now have over 1.2 billion people using Messenger every month; that’s 300 million more than last year. And across our entire user base, the average number of messages sent every day on Messenger is continuing to increase rapidly. Meaning that Messenger is now a more important part of our daily lives. And we come to Messenger to do way more things — to get together in real time with group video chats, now in VR too, to share more and more visual content with our Camera and Messenger Day which is about to get so much better with our VR and camera platform, and even to challenge each other with games.

But you know what? Big numbers and product enhancements, while they’re important, are not the reason we come to work every morning. The reason we come to work every morning is the people who are using Messenger every day, people like [Emily Delabarra], whose husband is deployed overseas and who uses Messenger video so that their daughter can actually do her homework with her dad every day. This is why we come to work every day.

Now, let’s rewind the clock to last year’s launch of Messenger platform. I’m glad we called it a beta. Because we got a lot of attention for opening our platform and then right after we got a lot of attention for all of the work we still needed to do. And we listened. We listened really hard and we learned and then unfazed by hype or temporary skepticism, we worked really really hard for the last 371 days hand in hand with our amazing developer community, because we knew we had something great in the making.

And we’ve delivered four solid updates to the platform since then, adding more UI elements like quick replies, persistent menus, enhanced webviews, better templates, and native payments, and we enabled developers to send richer and richer content such as videos and animated GIFs. And you know what? It’s working. Companies are finding success on the platform across a variety of use cases. And there are three things that I want you to take away from these.
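As a concrete example of one of those UI elements, here is roughly what a quick-reply message looked like when built against the Messenger Send API of that era. The endpoint and field names follow the platform documentation of the time; the recipient ID, showtimes, and payload strings are placeholders of our own.

```python
import json

# Send API endpoint as documented at the time; version numbers have moved on since.
GRAPH_URL = "https://graph.facebook.com/v2.6/me/messages"

def quick_reply_message(recipient_id, text, options):
    """Build a Send API payload offering tappable quick replies.

    `options` is a list of (title, payload) pairs; the payload string comes
    back to your webhook when the user taps that reply.
    """
    return {
        "recipient": {"id": recipient_id},
        "message": {
            "text": text,
            "quick_replies": [
                {"content_type": "text", "title": title, "payload": payload}
                for title, payload in options
            ],
        },
    }

payload = quick_reply_message(
    "USER_PSID",  # placeholder page-scoped user ID
    "Pick a showtime:",
    [("7:00 PM", "SHOW_1900"), ("9:30 PM", "SHOW_2130")],
)
print(json.dumps(payload, indent=2))
# To actually send, you would POST this JSON to GRAPH_URL with a page access
# token, e.g. requests.post(GRAPH_URL, params={"access_token": TOKEN}, json=payload)
```

The same payload shape extended to the richer templates and native payments mentioned above, which is what made the platform incremental to adopt.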

Number one: if you direct your existing or future customers to Messenger instead of directing them to a mobile web page or to an app, you will see a lift, if you build just the right experience. But don’t take my word for it. Take Meetic, part of Match Group, which has built a dating bot that converts at a rate 30% higher than all of its other channels.

Number two: people prefer to use Messenger to interact with companies. I mean, who likes to call companies, right? So more and more companies are now providing service and support on Messenger, and the results have been stunning. Take Rogers, the largest mobile operator in Canada, which has seen a customer-satisfaction lift of over 60% by providing service on Messenger; Globe in the Philippines has seen one of over 22%, and has increased employee productivity 3.5 times by adding automation.

The third thing is that financial services on Messenger are truly starting to find product-market fit. From the Amex experience, which gets richer every day, to PayPal, which in the last couple of months has connected over a million accounts with Messenger. And we have many more financial services experiences launching on Messenger just this week.

And as Ime shared earlier, your favorite artists and public figures are now also tapping into the Messenger platform to reach their audiences in a brand-new way. And the list of companies and phenomenal brands on Messenger continues to grow by the day. We couldn’t be more honored to have all of you on our product. But none of that would be possible if it weren’t for our kickass group of developers, who have formed an unbelievable ecosystem around Messenger in the last 12 months.

Thanks to you, we’ve now more than doubled the number of messages exchanged between businesses and people to a whopping 2 billion messages a month, including automation that wasn’t even possible last year. And we now have over 100,000 bots on the platform, that’s up from 33,000 just last September; that’s crazy. And even better we have 100,000 amazing developers building for Messenger. So let’s hear from some of them.

[Video clip]

Awesome! What Phil said, “make something that’s magic”. We now have unbelievable brands and a thriving developer ecosystem on the platform but one thing we didn’t expect was how some of the experiences that you’ve all built have impacted the world in a very positive way. And I want to share one story with you.

This is Morad. He lives in Morocco, and he’s one of Tarjimly’s 2,000 volunteer translators. And this is a family of refugees that was helped by this incredible bot. Tarjimly was built to provide real-time translation services for refugees; it matches them with amazing people like Morad. And when they’re needed most, those translators can even jump in on a video call and help with an urgent medical appointment, or a meeting with an attorney, or simply buying produce at the market. These use cases can potentially save lives and provide mission-critical services to the people who need them most. These are my favorite examples of the impact of our work, and when I say ‘our work’, I mean the work that all of you in this room and on the live cast have put into the Messenger platform in the last 12 months.

But enough about the past; let’s talk about today and this year, because we have something really good to share with you — a few things, actually. Last year was all about creating the foundation, learning a lot, and iterating. This year is the year of scale. And I’m glad to open this new chapter with the launch of Messenger Platform 2.0. We all come to Messenger to easily find and talk to over 1.2 billion people without ever needing a phone number. So when you think about it, Messenger has become the de facto white pages of messaging apps. And now, with over 65 million businesses active on Facebook, nearly 20 million of them responding to messages every month, and our 100,000 active bots, we have a shot at becoming the yellow pages of messaging too. So this is why it’s time for us to invest in discovery.

And as Mark shared earlier, we’ve built a Discover tab inside Messenger that will surface the best bot and business interactions you can have in your region, and we’re going to work with you in the coming weeks to ensure that we populate this tab and this new surface with the right things. So work with us: fill in the categories and the regions your bots and experiences are actually live in, and we’re going to start rolling this out very soon.

Now, the other thing we launched last year was Messenger Codes, and as we all know now, scanning QR codes is not a thing in the West. But we’re going to give this another go. What we’re going to do is introduce Parametric Codes and a better way to scan codes: you can now do it from the native Messenger camera, with no need to stumble around tabs and menus. And a great example of that is what you just saw from the Golden State Warriors, an experience that they’re going to have inside their stadium very soon. One bot, multiple codes, multiple experiences.
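The "one bot, multiple codes" idea maps directly onto the Graph API's Messenger Codes endpoint as it was documented around this release. The sketch below builds the request payload for a parametric code; treat the field names as era-specific rather than current, and the `ref` value is a placeholder of our own.

```python
# Sketch of requesting a parametric Messenger Code, per the Graph API docs of
# the time: POST the payload below to /me/messenger_codes with a page token.
def messenger_code_request(ref, image_size=1000):
    """Payload for a parametric Messenger Code carrying a 'ref' parameter.

    The ref is delivered to the bot's webhook when someone scans the code,
    so a single bot can serve a different experience per code (e.g. one code
    per stadium entrance, as in the Warriors example above).
    """
    return {
        "type": "standard",
        "data": {"ref": ref},     # the parameter that distinguishes this code
        "image_size": image_size, # pixel size of the returned code image
    }

payload = messenger_code_request("section-112-entrance")
print(payload)
```

On scan, the webhook receives the ref in a referral event, which is where the per-code routing logic would live.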

Also new to Platform 2.0 are Chat Extensions. As you all know, right now experiences with businesses and bots on Messenger are mainly single-player and one-on-one, and we want to change that with Chat Extensions, by empowering developers to bring their experiences inside existing one-on-one or group conversations between people. So let me share one great example of that.

I’m really pumped to welcome Spotify to the platform, and music on Messenger is going to be a big thing this year. So let me show you how it works. You can open a drawer from our redesigned composer, tap on Spotify, search for any song or artist you want, and then when you’re ready to share it back in the thread, you can just tap it and share it, just as easily as this; no more complex app-switching and sharing flows. And the best part of all of this: the music now plays live. Pretty cool, right?

And when I said music was going to be a thing on Messenger this year, I really meant it, because I’m really excited to share that Apple Music will be on the platform very soon as well. Now, music is not the only use case that takes advantage of Chat Extensions. Today you’ll see a lot of new experiences inside of Messenger powered by Chat Extensions, whether it’s restaurant reservations, food ordering, movie tickets, entertainment, and many, many more. And this is open to our 100,000 developers starting today, so you can build your own Chat Extensions later today.
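The share step demoed above boils down to posting a structured card back into the thread. As a rough sketch, here is the kind of structured payload a music Chat Extension might produce for a song share; the field names follow the style of Messenger’s generic template, but the helper function, titles, and URL are hypothetical and simplified.

```python
# Illustrative sketch: build the structured share a Chat Extension
# might post back into a conversation. Not Facebook's exact schema.
def build_song_share(title, artist, track_url):
    """Return a generic-template-style message payload for one song."""
    return {
        "message": {
            "attachment": {
                "type": "template",
                "payload": {
                    "template_type": "generic",
                    "elements": [{
                        "title": title,
                        "subtitle": artist,
                        # Tapping the card opens the track.
                        "default_action": {
                            "type": "web_url",
                            "url": track_url,
                        },
                    }],
                },
            }
        }
    }

share = build_song_share(
    "Some Song", "Some Artist", "https://example.com/track/123"
)
```

Because the share is a structured card rather than plain text, the receiving thread can render it richly, which is what makes the in-thread playback experience possible.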

All right. Let me talk about AI very briefly. We’ve made a lot of progress in the last 12 months and we still have a lot of work to do. But I think we can all agree that automation is going to be a critical part of our joint success and we want to bring automation to the people who need it the most, which are small business owners, because they don’t have time to figure out technology and respond to messages 24×7, they have a business to run. So we’re launching Smart Replies powered by AI and we’re starting very small with restaurants in the U.S. by enabling them to automate responses to messages they get on Messenger. And the way it works is that the Smart Replies engine basically grabs the information that’s available from your page, automatically detects the types of questions that are being asked and responds on your behalf. So you can focus on building your business.
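The flow described above — take what the Page already knows, detect the question type, answer automatically — can be illustrated with a toy matcher. This is a conceptual sketch only, not Facebook’s actual Smart Replies engine; the categories, keywords, and answers are made up for a hypothetical restaurant.

```python
# Conceptual sketch: answer common customer questions from information
# the business already lists on its Page. Purely illustrative.
PAGE_INFO = {
    "hours": "We're open 11am-10pm, Tuesday through Sunday.",
    "location": "We're at 123 Main St, San Jose, CA.",
    "menu": "You can see our full menu at example.com/menu.",
}

KEYWORDS = {
    "hours": ["open", "close", "hours"],
    "location": ["where", "address", "located"],
    "menu": ["menu", "serve", "vegetarian"],
}

def smart_reply(message):
    """Match an incoming question to a Page-info category and respond."""
    text = message.lower()
    for category, words in KEYWORDS.items():
        if any(word in text for word in words):
            return PAGE_INFO[category]
    return None  # no confident match: leave it for the owner to answer

print(smart_reply("What time do you close tonight?"))
```

Note the fallback: when nothing matches confidently, the sketch returns `None` rather than guessing, which mirrors the “starting very small” caution in the rollout.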

We also launched M suggestions two weeks ago, and as you now know, what it does is recommend things that shorten the distance between what you want to do and getting it done: making plans, sending and receiving money, sharing your location, or even funny stickers. But I’m sure you’re connecting the dots by now.

As M gets smarter, it will start making recommendations for Chat Extensions built by you. That is going to take a little bit of time to get right, so please be patient with us as we fine-tune the engines for each of these intent categories. And while we do that, what we want to see is how people interact with M’s social and multiplayer recommendations. So to do that, we’re going to launch an experience around food ordering with a partner. In this case, you can see that M has detected it’s the right time to order food. If I tap on that suggestion, I start a social food ordering experience, where everyone in the conversation can add to the cart; you can see Henry, Laura, and Dana adding to the cart now, and at the end I can just check out with native payments. I’m really excited to see how people interact with these new experiences that frankly weren’t possible until today.

Now, last year right after Thanksgiving we launched games on Messenger, and since then 1.5 billion games have been played. So it’s time to take games on Messenger to the next level, and we’re going to do this in two ways. One: we’re introducing rich gameplay that will enable real-time and turn-by-turn games. Two: we’re also opening up a games tab so that you can find all of the games you’re playing and the challenges you’ve received from your friends. I’m really looking forward to this, and I don’t know about you, but I’ve really enjoyed challenging all of my friends to games on Messenger in the last couple of months. And now that we have turn-by-turn games, it’s game on again.
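Mechanically, a turn-by-turn game in a thread is just shared state passed back and forth, with the platform enforcing whose turn it is. A minimal sketch, with made-up state fields and player names (not the Messenger games API):

```python
# Minimal sketch of a turn-based flow: shared game state in a thread,
# with simple turn validation. Purely illustrative.
def new_game(players):
    """Start a game with an ordered list of players."""
    return {"players": list(players), "turn": 0, "moves": []}

def play(state, player, move):
    """Apply a move if it's this player's turn, then pass the turn."""
    expected = state["players"][state["turn"] % len(state["players"])]
    if player != expected:
        raise ValueError(f"Not {player}'s turn")
    state["moves"].append((player, move))
    state["turn"] += 1
    return state

game = new_game(["Ana", "Ben"])
play(game, "Ana", "rock")
play(game, "Ben", "paper")
```

The asynchronous nature is the point: unlike real-time games, each player can take their turn whenever they next open the thread.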

All right. Let me close by saying that 2017 is going to be an amazing year together. With massively improved discovery, Chat Extensions, better AI, and more engaging games, I really believe that you have everything you need to take the Messenger experience to the next level with your products. I really can’t wait to see what you’re going to build this year. And the one thing I want you to know is that we’re fully committed not only to listening to you but also to working side by side with you to make you successful on our platform.

Now let’s go do it. Thank you and have a great F8!

