What if? -What if we can get…
-What if we can… If I could figure this out. -Just a little more realistic.
-I see it. What if this doesn’t work? What if? What if we try to optimize
the assets first? -If we combined VR and immersive theater.
-If we could… What if I could time slice
the boat simulation? -If we could just shave one millisecond.
-Maybe we could adjust it. -If I could visualize…
-What if we could get details so crisp? Hey, everyone, welcome to OC6. We have a lot of awesome stuff
to go through and show you all today. But first, I wanted to start off by… I’ve been saying for a while that we think that augmented
and virtual reality are going to be
the next major computing platform. I wanted to start off by talking
for a little bit about why. You know, even before I started Facebook, when I was still in college I was studying
psychology and computer science. Because for a lot of us here
it’s not just about building technology it’s about building technology
that works the way that we do. And the first thing that you learn
in Psychology 101 is you go through the anatomy of the brain, and you start learning just how much our brains have evolved and how much of that is really about connecting and communicating with other people. We have the language centers
that almost no other animal has. You have the visual cortex
where we’re specially dialed in to understanding people’s emotions. If you’re talking to someone and
they move their eyebrow a millimeter, or their cheek a little bit,
you’re gonna notice, because that means a
completely different thing. If something moves over there
you’re not even gonna pay attention. We have mirror neurons, so that when we’re interacting with someone, we understand what’s going on in them, inside of us. So, we experience the world
through this feeling of presence and the interactions
that we get with other people. Which is why Facebook’s technology vision
has always been about putting people at the center
of your computing experience. And we’ve mostly done that so far
through building apps and I don’t think it’s an accident that a lot of the top used
and biggest apps that are out there are social experiences that put people
at the center of the experience, because that’s how we process things. But, there’s only so much
you can do with apps without also shaping
and improving the underlying platform. I find it shocking
that we are here in 2019 and our phones and our computers
are still organized around apps and tasks, and not people and the things
that we’re actually there present with. I feel that all of us together can deliver a unique contribution to this field by helping to ensure that the next platform changes this. And the defining aspect
of augmented and virtual reality is that they deliver a sense of presence that no other technology platform before has ever delivered. You feel like you are right there
with another person or in another place. Now, sure we have a lot of work to do before we’re gonna get to
that perfect form factor that we all want, but you can already see glimpses today of how the devices
that we are all working on are helping to deliver a feeling
where we are more present with the people we’re interacting with, not less, like a lot of the other platforms that are out there today. So, longer-term, I think this is gonna add up
to a lot of really big changes. We’re gonna be able to live
anywhere that we want and be able to feel like we are present
with the people and the jobs and opportunities that we want
to have access to anywhere we want. That’s why I’m so excited about this
and I know that a lot of you are working on this, ’cause you’re excited
about similar things and it’s awesome to be on this journey together. So, now, in order to do this
we need to build a couple of things. The first, is we need to build
a lot of technology that’s gonna help advance
and deliver this feeling of presence. The second thing that we need to do
is build an ecosystem of all the experiences
that need to be delivered here. So, on the ecosystem side
the first step that we need to reach is reaching critical mass
in the community. And once we get to a certain size
in the community it’s gonna become economical
for all developers from independent folks up to the biggest
AAA developers that are out there to build content for VR. And once we reach that point, the amount of content is just gonna explode, and that’s gonna push adoption further. So, in getting there,
Quest is off to a great start. It’s only been on sale for four months now and we are selling them
as fast as we can make them. But even… So, even more importantly though
is that retention is good. So, people are using it week after week
after week, which is a great signal for how the content ecosystem
is getting stronger and VR is becoming something that more people are stepping in and out
of on a day-to-day basis rather than this
one-off special experience that you need all
this different equipment for and you need to dedicate a whole
room in your house to. So, these are all really good signs. And to get to critical mass, we are
focused on building forward compatibility into all of the new devices that we build so that all the content
that you are all making just works across everything.
Now and going forward. So, if you develop something for Quest today, then our goal is to make it work on all future Quests as well. And that means when you buy a Quest, you’re not just buying something that’s gonna be locked in time. As all the new content that’s built in the future comes out, it’s going to get better, and also as we deliver major
software updates, the platform is also
just gonna dramatically improve and that’s gonna be a big theme
of what we want to talk about today. But of course the biggest part
of this depends on all the awesome stuff you’re building. So, we need to build an ecosystem
that supports all of you, which is why I am proud to share the news
that as of today people have bought more than $100 million
of content in the Oculus store. And since Quest came out
just several months ago more than 20% of that is from Quest
already and growing really quickly. So, I just wanna take a moment
at the start of this conference to thank all of you
who are really the ones who are pushing this ecosystem and community forward,
so, thank you, guys. All right, so the experiences
that you are all building for Rift are as good as it gets in VR. But right now
when you build something for Rift that library of content isn’t available
to people who are on Quest. So, we’ve been looking
at what we can do here. And we want to… So, the first thing we wanna announce today is that we’ve been looking at how we can make this possible. So, we’ve made a major software update, with a new product we’re calling Oculus Link, that is going to make it so that if you have a gaming PC and a USB-C cable, you are now gonna be able to run all the Rift content on your Quest. So, this means that starting in November, when we ship this update
your Quest is basically a Rift now too. Now, this is gonna work
with most USB-C cables that are out there. So, you don’t have to buy ours, but we’ve also designed a dedicated cable that’s gonna maximize the throughput here. It’s gonna charge your Quest if your PC supports that too, and it’s long enough
to make sure it provides you with maximum flexibility
and freedom of movement. You’re gonna want to check this out,
we’re shipping it in November and we’re really excited
to get this out there. That’s one major software update
that we wanted to talk about, but I want to move on to another one that’s about to make your Quest
a lot better too. So, a lot of technologies that we’re
working on here are foundational. Not just for virtual reality, but also for the future of what
we wanna do with augmented reality too. And one of the really foundational things that we feel like we wanna improve
and make a lot better is input. Because in the future you’re gonna be able to interact
with digital objects with your hands, just as naturally as you can
with physical objects in the world. Now, I’m sure that a lot of you remember the first time that you tried out
Touch controllers, right? How awesome it was, you were just
able to look down in virtual reality, reach out and grab something
or throw it or do something with it. And it really just added this whole
element to the experience. But as soon as we made that possible,
it immediately opened up this bigger question which is,
“How can we do even better? How can we make it so that instead of having to use hand controllers, we can just use our hands?” And I’m excited to announce that early next year
we are launching hand tracking on Quest. All right, wanna see it?
Let’s check this out. All right. So, this means no controllers, no buttons, no straps,
no external sensors. Just full range of motion in your hands.
Even if you’ve spent a lot of time in VR and you’ve spent a lot of time
with touch controllers, I think that the first time
that you get a chance to experience this and you just wiggle your fingers and you see that full range of motion
in your hands, it takes the experience
to a whole new level. You are gonna get a chance
to try this out on the demo floor after the keynote today,
so I’m really excited to get this into all of your hands. So, you know, this is what I mean when I talk about building a platform
that improves over time. Six months ago,
if you wanted to get into VR, you needed a PC, cables, sensors,
hand controllers, and half a dozen physical objects. And, you know, now,
soon it’s just gonna be a headset that you can bring with you
anywhere that you go that has full inside-out tracking,
completely wireless and now your hands
are just gonna be there too. So, there’s a lot of work
that we still need to do to get to where we all want. But I think what you’re starting to see
is the hardware is getting out of the way. And with each step,
we are getting to a more immersive and natural experience. All right. So, now I want to talk about
the future of input for a minute, though. Because hand tracking is great,
it doesn’t require controllers, but it still requires you
to use your hands. And in the future
we want to get to an input where we can just think something
and it happens. So what people call a neural interface. And earlier this week, we announced that the CTRL-labs team
will be joining us. And, you know, they’re the leading team
working on neural interfaces. They have a lot of the best researchers, computational neuroscientists and more. You know, they’re working on a wristband. It picks up electrical impulses
that are sent through your nervous system and turns them into digital signals
that you can use as input in virtual and augmented reality. It’s completely noninvasive.
So there’s no surgery, no implants You don’t have to get
holes drilled into your head. It’s just a wristband. It sort of gives you the sensation
of being able to interact with digital objects just by thinking. And, you know, this is clearly early.
It’s gonna be a number of years before this gets into any of the products
that we’re shipping, but it works. And, you know, they have a dev-kit
that they are shipping already now. And now that the CTRL-labs team
is going to be joining our Facebook Reality Labs team that works
on augmented and virtual reality, we’re gonna invest and make sure
that this is a foundational part of the input
for the next computing platform. And I’m really excited about this one. I just wanted to talk about it
for a few minutes today.
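A rough way to picture the signal chain being described here, a wristband reading muscle activity and turning it into input events, is sketched below in Python. This is purely an illustrative toy, not CTRL-labs’ actual decoding pipeline; the window size, the threshold, and the single “pinch” gesture are all invented for the example.

```python
from collections import deque

def detect_pinch(samples, window=20, threshold=0.6):
    """Turn a stream of normalized EMG-like samples (0.0-1.0) into discrete
    input events. Purely illustrative: a real neural interface decodes far
    richer, per-finger intent from many electrode channels."""
    recent = deque(maxlen=window)
    pinched = False
    events = []
    for t, amplitude in enumerate(samples):
        recent.append(amplitude)
        activation = sum(recent) / len(recent)  # smooth over the window
        if not pinched and activation > threshold:
            pinched = True
            events.append((t, "pinch_down"))
        elif pinched and activation < threshold * 0.5:  # hysteresis on release
            pinched = False
            events.append((t, "pinch_up"))
    return events

# Example: a burst of muscle activity in the middle of the stream.
stream = [0.05] * 30 + [0.9] * 40 + [0.05] * 30
print(detect_pinch(stream))  # -> [(42, 'pinch_down'), (84, 'pinch_up')]
```

The hysteresis on release is only there to keep this toy detector from chattering; real systems are doing far more than thresholding a single smoothed signal.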
All right. So, the last thing I want to talk about upfront today is delivering a social experience in VR. All right. So I talked about, for a while, how our technology vision
is putting people at the center of the computing experience. A big part of how we do this,
is by building technology that advances the feeling of presence. Right? So more immersive,
getting the hardware out of the way, you know, better, more natural UI, better, more realistic avatars
which we’ll talk about later. That’s one part of it. The other part is basically
building the software experiences that put people
at the center of the experience. And, you know, that’s kind of
our bread and butter as a company. Right? We build a lot of the best
social experiences for phones and for computers, and we want to do this in virtual reality as well. So, great social experiences usually aren’t just a place
where you interact. They give you the ability
to define the space around you. And we’ve been working on something
that does this in virtual reality. It’s kind of
a virtual reality take on this. And we’ve been working on this
for a while, and today we want to announce an experience that we’ve been
working on called Horizon. And in Horizon, you are going to be able
to build your own worlds and experiences. You’re going to be able
to play games, explore, you’re going to be able
to hang out with your friends and of course, meet new people. And because everyone is going to be
able to create their own spaces and experiences within it,
Horizon is gonna have this property where it just grows and expands
and gets better and better over time as we focus on building this out
for many years to come. So, let’s take a look at this. Beyond our world, there’s another world. And it’s right here… On my face. Welcome. This is Horizon. Think of me as your guide/self-appointed
spokes-avatar here to show you around. You know,
Horizon’s filled with possibilities. You can play stuff,
make stuff, fly stuff… Really love the ‘stache, Stuart. -What up, Stuart?
-Wait, I want a mustache. Horizon isn’t about rules
or limits or pants. Or people telling you
not to fly an airplane while drinking your fresh ground,
fair trade, French press morning coffee through a curly straw.
Isn’t that right, Debbie? It’s about getting out there
and trying new things, making your mark. Making friends with an Australian
named Mark. Actually I’m from New Zealand. And you can even build
a world of your own. Like laser tag moon landing. Or this island-y place
with these cutie-patooties. In Horizon, the world is your lobster. Isn’t it “oyster”? That too. So come join us. A never ending, ever changing
world beyond your world is waiting! Debbie gets it. All right. So we have built Horizon to be a welcoming
and inclusive community from day one. A lot of the social tools that power it are the same ones
that we’ve built over the years to power Facebook and Instagram as well. There’s a whole range
of expressive and diverse avatars to get us started. And the best part is that
the creator experience is built right into VR. So, you don’t need to be able to code
in order to create something. Although, that can certainly help in building some
of the more advanced things. You don’t need to take off your headset
in order to make something. You can make some really neat stuff
in two hours, then invite your friends in and have experiences
and hang out with them right there. So, this is also… This is early. But this is another step towards building
the kind of social infrastructure that we believe is going to be important
in the future. So, you should be able
to hang out with your friends, join groups, create events, share ideas in VR
just like you can online. And, of course,
now you’re going to be able to do it with that added feeling of presence, like you’re right there
with the people who you’re talking with, that no other technology platform
that has existed before delivers. So, we are launching this next year. We’re putting
the finishing touches on it now, and I’m looking forward to see
what you all do with it. All right. So, you know, in a lot of ways, it’s kind of hard to believe that
this is only our sixth Connect. Right? ’Cause it’s pretty amazing
how far the technology and how far our community have come
in just a short number of years. At the very first Connect,
about five years ago, the Oculus team had just joined us and we showed an early demo
of the Crescent Bay technology which of course went on
to become the Oculus Rift. And a couple of years later,
it went on sale for $600, plus you needed a $1,000 gaming PC
and all the cables we showed earlier to make this work. And, you know, now here we are
and we are shipping hand tracking on a wireless headset that costs $399,
that fits in a shoe box and you can take it
anywhere that you want. And our community here is well on the way to building a strong ecosystem
with all of the content and awesome stuff
that you guys are all building. So, this is going to be
the next computing platform. And it’s only going to get more exciting
and awesome and important from here. And this is one of the areas
where I’m just so grateful to have a chance to be on this journey
with all of you. So, I hope you have a great
rest of this connect and I hope you enjoy everything else
we show you. I’m gonna hand it off to Boz, who leads all of the augmented
and virtual reality efforts at Facebook. All right, have a great day. Now we know
what makes virtual reality powerful. It lets you suspend physics. There are things you can do in a headset
that you can accomplish no other way. You see this power in games,
in media, in business. But Facebook’s ambitions
are far greater than that. We are building a future
that allows you to defy distance. Because we know that physical distance
can translate to emotional distance. And we ask, “What is it that allows people
to feel close to each other?” Three social psychologists
took on that question and produced
the relationship closeness inventory, which suggests that
there are three dimensions to closeness. First is the frequency
of your interactions, just how often you interact with somebody. Then there’s the diversity
of your interactions, how many different types of things
you do together. And then, finally,
there’s the strength of your interactions, which is how much someone influences your decisions.
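To make those three dimensions concrete, here is a tiny illustrative sketch in Python; the equal-weight score is an invented placeholder for the example, not part of the Relationship Closeness Inventory itself.

```python
from dataclasses import dataclass

@dataclass
class Closeness:
    frequency: float  # how often you interact (normalized 0-1)
    diversity: float  # how many different kinds of things you do together
    strength: float   # how much the other person influences your decisions

    def score(self) -> float:
        # Naive illustrative aggregate: equal weight to all three dimensions.
        return (self.frequency + self.diversity + self.strength) / 3

# Physical distance tends to pull all three down at once.
nearby = Closeness(frequency=0.9, diversity=0.8, strength=0.7)
long_distance = Closeness(frequency=0.3, diversity=0.2, strength=0.6)
print(nearby.score(), long_distance.score())
```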
Every single one of these dimensions is easier if you’re physically near someone. But we have demands on our lives that sometimes separate us by a few miles, and sometimes
they separate us by continents. And VR could be the tool
that erases those miles. Let’s start with frequency. It’s pretty straightforward.
If you are not physically near someone, it’s harder to connect with them
very often. And it’s easy to see how VR could help. You could both teleport to the same place
as often as you like no matter where you are
in the physical world. And the same things that prevent you
from being physically together also limit the range of activities
you can engage in together. Phone calls and text messaging are great, but there’s not a lot of variety
in interaction there. Meanwhile, in virtual reality,
you could have a much greater diversity of experiences
than you could have even if you were physically together. Our library already has
more than a thousand different experiences people can engage in. Right? Because of all of you,
people can have fun together no matter where they are in the world. They can create art,
they can defeat enemies. You know, my personal favorite,
they can dance. World’s smallest dance just there.
You saw it. So the final area
of relationship closeness is strength. And I have moments in my life
with people that I care about, where just being
physically present with them is enough. That kind of relationship
is hard to sustain over distance. With VR, this kind of intimacy
doesn’t have to be bound by time zones. While we can, maybe,
just glimpse it today, I’m convinced that in the future,
this kind of profound presence with the people we care the most about
should be possible in VR. So, overall, I would say, when we get
a sense of human closeness in VR today, we don’t experience it
nearly often enough, and it’s just a glimmer
of what it should be. Let’s talk about the work it’s gonna take
to get there. So, when it comes to frequency, we just need more people
to have access to VR. And we’re working on that.
We’re building better VR headsets. And we’re already working on headsets
that have better displays, better optics, better compute,
all to make it more comfortable, and more immersive and more inviting
for people to get a headset and put it on. We also need to give people more reasons
to put the headset on and that gets towards the diversity. We have some amazing titles
in our catalog and more to come, but our categories
are still far too narrow. When it comes to strength, we have to
continue to pursue our vision of presence. Actually, getting hands in the headset
is a good first step. Our avatars continue to make progress, but they’re still not nearly
expressive enough. And these are all
just the building blocks we need to make VR a tool
to enable greater human connection. When I think about the future
we’re building, I think a little bit
about the relationship I have with my dad. I’m a big Warriors fan, and I’m only a Warriors fan,
or even a basketball fan, because of how much I enjoy
watching games with my father, who is reliably wearing a tie-dye shirt. VR makes it possible for my dad
to invite me over to watch a game, to teleport into his living room
and sit together on this couch just like we’ve always done. Just amazing work. Excuse me, one second. -Hey, Dad.
-Hey. -I’m actually working right now.
-I’m just about to watch the game. I want you here. Andrew, I miss you. Come on. I miss you too, Dad,
but I’m at work right now. If you love me, you’d meet me here. I’ll see you at home. One second. You can’t miss the Warriors.
He plays like an artist. He’s a Picasso. He’s better than Picasso. Don’t get me started on Klay ’cause
I will talk all day about that artistry. Can you get beers here,
like an IPA or something? We’re still years away from having beers
in the headsets, Dad. It isn’t the same as being there, but I tell you,
that might be the next best thing, that is an accurate representation of what
it’s like to watch a game with my dad, except there would be
a lot more complaints about traveling. I want all of us to be able
to have that experience in VR, things we care about
with the people we care about. This is one of the demos you’ll be able
to try out later on the show floor, being in a space that’s been
reconstructed with someone else. So, we know that VR has the power
to bring us closer together. It is and will remain
the most powerful way for us to connect across distance. But here’s the thing.
Even VR has its limits. There are scenarios where people
will struggle to connect in VR. Look at all of us.
We’re here in the same room. If we all had VR headsets on,
that could be a little antisocial. Or consider the scenario with my dad. Let’s add one little twist. Let’s say my friend has joined me and
wants to watch the Warriors game with us. But he’s physically sitting
right next to me. It’s true, he could put a headset on
and join us in VR, but we’re sacrificing the richness
of actually being together, and that doesn’t feel great. So we ask ourselves, “What can a technology like augmented
reality do to help us feel closer?” Today, our interactions
with our phones split our attention. We cannot look at our phones without
looking away from the world around us. But augmented reality
could make it possible to connect with the people who are near us
and far from us at the same time. And through AR, there will be an infinite, diverse expanse
of things for people to do and to see, synchronous, shared experiences
that can deepen relationships. AR will also let you connect with people
no matter where they are, even if you’re walking down the street, something that’s very hard to do
in virtual reality. To get to this future, we are building AR glasses. Now, we have a few working prototypes,
but these are still a few years out, so in the meantime, we’re focusing
on the deep tech stack necessary to bring these to life. Today, Spark AR is the largest
augmented reality platform for the phone. We’re working on deepening the technology to bridge the physical
and digital divides. At Facebook Reality Labs, our research teams are starting
to build the core infrastructure that will underpin
tomorrow’s AR experiences. We call this research live maps. It’s a shared virtual map of the world that involves machine perception,
computer vision, and a bunch of core technologies,
and software and hardware that will also be important
for virtual reality. What does this research look like? Imagine the same reconstruction I did
with my dad where we were sitting in a room together,
but this time at planet scale. Rather than reconstructing
your surroundings in real-time, the glasses are going to tap into
pre-existing 3D maps of the space. Avatars come into play here, too. With a 3D space, avatars could teleport
anywhere in the world. You’ll also be able to search
and share information about the real world just as easily
as you can the online world today. And a powerful assistant
can bring you information that you need in an instant. At its core, technology like this, at this scale, will rely on crowdsourced information
captured by connected devices. Here’s a closer look. The world around us
is constantly in motion. We need a new kind of map to help us
navigate this ever-changing landscape. A map like this is about more than just moving through the physical world. It’s about enhancing the way we interact
with each other and our surroundings. The vision of our live maps research is to bridge the physical and virtual
divide to bring people closer together. To achieve this,
live maps uses machine perception to construct multi-layer representations
of the world showing where you are in space, recognizing what things look like and understanding
the intrinsic meaning of objects. Connected devices like smartphones
and AR glasses will scan the surroundings to create a live dynamic index
amplified by crowdsourced data, allowing the maps to recognize when things have changed and update automatically.
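As a purely hypothetical sketch of what such a layered, crowdsourced index could look like in code, here is a minimal Python model. The layer names and the change-detection rule are illustrative assumptions, not the actual live maps design.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class PlaceEntry:
    geometry_hash: str     # location layer: where things are in space
    appearance_hash: str   # index layer: what things look like
    semantic_label: str    # content layer: what things mean ("couch", "door")
    observations: int = 0

@dataclass
class LiveMap:
    places: Dict[str, PlaceEntry] = field(default_factory=dict)

    def ingest(self, place_id: str, observed: PlaceEntry) -> bool:
        """Fold one crowdsourced observation from a connected device into the
        map; return True if the place changed and the map updated itself."""
        current = self.places.get(place_id)
        if current is None:
            self.places[place_id] = observed
            return True
        changed = (observed.appearance_hash != current.appearance_hash
                   or observed.semantic_label != current.semantic_label)
        if changed:
            observed.observations = current.observations + 1
            self.places[place_id] = observed  # update automatically
        else:
            current.observations += 1
        return changed

world = LiveMap()
world.ingest("livingroom/couch", PlaceEntry("g1", "a1", "couch"))
print(world.ingest("livingroom/couch", PlaceEntry("g1", "a2", "couch")))  # True: appearance changed
```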
Ultimately, our live maps research aims to empower people to connect and share in deeper, more meaningful ways, whether it’s improving how we
access information or understanding the context
of a situation to deliver shared experiences
through augmented reality or proactively serving up
relevant information in highly personalized ways. Live maps will one day transform
how we all engage with the world making it feel more immediate,
more intuitive, more natural, more human. With live maps research,
we’re exploring today how the world will be
experienced tomorrow. So all of these technologies are being
built to help people feel closer. We’re building new VR headsets.
We’re building immersive experiences. We’re building AR glasses, all to enable more frequent,
more diverse and stronger connections to enable closeness. And with that, I’d like to introduce
Stephanie Lue to the stage to share more about how we’re building
the best VR platform. Thank you. I can’t wait to see what the future
of AR and VR holds. Together, we’re building that future. Over the years, we’ve been working nonstop
to enable a sustainable ecosystem for you, our developers, and our users. Our VR headsets form the core of our ecosystem. Go is great for entertainment, while Rift S continues to unlock the richest, most immersive experiences. And Quest? What can I say?
Quest is changing the game. You simply can’t beat this combination
of content, form factor and price. We want to double down
and unlock the hardware’s true potential through our software innovations. So today, I will share some updates
we’ve made to our core experiences along with improvements
to our development features and tools. And finally, I’ll talk about
how we’re pushing the bounds of what’s possible in VR.
Let’s take a look. Since Quest launched, we’ve already incorporated many top user
requests through our monthly updates, including more Guardian settings,
casting support for Beat Saber, and the ability to launch apps straight
from your phone, so you can help guide a friend
while they check out your Quest. And that’s just to name a few.
We’re not stopping there. We’ve got a ton of new stuff in the works including one that’s been
one of your favorites on Rift S. I’m excited to share
that an upgraded version of Passthrough, Passthrough+ will be available
for Quest users starting next week. Passthrough+ gives you a comfortable,
stereo-correct view of your surroundings while you’re wearing the headset. And it’s useful any time you step
outside your play space. To make this possible on Quest, we’ve applied techniques
from high-performance image processing and advanced 3D computation, resulting in a similar visual quality
to what you’ll see on Rift S. What’s even better, with Passthrough
on Demand shipping later this year, you can pull up Passthrough+ at any time. Next time you hear your friend trying to
steal your pizza off the kitchen counter, turn on Passthrough+
to catch them in the act, then, get back to slicing pineapples
in Fruit Ninja. Time and again, we’ve heard the community
ask for this next feature. Oculus Go has been an entry point to VR
for many people who then, upgraded to Quest. We’ve heard from these VR fans
and developers alike that there are tons of amazing Go apps that you’d love to experience on Quest. I’m pleased to announce
that we’re bringing a bunch of really popular Oculus Go Apps
to Quest. And there’s more.
We’re also offering free upgrades. So owners of paid Go apps can get
the Quest version if it already exists. If you own Thumper on Go,
but never got around to buying it on Quest,
between now and the end of the year we’ll upgrade you for free. And starting next week,
there will be more than 50 of the most popular Oculus Go Apps
available to Quest users. So, that’s some pretty cool stuff, right? Now let’s talk about some new tools
for you, our dev community, we want to make it easier for you
to develop amazing experiences, promote them for people to discover,
and connect to more customers. You’ve shown that it’s possible to build
high-quality, performant VR apps on a mobile chipset,
and we are committed to making it easier by giving you more tools
to profile and debug. We’ve contributed to RenderDoc,
the open-source frame debugger, so it supports all of our extensions
and uses less RAM. This means it will run more smoothly
on Quest. We’re also improving RenderDoc
to report out more performance data. And later this year, we’ll be enhancing
our OVR Metrics tool, so you can see more metrics right in the headset
while your application is running, or you can view them on a computer
and save the information for later. We’ve also added support
for Vulkan Multiview and fixed foveated rendering
in our platform and in Unreal Engine. This can boost performance significantly.
And later this year, we’ll expand our Vulkan support
on Quest to include Unity and Vulkan validation layers
for easier debugging. And speaking of Unity, you may have seen that we’ve recently enabled
Unity’s GPU Profiler on Quest and Go. We’re also working with Qualcomm
to develop a low-level GPU profiling API. This will give you more
detailed information to increase accuracy in pinpointing
and resolving performance issues. Now with all of this, our goal
is to make it easier and faster for you to create the next generation
of VR experiences. And once you’ve built an app, we want to help you
effectively reach your audience and understand how your app
is performing. That’s why, later this fall,
we’ll be adding Purchase Funnel Metrics into our
developer analytics dashboard, providing you new metrics on Quest
to help you understand how your app is performing in the store. You’ll be able to explore the dashboard
to view conversions as well as metrics related to browsing, searching, and engagement
with your product detail page. Now you can measure the quality and impact of your art and product pitch, so you can test, refine, and improve.
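As a simple illustration of the kind of funnel math these metrics enable, here is a hedged Python sketch. The field names are hypothetical and not the dashboard’s actual schema.

```python
def funnel_report(store_views, detail_views, purchases):
    """Tiny illustrative funnel: how many people who saw the app in the store
    opened its detail page, and how many of those went on to buy it."""
    detail_rate = detail_views / store_views if store_views else 0.0
    purchase_rate = purchases / detail_views if detail_views else 0.0
    return {
        "browse_to_detail": round(detail_rate, 3),
        "detail_to_purchase": round(purchase_rate, 3),
        "overall_conversion": round(detail_rate * purchase_rate, 3),
    }

print(funnel_report(store_views=10_000, detail_views=2_500, purchases=500))
# {'browse_to_detail': 0.25, 'detail_to_purchase': 0.2, 'overall_conversion': 0.05}
```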
Another great way to promote your app: Mixed Reality Capture for Quest. This lets you mix real life
and VR footage into a single video. It’s an awesome way to force multiply
your marketing efforts for your apps and get people really excited about VR. You can clap for that. We recently shipped the ability for you,
our developers, to create these mixed reality videos. It’s already supported
in our Unity and Unreal integrations. And later this fall, we’ll be adding
native support along with a tool to make it just as easy
for content creators to be producing
these mixed reality videos. Pretty cool, right? Yeah. We’re also working on features
that are good for customers and for you by giving you the flexibility
to reach and connect to a wider audience
no matter what device you build on. You heard Mark talk about how
Oculus Link will open up the ability to play Rift apps on Quest later this year. This is especially great
for Rift developers who’ve been with us
since the very beginning. Your reach extends when more people can access the content
that you’ve created. It means you can build
high-end PC experiences and also take advantage
of the biggest possible market. Quest owners who have a gaming PC
will have access to the deep library of stellar Rift content. They’ll have the best of both worlds. The best games from Rift
when connected to a PC, and the portability of a Quest. And now, PC gamers who are deciding between Rift or Quest
have even more options. Remote rendering is fantastic
for development, too. You can iterate more quickly
by running on a PC, while streaming the output to your Quest. Simply hit the play button
for the Unity or Unreal Oculus plug-in and you’ll be able to preview
right on the device. No need to compile
and transfer a new APK first. This is… Yeah, I know. I’m excited about it. It’s development at PC speed
for a mobile device. And beyond what we’re shipping today, we’re also imagining, innovating,
and building things for the future. Things like hand tracking
which will revolutionize input for VR. This is going to be a game changer. It will unlock new
and intuitive interactions in your apps like never before. And we’ve all seen how truly magical it is
when the hardware melts away and you can interact naturally in VR. We’ve pushed on engineering boundaries
to unlock this capability on Quest, which was previously thought to be impossible, and we’re doing it just a few short months
after Quest launched. Let’s take a look
at the computer vision magic that makes this all possible. Our hand-tracking team
has developed state-of-the-art methods of applying deep learning to understand
the location and pose of your hands using just the on-board Quest cameras. No need for additional cameras,
active depth sensors, or extra processors. Instead, we use model-based tracking and deep neural networks to accurately infer where your hands are and what they’re doing, including exactly what your fingers are doing. And then we reconstruct those poses nearly instantly into VR. We’re doing all of this on a mobile processor without compromising CPU or GPU.
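Schematically, the two-stage idea described above, a neural network proposing keypoints and a kinematic hand model being fit and smoothed on top, could be pictured like this. The Python sketch is purely illustrative; the stand-in functions and the 21-keypoint count are assumptions, not the Quest’s actual pipeline.

```python
import random

NUM_KEYPOINTS = 21  # a common hand-keypoint count; an assumption for this sketch

def detect_keypoints(frame):
    """Stand-in for the deep neural network: return 2D keypoint guesses."""
    return [(random.random(), random.random()) for _ in range(NUM_KEYPOINTS)]

def fit_hand_model(keypoints, previous_pose):
    """Stand-in for model-based tracking: fit joint values to the keypoints,
    smoothed against the previous pose so fingers don't jitter frame to frame."""
    target = [x + y for x, y in keypoints]  # fake "joint value" per keypoint
    if previous_pose is None:
        return target
    alpha = 0.7  # blend toward the newest estimate
    return [alpha * t + (1 - alpha) * p for t, p in zip(target, previous_pose)]

def tracking_loop(frames):
    pose = None
    for frame in frames:
        keypoints = detect_keypoints(frame)     # camera frame in
        pose = fit_hand_model(keypoints, pose)  # skeletal pose out, ready to render
        yield pose

# Example: run the loop over a few fake camera frames.
for pose in tracking_loop(frames=[object()] * 3):
    print(len(pose), "joint values")
```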
And we’re also using our inclusive AI frameworks to test hand tracking for a wide range
of people and environments. Early next year, we’ll release
a beta for Quest users and we’ll ship an SDK,
so that you can start unlocking these new interaction mechanics
in your apps. With incredible software innovations
like hand tracking in an ever-growing ecosystem of people
hungry for more VR experiences, we are just getting started
and we’re so excited that you’re with us. I can’t wait to see
what we’ll build together. Now, to talk about how we’re building
our platform to be social at the core,
here’s Meaghan Fitzgerald. We all know that VR can enable
meaningful connections, but there is still a ways to go
until it’s as easy, seamless, and fun as it should be to find
and form communities in VR. To get there, we need to enable more ways
for communities to form and grow and more things for them to do and share. We think about this in three ways. First, social interactions should be
possible across everything in VR. So we’re building social features that let people connect,
communicate, and share. Next, people need places to go
and things to do. So, we are building
dedicated social places where you can meet people
and find great new experiences. And finally, creators, builders,
and community leaders need the tools to invent entirely new media, games,
and even worlds in VR. This is how communities will be empowered
to connect, create and expand. And we want to do all of this
at the scale of Facebook. So starting later this year,
we will begin to roll out a completely new social layer across
the Oculus platform, powered by Facebook. You’ll see some changes in how
social features work on the platform. You’ll login with your Facebook account
to use the social features Facebook is known for
while using your Oculus identity. This is going to enable a lot of new ways
to connect on the Oculus platform. We’re adding chats, so you can message your Oculus friends in or out
of the headset. And we’re adding events,
so you can organize a tournament to play Racket: Nx with your friends. And we’re adding the ability
to post to Facebook from VR, so you can share your favorite moments
with your VR communities and groups. Over time, we plan to power more
and more of our social infrastructure with Facebook to realize our vision
of a socially-connected VR ecosystem. For the people who choose
to login with Facebook, we’ll continue to add new ways to find
and meet people, so it’s as easy to connect with others
as it is in the physical world. And for developers, this new connection
with Facebook supercharges discovery of your app or game by creating
more surfaces where people can engage. With all of these new places to connect,
we want to work with you to give people even more ways
to discover your app. So, today, I’m excited to announce Destinations and Rich Presence
for developers. Destinations let you define deep links into places in your app that can
be surfaced across the Oculus platform. And once you’ve defined your Destinations,
Rich Presence lets people share when they are in those places, so their friends can easily join them.
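Conceptually, a Destination plus Rich Presence boils down to a small amount of data per player, roughly like the hypothetical sketch below. This only models the idea; it is not the Oculus Platform SDK’s actual API.

```python
from dataclasses import dataclass

@dataclass
class Destination:
    api_name: str      # stable identifier you define for a place in your app
    display_name: str  # what friends see ("Ranked Arena", "Lobby")
    deeplink: str      # how the platform launches players into that place

@dataclass
class RichPresence:
    user: str
    destination: Destination
    joinable: bool = True

    def invite_link(self) -> str:
        # A friend tapping this would launch the app straight into the destination.
        return f"{self.destination.deeplink}?join={self.user}" if self.joinable else ""

arena = Destination("ranked_arena", "Ranked Arena", "myapp://destination/ranked_arena")
presence = RichPresence(user="stuart", destination=arena)
print(presence.invite_link())
```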
Starting next month, people will be able to see their friends’ Rich Presence locations across the Oculus headset
and in the Oculus mobile app. And of course, there is a privacy setting, so people can control what they share. Over the coming months,
we’ll add the ability for developers to broadcast Destinations
across more places in the platform and places where people can share
their Rich Presence locations on even more surfaces,
including Messenger and Facebook. With these deep links,
people can remotely launch a specific VR destination
from their laptop or mobile phone. It’s like a portal that takes
friends directly to each other. All of this improves discovery of your app
through social word-of-mouth. But now, let’s talk about the Destinations
we are building and how we will empower people, communities, and creators
within those places. One thing we know people love
to do in VR is watch media. Just look at Oculus Go, it’s become
a top product for immersive entertainment. But we know our media
is not always easy to find and when you find
what you want to watch in VR, it’s not always obvious how
to invite friends to co-watch with you. This all needs to be simpler. So, later this year, we’re making
some major updates to Oculus TV, on Oculus Quest and Go. The updated Oculus TV will be the go-to hub for all media
on the platform. So, there will be one place to discover
content of all kinds. From your favorite media apps like FandangoNow to Prime Video VR, as well as immersive experiences like the Emmy-nominated
Traveling While Black and 360 videos from top creators. And of course, it will be easier than ever to hang out with friends,
kick back, and watch things together. But Oculus TV isn’t just
for the people who watch media, it’s also for the people who make it. It’s a platform where people can discover the immersive content you’ve created. So, to make this even easier, today, we’re introducing Media Studio. A VR media management tool
where you can upload and publish your immersive
and VR-native media content. Media Studio… Yeah, it’s… a great way to get
your immersive content on the platform. It’s a place where you can manage
your immersive asset library and get performance analytics
about your content. We’re already hearing that these tools
changed the game for our early partners, like Baobab. Their team was able to upload and preview their latest film,
Crow: The Legend, and distribute it both in and out of VR entirely using the tools in Media Studio. But beyond media, people want
all kinds of social activities in VR, just like in the physical world. They want a place
that enables endless exploration and empowers communities. That’s why we are building
Facebook Horizon. It’s the start of an entirely
new social VR world for Oculus Quest and Rift. With Horizon, we are learning from our experience
building Facebook Spaces, Oculus Rooms, and Venues. But our vision is bigger. A place where people can explore, play, create, and connect with others in a vast, thriving virtual world
where anything is possible. Where you can drop in
for a quick flying game of Capture The Flag
with your college friends that live around the country. Or try your hand
at building a tropical island and invite others to come check it out. Or organize a weekly painting session with folks who love art as much as you do. Horizon is launching
with a closed beta early next year. Here’s what you will find there. A central, bustling public space where people can gather, meet, and explore great new experiences. Expressive and diverse avatars that let you represent the you
you choose to be in VR. And magical portals
that teleport you and your friends to great new experiences and games. We will start with a few games
built by our team, like Wing Strikers, an aerial team sport. We can’t wait to see
how these inspire other people to contribute their ideas to this ever-expanding virtual world. Because we don’t want to build it alone. That’s why we’re adding worldbuilding and creator tools right into Horizon. Now, you modders, builders, creators and artists can bring your VR visions to life and add new experiences to Horizon. Everything from making
a custom t-shirt for your avatar to building complex multiplayer games
designed from scratch. We want to give everyone the superpowers to express themselves and create without taking off the headset. We will even have visual scripting tools, so you can build interactions
into the experiences you design. It’s very exciting. And when it’s this easy to be creative and express yourself and add new experiences, in Horizon, there is always
something new to explore. And of course, since Horizon is
designed for communities to thrive, we want it to be welcoming to everyone. We are focused on promoting
safe and positive interactions. With social tools like groups and events, where you can find people
who share your interests, in-product guides to help newcomers learn about the world. Plus, built-in reporting, blocking, and moderation tools. So you are always in control
of your experience. All of this is hard to get right. But it’s worth the effort. And over time,
Horizon will get even better with even more to explore and do as communities grow and thrive. Because anyone can help us
create this world. We want to build the next generation of virtual communities together with you. Horizon is launching with
a closed beta early next year, but everyone here at OC6
will get an invite to join. You will get an early look
at new experiences, see what we are building,
and have a chance to create some experiences of your own. This is the next step on the journey towards VR with people at the center and the worlds that they will build. We will see you there. Now, let’s look at how
VR is changing the way we collaborate and train at work. Every day at the corporate office
we make decisions that impact the team members at the hotel. So we really needed to make sure
that our corporate team members understand the complexity
of working in a hotel. Oculus for Business has really shifted
the way we work. We can truly upskill team members faster and really focus on empathy building which was a game changer for us. The virtual factory tour became a use case that was a no-brainer. It was, “This is what
we really need to do.” We can now take anybody and we can give them
the factory experience from their desk. Seems real. It’s a real representation
of what the factory looks like. Besides saving money on travel,
it’s a safety factor. This is that ultimate bridge to break those barriers down. Oh, man, that was awesome. Fantastic. The way we see the potential of Quest is that much lower barrier of entry. The fact that I showed a car
that I worked nine hours on, that’s something on a timescale
that we’ve never done before. I knew this was going to be a big deal. This is the time now for virtual reality to go at scale, to go large. Quest opened the floodgates. It’s just phenomenal. Quest is like a superpower,
it’s completely wild. This is the moment for VR. And Quest, I truly think,
has made that happen. It is completely wild, isn’t it? I see this video and I get chills
again and again. From the first day I started working here, all I could think of is how we could
harness this technology for work. And not just my work, everybody’s work. As a technologist, it was tempting to approach it with an ROI calculator. How much money is saved
at the manufacturing plant, at the store, at the office? But looking at how companies use VR today, what I found is that the impact of VR goes beyond what can be easily measured. With VR, employees build empathy for each other and their customers. They strengthen the company’s culture. And they become high-performing teams. For example, the teams at E.ON Energy, they collaborate across distance. They’re being more communicative
and more productive. All the Walmart… The Walmart associates, with empathy training,
are more attuned to the needs of their shoppers. And the claim representatives
at Farmers Insurance, they practice to refine their expertise. VR is improving outcomes. And a positive outcome
is never more meaningful than when you’re training
to treat a patient. To tell us more, please welcome my friend Sandra Humbles from
the Johnson & Johnson Institute. Have you ever wondered how surgeons learn to become surgeons? It may come as a surprise to you that the surgical training model hasn’t really changed in 100 years. Typically, a surgical resident begins by observing
more experienced surgeons, and then gradually, they are given
more and more autonomy until they can perform on their own. But the reality is, the training model
is no longer sustainable, and as technology is accelerating, a surgeon has to learn
more and more and more. So, with a head for health, a heart for teaching, and a passion for our patients, the Johnson & Johnson Institute is ushering in a new era
of modern, surgical training. We have equipped our facilities
all around the world with virtual reality training, so surgeons can learn, rehearse, and master skills on their own time and at their own pace. So, they are better prepared
to then operate on real patients. So, from the start,
we knew VR could help our surgeons learn new procedures faster
than traditional teaching methods like watching a video
or reading textbooks. Now, we have the evidence to show VR’s effectiveness. Imperial College London conducted a study of the VR training modules on the anterior approach to hip surgery, to illustrate how you could improve
the training at the residency level. I’m so proud to be able to present to you today the key findings. 83%. 83% of those trained in VR could then go into the lab setting with minimal guidance, whereas… None of those traditionally trained… Truly, VR is making a difference. Now, look here.
The higher up on this graph, the better the performance. Again, you’ll see, those trained in VR completely outperformed
the non-trained surgeons in the study. This independent validation of our work
is propelling us to now go even further. What we would have thought
was nearly impossible just a short while ago
is now within our reach, thanks to Oculus. With the portability of Quest and the support
of our development partner, Osso VR, the Johnson & Johnson Institute
will be scaling access to our VR training platform
quicker than ever before. We want to make VR available to every surgeon,
in every hospital around the world. Focused on training the procedures and ensuring our products are used safely in a way that’s more time-efficient,
cost-effective and measurable. So from the surgeon, to the nurse,
to the scrub technician, we will use VR to provide
anytime, anywhere training. So together, we will deliver
on our mission to support those on the front line of care and help improve patient outcomes
around the world. Thank you. Thank you, Sandra. These are just some examples
of the enterprise applications that you have pioneered. But it has been hard. The infrastructure to support you in targeting this fast-growing market
didn’t exist. As the value of VR becomes evident, companies move from wondering
if it’s worth it, to considering the practical implications
of adopting it. As one customer recently said, “If it doesn’t scale,
it doesn’t matter.” So here’s where we come in. On the hardware side,
with Oculus Go and Oculus Quest, we now have the ideal form factor. It is portable and easy to use
in any environment, including businesses. But we also need to meet
the demanding requirements of enterprise workshops. Earlier this year, we announced
the new Oculus for business platform. It offers tools to deploy and manage
large-scale VR programs. You told us that the solution
needed to be scalable, professional and reliable. So that’s what we have built. To start, companies need to deal
with a large number of headsets. Hundreds, thousands of them. Our Device Setup app allows you
to provision many devices at the same time, fast and easy. From there… From there, the Device Manager
is the control center where you can remotely manage
all the headsets and applications. You can also monitor and change settings and control software updates.
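A minimal sketch of what that bulk-provisioning workflow amounts to is shown below, with all names invented for illustration rather than taken from the actual Device Setup or Device Manager tools.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Headset:
    serial: str
    apps: List[str] = field(default_factory=list)  # only company-approved apps show up
    os_version: str = "unprovisioned"
    auto_launch: str = ""                           # app the headset boots straight into

def provision_fleet(serials, approved_apps, os_version, auto_launch=""):
    """Illustrative bulk setup in the spirit of the workflow above: every headset
    gets the same approved app list, OS build, and optional auto-launch app."""
    return [
        Headset(serial=s, apps=list(approved_apps),
                os_version=os_version, auto_launch=auto_launch)
        for s in serials
    ]

fleet = provision_fleet(
    serials=[f"QST-{i:04d}" for i in range(200)],
    approved_apps=["factory-tour", "safety-training"],
    os_version="v12.0",
    auto_launch="factory-tour",
)
print(len(fleet), fleet[0].apps, fleet[0].auto_launch)
```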
For many people, work will be the first place where they experience VR. And we wanna make it
as easy and productive as possible. Very soon, as you have heard,
we will have hand tracking. A very intuitive way
for people to navigate in VR. No need to teach people
how to use controllers. We have also designed
a new look and feel in the headset. You will get a sleek new app launcher,
with a simplified menu where only the applications that are approved by the company
will be visible. And the headsets can also launch
directly into a predefined application. Now, these user and admin features
are only part of the solution. Your customers need to know
that their data is secure, their interactions are private
and the service is dependable. Facebook has already made headway
in the enterprise with Workplace,
our collaboration solution. With more than two million
paid business users, it serves
some of the strictest industries like healthcare and banking. The data belongs to the companies and is handled
with enterprise-grade security standards. We are building on the same infrastructure
with the same principles. Companies also need to know… Companies also need peace of mind
when their business is on the line. So, we are providing enterprise-grade
customer support. It includes a new help center
with a dedicated team to keep operations running. We know that there is a lot more
that we can do so we are not stopping here. We have a full roadmap coming.
Including… An ISV Partner Program. ‘Cause we want you to get from opportunity, to pilot,
to deployment, faster. We want you to scale your own business. Oculus for Business
is in a closed beta now and we are launching this November. Oculus for Business is the first step
towards a VR-powered future of work. A future beyond the limits
of a 15-inch laptop screen, where new collaboration tools
will make us more productive. And your co-worker thousands of miles away will feel as though
they are right there next to you. A future where our physical realities
will no longer limit our potential. And now, to bring us fun…
To bring us back to a fun and exciting present,
let’s welcome Jason Rubin. Almost every Connect,
I’ve been up on stage talking about VR’s coming moment. That moment where it goes from R&D
to fully baked. From promise to reality. Well, I’m never gonna step up
on this stage and talk about VR in the future again. Because VR is happening now. About four months ago,
we launched Rift S and Quest and the response has been incredible. Again, consumers are buying them
as fast as we can make them. They’re returning to VR week after week and they’re buying and experiencing
and loving a lot of content. In fact, as Mark said earlier,
consumers have spent over $100 million
in the Oculus store to date. $100 million. And Quest is accelerating that. We’re seeing the first hockey-stick bend
in VR’s adoption curve. So now, if a consumer asks us, “Is this the time, should I buy into VR?” The answer, without hesitation, is yes. We have an incredible amount
of great experiences for you. And we’re building more every day. You have questions about our hardware?
Read our reviews. If a developer asks, “Is this the time where I should be
developing for VR?” The answer is also absolutely yes. We’re building the userbase you want. We have the ecosystem you need. We have an incredible form factor. We’re gonna keep pushing
the technical limits. And together, the developers
in this room and Facebook are building the content of the future. We’re doing this together. Now, this is usually the time in Connect
where I would pivot and I would start talking about
the incredible games and experiences that the developer community
is gonna bring to our platform this year. I love doing that. But as many of you know, Oculus has a new Head of Content,
and I have a new role. So this year,
I have a very short time on stage and he gets to come out here
and have all that fun with you. Please welcome Mike Verdu. Thanks, Jason. So, yes. VR is now. And, “The future has arrived, “but it is just not
very evenly distributed.” One of my favorite quotes
from William Gibson has never been so true
for virtual reality. How many of you remember that? It’s a Rift DK 1. The future arrived for many of you
when you first put that on. And VR, it’s been coming to life
around us ever since. And for a while, back then… It seemed like the future would
break over the world like a wave and it would carry us with it
into the promised land. But that’s not how it happened. It’s taken years. The future came to people,
one at a time. Fast forward to 2019. There are millions of people
entering VR regularly, and we’ve seen the first VR-only
platinum hit selling more than a million units.
I mean that’s… that’s awesome. VR titles are now racking up sales numbers
that are respectable in any other medium. At last, the future has arrived
at enough scale that we can actually make a living there. And that’s why we can finally say,
VR is now. This is not happening by magic. The future is arriving
because of your creativity, your passion and your hard work. I’ve been around the gaming industry
long enough to see the birth of multiple platforms,
and what made these platforms succeed? Developers and creators like you. So, look around at your fellow pioneers
who have carried the faith. You’ve been waiting for the world
to catch up with you, and it finally has. You are not early anymore. You’re in the right place
at the right time. And from all of us here, your partners, we wanted to say thank you. Thank you for sticking with it. Now, let’s talk about Oculus Quest. ‘Cause Quest is bringing the future
to a whole lot more people. As you’ve heard and as you’ve seen, no PC, no tethers, no external sensors and the time to fun measured in seconds. Quest is off to an incredible start. All thanks to the experiences
created by all of you. Your hard work is paying off. It’s paying off in the metrics that matter to your business. For example, the average attach rate for purchased apps on Quest is higher than any of our
other platforms. In fact, in the first month of Quest, more than 80% of Quest buyers
purchased content. It’s paying off for developers like the Superhot team, who sold more than 300% more copies for Quest
than their Rift launch. It’s paying off for Beat Games, who made that first platinum hit. Their Quest launch has been phenomenal. Their success is letting them land
deals with the likes of Imagine Dragons. And they are on track
to deliver more original music from independent artists and major labels alike. Please join me in celebrating
their success by giving a warm welcome to Vladimir Hrincar and Jan Ilavsky from Beat Games. Thank you, Mike. And thank you, Beat Saber community,
who made this all possible. We are so grateful. We initially made Beat Saber
with just three guys. To our surprise, it quickly became a viral hit played by more than a million players all around the world. It shows that a successful VR game can be created even without a big budget or a big team. Virtual reality has so much unexplored potential. It can be a simple and engaging concept that is inclusive and enables players to feel like superheroes. This May we were proud to be one of the launch titles for Quest, and a month later we showcased Beat Saber with 360-degree levels. And we actually think that this new mode might be the best way to play the game. And it will be available
for everyone this December. I know that it’s a bit far from now. But we actually have
something great you will enjoy much sooner. Yeah, that’s right. We are announcing
the new music pack. It’s Panic! At The Disco. And it will be out next week. We can’t wait for you to try it. -Thank you.
-Thank you. Vlad and Jan have brought so
much delight to so many players. As have our other developers, who are finding success in a variety of ways. They create, they refine. They build a loyal fan base and they deliver value with every update. Just look at Bit Planet Games. They’ve built a franchise in VR. They shipped Ultrawings across
a variety of VR devices over the years, added content through updates, and they’ve built a community. Vertical Robot. They cut their teeth on the
Unreal Engine for Oculus Go. That experience paid off in Red Matter. It’s one of the best
looking games on Quest. In just one week… In just one week on Quest
they managed to double their lifetime revenue from Rift. And then there’s Stress Level Zero. They’ve been shipping VR games
since the early days, each title bolder than the last. Their upcoming game, Boneworks, is an impressive display of physics and the use of touch. And we are pleased to announce that a new title
in the Boneworks universe is coming to Quest next year. We think gamers on Quest have a ton
to look forward to this year and next. Thanks to the ingenuity
of the people in this room. Here’s just a peek at some
recent and upcoming titles. There are more than 100 games
and other products slated for Quest by the end of the year. And even more coming in 2020. Speaking of things to look forward to, one of the definitive experiences on Quest has been ILMxLAB’s Vader Immortal. You’ve all had that moment
in Vader Immortal when you come face-to-face with
the Sith Lord himself for the first time and you look up, and you feel like he’s
actually there with you. Presence isn’t just about the
feeling that you are in another place. Lord Vader reminds us, it’s about feeling the presence of others.
Even virtual characters. We’ve been delighted with the
reception for Vader Immortal. These are press and reviewer quotes. But our users love the experience as well. Our most upvoted user review says, “I still can’t wipe this stupid grin off
my face, I can’t wait for episode 2.” The team at ILMxLAB has been
hard at work on the next chapter. To tell you more, I’d like to
introduce Jose Perez III and Alyssa Finley. We are excited to be here on
behalf of ILMxLAB, Lucasfilm’s immersive
entertainment studio, to talk about Vader Immortal Episode II. Yeah. The latest episode of this
groundbreaking VR series picks up right where the previous one left off, with Darth Vader teaching you the Force. One of virtual reality’s greatest strengths is the power of presence. The ability to be transported
into different worlds and to connect first-hand with characters
you’ve previously only seen on the screen. To make the transition from
storytelling to story living. At its core, Vader Immortal explores
one central question. What would it be like for you
to be the hero of a Star Wars adventure? And for you to find out first-hand Darth Vader’s mysterious plans. Episode one transported you to Mustafar, and it told a story about an unexplored
period in Darth Vader’s life. As the story unfolded,
you discovered your own lineage, how that tied into Vader’s plans. Plus, you trained with the lightsaber
in both the episode and in the action-packed, Lightsaber Dojo. In episode two, you’ll get to explore
beneath Vader’s castle. You’ll get to use the “Force.” And there will be some incredible
plot twists along the way. For a project this ambitious, we knew it would be important
to collaborate with the right creative partners. The series brings together Maya Rudolph, who voices ZO-E3, and award-winning writer David S. Goyer, in addition to our incredible
team of digital artists, producers, engineers, and designers. In developing Vader Immortal Episode II, we wanted to build upon the foundations
in our first installments from both a story
and interactive perspective. In this episode, you’re going to
discover more about your connection to Mustafar and the mysterious
Black Bishop as you traverse
the ancient Corvaxian Fortress. New creatures, like the Darkghast, lie in wait below the planet’s surface. She’s gigantic, she’s mad,
and she’s hungry. You’ll gain a powerful weapon,
the protosaber. This was designed
by our teams at Lucasfilm. It is an ancient relic that plays
a central role in the story. In addition to the narrative experience, we also created Lightsaber Dojo II. This new dojo allows you to hone
your skills with a lightsaber. And unleash the power of the
Force on your opponents. We are super excited about the
new mechanics in the latest dojo. Using only a single button
you will use the Force to stun, pick up, and fling your opponents
across the arena. You can even attack enemies from
a distance by straight-up throwing your lightsaber
at them. Which is super awesome. There is… There’s genuinely nothing
like using the Force in virtual reality. When we were here
last year, we showed you the first trailer for episode one. Today we are delighted to debut the launch trailer for Vader Immortal
Episode II. Let’s check it out. The Force is more powerful
than you can possibly imagine. Follow me and I will instruct you
in its ways. You will need the Force
if you are to survive the path ahead. Your anger makes you stronger. Incredible, Captain.
How are you doing this? You can unlock the secrets of life. And death. Defend yourself. Impressive. Catch me up, I don’t understand
why you never mentioned that you had the “Force.” Don’t move. That’s right. Vader Immortal
Episode II is available right now on the Oculus Store for Quest and Rift. So, Quest has had a terrific few months, but PC is still the place
with the highest fidelity VR games. The launch of Rift S earlier this year
continues that legacy. Rift S combines convenience
with the raw power of PC. We launched Rift S to bring
more people into VR and we are excited to see
that it’s actually working. More than 60% of people activating Rift S are completely new to Oculus. The state-of-the-art for high-fidelity
gaming continues to evolve. With the unrivaled power
of PC and the Rift platform, gamers can look forward to
two of the biggest VR games yet. Asgard’s Wrath and Stormland. These are incredible titles that show that VR is now home to gorgeous
full-length games that have all the richness and depth that you would expect from blockbusters
on any other platform. For RPG fans, wanting a VR fantasy game
to sink their teeth into, Asgard’s Wrath offers
more than 40 hours of gameplay and even more if you’re a completionist. And Stormland… It offers near endless replayability, thanks to its co-op gameplay mechanics
and its ever-changing world. We’re pleased to announce that Asgard’s Wrath and Stormland
are launching this fall and they are available for pre-order
right now. And everyone here, you’re getting these games for free! And if you can’t wait until launch
to play them, they’re on the show floor. We wanted to end on something special. Yeah. Sometimes an opportunity to work
on a remarkable project comes along and two years ago,
we announced a partnership between Oculus Studios and Respawn,
creators of the hit games Titanfall, Apex Legends and the upcoming
Star Wars Jedi: Fallen Order. The team at Respawn has been hard at work, leaning in to the creative potential of VR
to make an amazing game. One that simultaneously
leaps into the future, but also takes Respawn back to its roots. Today, they are ready
to reveal what it is. Not only does it push the action genre
and VR forward, but it is a humbling example
of what VR can do for games, for players and the way we look back
on history itself. Take a look. With Medal of Honor: Above and Beyond, I think we’re finally able to… To fully realize that vision from
20 years ago of putting you in the boots, allowing you to see through the eyes of someone who was actually there, and that’s been the most exciting
and fulfilling part of the project. There’s no moment in this game where,
you’re just relaxed. You’re in a warzone. You are a member of the OSS,
the Office of Strategic Services. That allows us to tell a story
where the player gets to go to all these incredible places. VR had such an other level to it
just because people want to naturally interact with
things, how they do in the real world. Whether you are inside a Sherman tank
or at an airfield, or sneaking into a Nazi sub-base, the level of immersion is unprecedented. Through the power of VR,
you’re climbing a wall, you’re pulling out your gun, you’re catching a grenade in mid-air
and throwing it back. Those are all things
that you end up doing yourself. It’s exciting and it’s terrifying
and it’s fun and it’s cool, kind of, helping create
these experiences for people. We are making a game
about the reality of World War II. So for us, you know,
the authenticity was everything. When you put that headset on, you feel like you’re back
in World War II-era Europe. It’s a very surreal experience
to be there without being there. You’re not watching history
on a flat-screen. You’re experiencing history
with your own eyes. We have opportunity
to sit down and interview combat veterans from World War II. It’s very emotional
and it’s very inspiring. And that’s what draws us to dive deep
for our characters. And putting you as the player
on an adventure, that spans some of the biggest moments
of the war. Medal of Honor franchise
immediately gives us something more intimate, more cinematic. Games are at their best
when they’re immersive and we’re finding
all sorts of little things we can’t do in traditional games. Having the backing of Respawn and Oculus gives us the ability to push those boundaries in ways they haven’t been pushed before. When you see someone play it, when they just lose themselves
into the game, it’s been pretty impressive. If the game we’re making
ignites people’s imaginations, and illuminates the world, then we’ve accomplished
what we set out to do. Okay, this is gonna be fun. Today’s VR is super-cool.
No question about it. But now, I’d like to talk
about the future. Especially, since this has been
a magical week for me in that respect, thanks to two wonderful additions
to my world. The first is the CTRL-labs team, who’ll be joining Facebook Reality Labs to help build the interface of the future. Their EMG technology
has exciting potential for delivering natural and intuitive
interaction for VR and AR, and I’m delighted to welcome them to FRL. The other is the birth
of my first grandson, which has made the future
much more tangible to me. Thank you. It was actually
way more moving than I expected. Sitting there with him in my lap at that
first night with everyone else asleep, I found myself looking
at his beautiful tiny face and wondering, what the amazing, unimaginable future
he’d live in would be like. And of course, the one thing
I was sure of, was that VR and AR would be
a transformative part of whatever was to come. That transformative VR future
is what I’m gonna talk about today. But the past and the present
are the keys to the future. So let’s start by travelling back
to the early 1980s, when I was a grad student
in Energy Management and Policy at Penn. As you can see, I hadn’t yet discovered
blue button-down shirts, or for that matter, personal grooming. What I had discovered
was the personal computer, in the form of a Vector Graphic VIP. The Vector sported a four-megahertz processor and a full 56K of memory, which was hot stuff in those days, and I instantly fell in love
with the freedom of having an entire computer to myself. Back then, the world we live in
was just being born. The IBM PC didn’t yet exist. VisiCalc had shipped for the Apple II the year before, so the first killer app was starting to make progress with early business adopters, but there were only about two million
personal computers in the world and most of them were owned by hobbyists,
gamers and enthusiasts. In short, back then, personal computers were more of a novelty
than a real thing. But the more I used the Vector,
the more I began to believe. So when the IBM PC came out, I walked away from my PhD program
to write video games like this. I appreciate that because you have no idea
how hard it is to make that happen with a few thousand
instructions per frame. Writing those games was about the most fun
I ever had. Although my new direction
caused considerable concern among family members who thought
I was throwing away my career. That was an entirely reasonable opinion. Because in 1982, personal computers hadn’t really
started to change the world yet, but they would. And because I had the faith
to be there at the beginning, before everything went big, I’ve had the opportunity
to contribute in many ways, from games to operating systems
to graphics and more. Never for a moment have I or anyone else
I know from those days regretted the decision to dive in
while it was still early days and help make the future happen. Unless you lived through it though, it’s hard to comprehend
just how much things changed as the personal computer
revolution proceeded. Here’s what human-oriented computing
looked like, when I wrote Cosmic Crusader. Then, here’s what it looked like while John and I were working on Quake
15 years later. And finally, here’s what human-oriented
computing looks like on my work phone now. That is how much
a truly revolutionary platform can change things over the years. Okay. Back to the present. Let’s map today’s VR on to that same
long-term trendline. VR certainly has as much long-term
potential as the personal computer. In fact, I believe
it will ultimately become the most powerful creative
and collaborative environment that has ever existed, as I’ll discuss later. But realizing the full potential of VR
will take decades, just as it did with the personal computer. Right now, I’d say VR
is clearly farther along than the personal computer was,
when I wrote Cosmic Crusader, with Quest and Rift S, a broad and varied app portfolio
that includes a million seller and rapidly emerging
enterprise applications, all growing nicely. VR is in a good place right now and it’s easy for us true believers to see where the trend line is headed
in the long run. But realistically, we are still
pretty close to the beginning of what’s going to be one of the great
technological revolutions of all time, which is actually awesome. It means that right now,
we are in the most exciting place. Most of the good stuff is yet to come and it’s our community that’s going to make that happen. But it also means that VR is advancing
on two different timescales. In the near term,
the next five years or so, VR needs to grow as rapidly as possible. And that’s happening in a big way between us getting great headsets
out there and a strong ecosystem built and all of you creating the applications
that will take VR… As Jason Rubin said,
“VR is happening now.” At the same time,
there’s another future after this one. The next big step
on that long-term trend line is a quantum leap
to a whole different level that will be built
on the work all of us are doing today. When that future arrives, VR will explode the way the personal
computer did nearly 40 years ago. VR hasn’t changed
the world yet, but it will. Of course, I’m preaching to the choir. You’re working on VR precisely because
you believe it will change the world. The interesting question is, “When?” Well, I have some good news
and some bad news. I’ll start with the bad news. I’ve made specific predictions about
the timing of that quantum leap twice before at Oculus Connect,
and both times, I was too optimistic. This year, rather than making
yet another prediction, I’m going to invoke Hofstadter’s law: it always takes longer than you expect, even when you
take into account Hofstadter’s law. The honest truth is, I don’t know when
you’ll be able to buy the magical headset
I described last year. VR will continue to evolve nicely, but my full vision for next generation VR
is going to take longer. How much longer? I don’t know.
Let’s just say, not anytime soon. Turning breakthrough technology
into products is just hard. So, that’s the bad news. But then, there’s the good news. That quantum leap
into the future is coming, and we at FRL are making it happen
as fast as we can. What exactly will that future look like? If we’re talking about the long term
future, that’s easy. VR/AR, is, in my opinion, going to be the most significant
technology of the next 50 years just as personal and mobile computing
have come to dominate our lives. In the 46 years since the first personal computer, the Alto, was built at the Xerox Palo Alto Research Center, Xerox PARC, a revolution has unfolded that ultimately led to every one of us either interacting with, or being seconds away from, the virtual world
almost every minute. That virtual world has touched nearly
every corner of our lives, but there’s one great limitation. We interact with it almost exclusively
through two dimensional interfaces along with very limited audio. If you think of a human as a CPU
with memory, input and output (admittedly not the most romantic framing, but accurate), then it becomes clear that the data received on the inputs, our sensors, and the actions induced by the outputs, our motor controls, must define the full range of experiences
that we can have in the world. Our lives are enhanced if we can bring
more useful information to our sensors and perform actions
that have more useful effects resulting in better,
more satisfying experiences, which is precisely why virtually everyone
has a smartphone right now. But relative to what you’re capable of, that smartphone is a very low
bandwidth channel. In contrast, VR and AR
have the potential to give us all the bandwidth we can handle
in the ways we’re built to use it and that will let us do more of what makes
us human, especially socially. That’s the fundamental reason I believe
VR/AR will be the dominant technology of our lifetime across
the full sweep of our digital lives. Games, of course, but also
so much more, as I’ll touch on later. So to me, it’s easy to predict
that awesome long-term future. Predicting the next generation
of VR is harder. I’m happy to share my thoughts with you, but there are all sorts of opinions about
what the future of VR will be, and opinions are a dime a dozen. The most useful way to learn something
meaningful about what’s coming, is to live by Alan Kay’s great quote, “The best way to predict
the future is to invent it.” That’s what we’re doing at FRL, and I view these talks at Oculus Connect as our early postcards
from the future of VR. In today’s postcard,
we’ll share some recent progress we’ve made on inventing the future
in optics, machine perception and avatars. What I’m about to share is
actual working prototype technology, not mocked up demos or concept art. Let’s start by following up on varifocal
and the Half Dome prototype. Last seen at F8, more than a year ago. The headset we shared at F8,
was the result of several years of research and prototyping
of advanced display systems. Half Dome was our first prototype
to achieve two milestones. First, using Fresnel lenses,
it supports a 140 degree field of view. Second, by physically moving
the screens based on eye tracking, it changed the focal depth and kept the
image sharp when inspecting close objects. Today, I’m pleased to be able to share
a new varifocal concept prototype, Half Dome 2, built by our
Display Systems Research team working closely with several
other teams across FRL. Unlike the original Half Dome,
Half Dome 2 is targeted primarily at ergonomics and comfort,
both visual and physical. The new prototype is substantially smaller and lighter than Half Dome, largely because our optics team has
managed to fold the optical path into a very small volume. Overall, we’ve been able to
improve form factor substantially and reduce weight
by a full 200 grams over Half Dome. The tradeoff
for that increased comfort is that the field of view
is narrower than Half Dome, although still 20% wider than Quest. The varifocal hardware has also
been considerably improved, so let’s take a quick look at that. Varifocal now relies on voice coil actuators and flexure hinge arrays, eliminating any points of sliding or rolling contact between the moving screen and the rest of the assembly. This improves on the original Half Dome by reducing noise and vibration to imperceptible levels.
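To make the control problem concrete, here is a minimal sketch, using illustrative numbers rather than Half Dome’s real parameters or code, of what any eye-tracked varifocal display has to compute: estimate the fixation depth from the angle between the two gaze rays, then use the thin-lens relation to decide where the screen should sit so the virtual image lands at that depth.

```python
import numpy as np

# Illustrative constants only; real values depend on the headset's optics.
LENS_POWER_DIOPTERS = 20.0   # e.g. a 50 mm focal-length lens
IPD_METERS = 0.064           # interpupillary distance

def vergence_depth(gaze_left, gaze_right):
    """Rough fixation depth from the angle between the two eyes' gaze rays."""
    l = gaze_left / np.linalg.norm(gaze_left)
    r = gaze_right / np.linalg.norm(gaze_right)
    angle = np.arccos(np.clip(np.dot(l, r), -1.0, 1.0))
    angle = max(angle, 1e-3)                      # near-parallel rays mean "far away"
    return IPD_METERS / (2.0 * np.tan(angle / 2.0))

def screen_distance_for_focus(depth_m):
    """Thin-lens estimate of how far the screen should sit from the lens so the
    virtual image appears at the fixation depth (all distances in metres)."""
    return 1.0 / (LENS_POWER_DIOPTERS + 1.0 / depth_m)

# Eyes converging on a point roughly half a metre straight ahead.
left_eye_to_target = np.array([+IPD_METERS / 2, 0.0, 0.5])
right_eye_to_target = np.array([-IPD_METERS / 2, 0.0, 0.5])
depth = vergence_depth(left_eye_to_target, right_eye_to_target)
print(f"fixation ~{depth:.2f} m -> screen at {screen_distance_for_focus(depth) * 1000:.1f} mm")
```

In a mechanical system like Half Dome 2, that last number is what the actuators are asked to track frame after frame, which is exactly why noise and vibration matter so much.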
All in all, Half Dome 2 continues the trajectory of Half Dome toward more immersive and comfortable VR displays. But there’s more. As I said earlier, we’re inventing
as fast as we can, so I’m delighted to be able to share
our first electronic varifocal system, Half Dome 3. We’ve replaced all moving parts
in Half Dome 2 with a thin stack
of liquid crystal lenses. Let’s take a look at a prototype module to understand
how electronic varifocal works. The next few images were recorded through the electronic varifocal module
you see here. This is a real camera shot. Each liquid crystal lens
can be turned on and off to alternate between two focal states. Here, we indicate that the lens
is on by highlighting it in orange. When the lens is turned off,
the focus shifts to the far object. And then, when the lens turns back on, it shifts to the near object again. As you can see,
a single liquid crystal lens makes a great pair of digital bifocals,
shifting focus between two depths. To achieve smooth varifocal, we address the full stack of liquid crystal lenses, with each additional lens doubling the number of focal points. In this example, six liquid crystal lenses are driven to sweep through 64 focal points, and you can see the focal depth smoothly changing on the right as we cycle through different sets of lens states.
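One plausible way to picture that doubling, offered here only as a sketch and not as the actual Half Dome 3 drive scheme, is a binary-weighted ladder: if each lens, when switched on, contributes twice the optical power of the one before it, then six on/off lenses address 2^6 = 64 distinct focal states. The step size, base power and function names below are assumptions for illustration.

```python
# Hypothetical binary-weighted focal ladder: N two-state lenses give 2**N focal levels.
BASE_POWER_DIOPTERS = 0.0    # assumed power with every lens off
STEP_DIOPTERS = 0.05         # assumed power of the least-significant lens
NUM_LENSES = 6               # 2**6 = 64 focal states

def lens_states_for_level(level):
    """On/off pattern (least-significant lens first) for a focal level in 0..63."""
    return [(level >> i) & 1 for i in range(NUM_LENSES)]

def added_power(states):
    """Total added power if lens i contributes STEP_DIOPTERS * 2**i when on."""
    return BASE_POWER_DIOPTERS + sum(s * STEP_DIOPTERS * (2 ** i) for i, s in enumerate(states))

for level in (0, 1, 31, 63):
    states = lens_states_for_level(level)
    print(level, states, f"{added_power(states):.2f} D")
```

However the real lenses are weighted, the appeal is the same: a handful of two-state elements is enough to sweep focus almost continuously, with nothing that has to physically move.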
In addition to having no moving parts, this approach allows a significantly better form factor compared to its predecessors. Here, we compare our new electronic module
to the original Half Dome assembly and see that there’s a considerable
reduction in size. When we integrate the electronic module
into a complete prototype headset, it defines a new state of the art
for new design ergonomics. This is still very much researched today, but here’s a view through an early
Half Dome 3 prototype. As you can see, without varifocal,
the cassette gets blurry up close. But the electronic approach
is able to replicate the smooth varifocal experience
of mechanical systems at all depths, a promising sign for the future. If you’d like to know more about our varifocal work, we’ll be putting up a deep-dive
blog post today. As important as visual quality is, a great display is just one key component
of a compelling VR system. Computer vision, a part of what
we call machine perception is another essential element
of the VR experience. For starters, the ability to localize
the headset in the real world with great reliability and accuracy,
so that virtual objects are rock solid, is what makes a virtual world seem real.
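As a rough illustration of why that localization matters, here is a minimal sketch, with made-up pose values rather than the actual tracking stack: a virtual object is stored once as an anchor in world coordinates, and every frame it is re-expressed in the headset’s freshly tracked pose before rendering, so it stays locked to the room however you move.

```python
# Minimal anchoring sketch; poses are plain 4x4 matrices and all values are illustrative.
import numpy as np

def make_pose(rotation=np.eye(3), translation=(0.0, 0.0, 0.0)):
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

# A virtual screen anchored 2 m in front of the room origin, fixed in world space.
world_from_anchor = make_pose(translation=(0.0, 1.5, -2.0))

def anchor_in_view(world_from_head):
    """Anchor pose expressed in the headset frame for this frame's render."""
    head_from_world = np.linalg.inv(world_from_head)
    return head_from_world @ world_from_anchor

# As the tracked headset pose changes, the anchor's world pose never does;
# only its view-space pose is recomputed, so it appears locked in place.
for t in (0.0, 0.1, 0.2):
    world_from_head = make_pose(translation=(0.3 * t, 1.6, 0.0))   # user stepping sideways
    print(anchor_in_view(world_from_head)[:3, 3])
```

Any drift or jitter in the tracked pose shows up directly as drift or jitter in the virtual object, which is why reliability and accuracy are what make the illusion hold.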
That’s not all though. The ability to detect and reconstruct the real world, and import parts of it as desired, enables mixed reality. That is, mixing and matching of the real and the virtual so that you could, for example, bring a real keyboard, mouse and desk
into your virtual workspace and use them as easily
as you would in the real world. Reconstruction also enables
social teleportation which lets you share surroundings
with another person or jump into someone else’s part
of the world as we saw in Boz’s talk. And of course,
it’s the foundation of live maps. Let’s look at some of the recent progress
from our Surreal team on reconstruction. I want to emphasize that what you’re seeing is the real thing, with no smoke and mirrors. This is a flythrough
of reconstructions of real spaces generated by state of the art
reconstruction technology. I think this is absolutely amazing. If you can imagine stepping into
one of those and just being in it, this phenomenal
level of reconstruction will enable extraordinarily high quality
mixed reality experiences. Once you have a reconstruction,
you can do whatever you want with it. Here are some pretty cool lighting effects
in the reconstruction of Boz’s dad’s den. Rock solid mixed reality is the key part
of the future of VR, but there’s something missing
and that’s people. We need truly convincing avatars
to make VR ubiquitous because as Yaser Sheikh from
our Pittsburgh lab puts it, “Proximity determines
social relationships,” and social relationships
make the world go round. Being together in person will always be
the most human way to interact, but the technology we’re developing has
the potential to be the next best thing. Let’s take a look at where
we are with some of that. I’ll start by picking up
where we left off last year, with the codec avatar we showed next
to actual video of Steven speaking. Good morning to you, my boy. It’s healthier to cook without sugar. “Thank you,” she said,
dusting herself off.” That’s amazingly realistic, but the level of expressiveness
is pretty low, and expressiveness is perhaps the most
critical part of social interaction. One of the important advances over the last year has been capturing more of that, as you can see in this next video. Here, Kevin’s facial animation is being driven in real time, entirely by cameras mounted in the headset.
Of course, facial expressions only matter if you’re interacting with someone. As you watch the next video, you can see Yaser and Danny wearing headsets in the real-world video
in the corners. The faces in the center are avatars and they are animated in real time
entirely from cameras in the headset. What can your face do?
Can you show us? I’ve always hoped
you would have asked me that question. So, I have
some pretty good mouth movement. How about your eyes?
Can you look left, right, up down? Good, good. Yeah. I can do surprise. I think one of my favorites is
puffing my cheeks. The mouthwash commercial. And rolling my tongue. -That’s pretty good actually.
-Yeah. Pretty hard to believe
that those aren’t videos, huh? Although, to be fair, right now
it doesn’t always work that well. Anyway, it certainly doesn’t
take a huge leap to imagine how doing that in VR could be far more
satisfying and personal than seeing someone’s
image on a flatscreen. Social teleportation will need
more than just heads though. So I’d like to share an initial step
towards Codec Avatar bodies with you. A Codec Avatar is
a digital representation of a person. And we build these to look like us,
move like us, sound like us. The purpose of it is to be able
to connect people across distances. Again, this is just a first step. What you saw was generated offline,
not in real time. Although it will become real time
in the not-too-distant future, but it’s promising. All this is still research, with a lot left to do, but Codec Avatars
have certainly come a long way. I think the three research areas
we just looked at are pretty cool, but where this really gets exciting
is when the pieces start coming together. If you put Codec Avatars
together with reconstruction, you start to get true social teleportation where anyone can be with anyone else
wherever they want working with a mix of real and virtual. Let’s take a very early peek
at what that would be like. To set the stage… This is the stage in which
Codec Avatar data is captured initially. This is where we bring people in and capture information, and now we’ve done a reconstruction
of that space so that now,
we can put a Codec Avatar in it. …we wish to spend time with and not be restricted only to
those people who live close by. That’s the promise of it, and this is research work we’ve done
right here in FRL, Pittsburgh. I hope you enjoyed the demo that you’ve just seen. And we all look forward
to what the future brings. Thank you. Again, there’s a long way to go, but this is a genuine glimpse
of the future. It’s going to take all
the innovative technology we just saw and a lot more to make the
leap to the next generation of VR. But there’s more to it than that. There’s an important lesson
that we can learn from Xerox PARC here. The PARC researchers invented a wide range
of revolutionary technologies, from the laser printer to the bitmapped windowing interface to WYSIWYG word processing to object-oriented programming. If that was all they had done,
they would’ve been wildly successful. But they also integrated all of that
into the Alto and that is what changed the world. Similarly, new VR technologies
will need to be woven together into a complete
tightly-integrated platform in order to make that quantum leap. It’s the sum of the parts that will
deliver that breakthrough experience, not technologies in isolation, as we start to see
when we put the Codec Avatar in a reconstructed environment. Which brings us to the question
of exactly what that integrated
next-generation platform will be. Rather than starting with technology, I will approach that question
from the perspective of what overall user experience
we’d want the platform to deliver. And here I’m going to return to a theme
I’ve talked about before, my desire for
a virtual collaborative work space. For five years now, I’ve been
wishing for a VR work space that I could configure any way I wanted, with monitor-quality virtual screens, holograms, whiteboards and whatever, saving and switching between
configurations with a click. Throw in the ability to interact
with my real surroundings, and use a keyboard and mouse, and that
would be a great work environment. Then if I could share virtual spaces
with other people, it would become an amazing productive
collaborative environment as well. And while I’m at it,
it’d be great to have the ability to manipulate both real and virtual
objects with my hands complete with haptic feedback. I would use that in a heartbeat and I believe
that it would spread like wildfire the way personal computers did
back in the day. Even better, the hardware-software stacks
and the overall platform needed to enable the full range of uses
for that virtual collaborative workspace are so broad in general that they’d enable a vast space
of applications and a flood of creativity across Gaming, Entertainment,
Communication, Education and Productivity. Again, just like personal computers did. I am highly confident that
a great virtual collaborative workspace would open the door to
the entire next generation of VR which in turn would unlock
human potential on a massive scale. What would it take to make that
collaborative work space a reality? Without question, we’d need enough
resolution and good enough image quality so that virtual monitors
were at parity with real monitors. That would require very high-res displays
and much improved optics, I personally think visual acuity
needs to be 20/20 or better, but that’s just the start. We’d also need the ability to render
at that high resolution, and to either do that with a mobile GPU or transmit the data over a wireless link. That very likely means we would need foveated rendering, which may mean we need a new graphics pipeline, and would certainly mean we need great eye tracking.
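As a rough sketch of what foveated rendering buys, here is a two-level version under assumed numbers that are not the Oculus pipeline: the periphery is rendered at reduced resolution, a small full-resolution inset is rendered around the tracked gaze point, and the two are composited, so only a fraction of the pixels are ever shaded at full cost.

```python
# Two-level foveation sketch; resolutions, inset size and names are illustrative only.
import numpy as np

def render_scene(width, height):
    """Stand-in for the real renderer; its cost scales with the pixel count."""
    return np.random.rand(height, width, 3)

def foveated_frame(full_w=1440, full_h=1600, gaze=(720, 800),
                   fovea_size=400, periphery_scale=4):
    # Cheap pass: render at 1/periphery_scale resolution, then upscale to full size.
    low = render_scene(full_w // periphery_scale, full_h // periphery_scale)
    frame = low.repeat(periphery_scale, axis=0).repeat(periphery_scale, axis=1)
    frame = frame[:full_h, :full_w]

    # Expensive pass: full-resolution inset centred on the tracked gaze point.
    gx, gy = gaze
    x0 = int(np.clip(gx - fovea_size // 2, 0, full_w - fovea_size))
    y0 = int(np.clip(gy - fovea_size // 2, 0, full_h - fovea_size))
    frame[y0:y0 + fovea_size, x0:x0 + fovea_size] = render_scene(fovea_size, fovea_size)
    return frame

print(foveated_frame().shape)   # full-size image, but most pixels came from the cheap pass
```

The catch is visible in the parameters: the scheme only works if the gaze estimate is fast and accurate enough to keep the inset where the eye is actually looking, which is why great eye tracking comes along for the ride.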
Next, we would need excellent real-time mixed reality so we can be aware
of the world around us and move about and interact with our desk, chair, keyboard, mouse
and other surroundings. We would also want to have persistent
shareable virtual objects in the world so that we could, for example,
set up a customized team work room or work on tech art
or some code with a team mate. To do that, we would need a localized
version of the live maps technology that Boz talked about. That is, a private live map
of our local, physical surroundings. We would also need
to be able to see our hands and body in order to truly be present
in the virtual world and we’d wanna do more
than just look at our hands, we’d want to use them as the intuitive,
highly dexterous manipulators that they are in the real world. So, we’d want both haptic gloves
and hand tracking so accurate that we can interact with both the real and virtual worlds
flawlessly in mixed reality. If we’re gonna be doing work
with our hands, we’d want clear, comfortable vision within
arm’s length for hours of use per day. Exactly what varifocal
is designed to deliver. We’d also want proper spatialization
and propagation of virtual sounds so virtual objects and spaces
would sound as real as they look. For collaborative work we’d obviously want the compelling avatars discussed earlier
and we need accurate real-time face, hand, eye and body motion
as well as highly realistic appearance. Less obviously, we’d need a wider
field of view so that everybody in a meeting
could see everybody else. That’s essential for social interaction,
as are voices that sound like they’re coming from
the right people in the right places. We’d want to be able to share
our real environments with each other both for social purposes and because physical objects
will often be important to the discussion. And we’d want to wrap all this up
with great ergonomics to make it comfortable to be in VR
for hours at a time. That would require making
everything I’ve discussed compact and power efficient. And then, we’d finally have
the complete platform that my dream workspace could be built on. Which I think we can all agree
would be pretty awesome. But how do we get from here to there? Well, FRL’s been pushing the envelope for the last five years
on everything I just talked about. We’ve been focused on
bootstrapping individual areas, like the hand tracking Mark talked about
which originated as FRL research, because that’s what it took
to get to next generation technology. Now that we’ve built
many of the pieces though, it’s time to start putting
the full platform together. To paraphrase Alan Kay, “The best way to predict the
next generation of VR, is to build it.” And FRL’s mission
is to build time machines that let us peer as far as possible
into the future. So, just as Xerox PARC built the Alto and showed the way
to the future of the personal computer, FRL is going to build a true
next-generation concept prototype with the objective of showing the way
to the long-term future of VR. This prototype
is going to be aimed squarely at collaborative virtual workspaces with the objective of enabling
fully remote work that has all the benefits
of working in an office, in addition to the pluses of both
virtual and remote work. One reason we’re making remote work
the north star, is because it’s a great way to connect
people in VR and connecting people
is what Facebook is all about. Another reason is that making remote work
really effective would have a hugely positive effect
on how we live. Imagine what it would mean if you could work remotely
as effectively as in person. Or, someday, maybe even more effectively, because you could have collaboration tools
in the virtual world that could never exist in the real world. People wouldn’t need to live
near where they work, they could live near family
or where housing is affordable or just wherever appeals to them. Commutes would vanish and there would be
dramatically fewer business trips, with huge energy and environmental benefits. Companies would be able to tap
into talent around the world and the ability of talented people
to find meaningful high-paying work would no longer be determined
by where they happened to have been born. That is truly a future
that defies distance and that by itself
would be more than enough motivation. But there’s yet another reason
to build this prototype and that’s that a collaborative virtual
workspace is something we in FRL would actually use. And when you’re inventing something new, it’s critical to know the customer well and have a rapid feedback cycle with them, and what better way to do that
than building something for yourself. So, the key metric for our prototype
will be whether we choose to use it for hours a day to do real work. If we find it that useful, so too will millions
of other information workers. Most likely, including you. The long-term potential of next-generation
VR is tremendously exciting, but I wanna be really clear that this is still high-risk research. At best, it will take years to get to a prototype
that proves out the concept. And Hofstadter was right. “It always takes longer than you think” and that’s especially true for turning
research into something useable. So, there’s exciting potential for
that quantum leap down the road, but for the foreseeable future,
VR will be all about the Quest generation. Again, VR is happening now. Anyway, however long it takes to build
that compelling next generation prototype, we will keep at it until we get there. Now again, this will be a prototype,
not a product. You won’t be able to buy it. It won’t be sleek or affordable
or highly manufacturable or as durable
as a consumer product needs to be. It will surely come up short
of my wish list in some respects. That’s okay. It just needs
to be good enough to show the way
to the next generation of VR. The Alto was limited, slow and expensive
and it never became a product, but it showed that the personal computer
could empower people to do new and highly
compelling things and that was enough
to point the way to the Mac and Windows and everything that followed. So the virtual collaborative
work space is just a starting point, a catalyst. Like the personal computer, the next generation of VR will be
a broad general platform that enables a vast range of uses. Some of these will be improved
versions of familiar things, but many of them
haven’t even been imagined yet. Whatever those applications are, I’m confident that
the next generation platform will make VR a part of daily life for tens
and hundreds of millions of people and will begin to change the way we work,
the way we connect, the way we live. So, VR is going to shape
the world my grandson will live in, but it’s going to be a decades-long journey to that promised land. And it will take all of us to get there. The next leg of that journey will be built on the work
all of you true believers are doing, exploring the possibilities of VR
and taking it into the mainstream while we work to move
the underlying platform forward. And whenever it is that that next generation platform
does finally see the light of day, you will be the ones poised to use it
to change the world and your faith in these early days
will be rewarded at a scale that’s hard to imagine today. Cosmic Crusader was a labor of love, it only sold a few thousand copies
so I certainly didn’t do it for the money, but it was the start of the path
that 15 years later led to co-writing a game
that tens of millions of people would play and that would be the template
for a whole genre that’s still going strong
more than 20 years later. You too will have that sort of experience
one day with VR. In the meantime, we’re all going
to have the adventure of a lifetime. It’ll take a little more time
than I thought to get to the next generation of VR, but that just means that these
are gonna be the good old days for longer. And these truly are the good old days. It just doesn’t get any better
than being there at the start and getting to create the future
and we are all unbelievably lucky to have the opportunity
to do exactly that. The future is waiting just as surely as it was waiting
for this kid 37 years ago. Let’s go make it happen.
Thank you.