Blending Realities with the ARCore Depth API
Posted by Shahram Izadi, Director of Research and Engineering
ARCore, our developer platform for
building augmented reality (AR) experiences, allows your device to display content
immersively in the context of the world around you, making that content instantly accessible
and useful.
Earlier this year, we introduced
Environmental
HDR, which brings real-world lighting to AR objects and scenes, enhancing immersion
with more realistic reflections, shadows, and lighting. Today, we're opening a
call for
collaborators to try another tool that helps improve immersion with the new Depth
API in ARCore, enabling experiences that are vastly more natural, interactive, and
helpful.
The
ARCore Depth API allows developers to use our
depth-from-motion algorithms to create a depth map using a single RGB camera. As you move
your phone, the algorithm captures multiple images from different angles and compares them
to estimate the distance to every pixel.
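ARCore's depth-from-motion algorithm itself isn't public, but the underlying parallax principle can be sketched: a point's distance is inversely proportional to how far it shifts between two views. A toy Java illustration of that relationship (the focal length, baseline, and disparity values below are made up for illustration, not taken from any real device):

```java
public class DepthFromParallax {
    // Estimate depth (metres) from the pixel shift (disparity) a feature
    // exhibits between two camera positions: Z = f * B / d, where
    // f = focal length in pixels, B = distance the camera moved (metres),
    // d = disparity in pixels. All values here are illustrative only.
    static double depthFromDisparity(double focalPx, double baselineM, double disparityPx) {
        if (disparityPx <= 0) {
            throw new IllegalArgumentException("disparity must be positive");
        }
        return focalPx * baselineM / disparityPx;
    }

    public static void main(String[] args) {
        double f = 500.0;   // assumed focal length in pixels
        double b = 0.125;   // camera moved 12.5 cm between frames
        // A nearby point shifts across more pixels than a distant one:
        System.out.println(depthFromDisparity(f, b, 25.0)); // 2.5 (metres away)
        System.out.println(depthFromDisparity(f, b, 5.0));  // 12.5 (metres away)
    }
}
```

Note how the larger disparity maps to the smaller depth: motion parallax is exactly why nearby objects appear to sweep past faster than distant ones as you move the phone.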
Example depth map, with red indicating areas that are close by, and
blue representing areas that are farther away.
One important application for depth is occlusion: the ability for digital objects to
accurately appear in front of or behind real-world objects. Occlusion helps digital objects
feel as if they are actually in your space by blending them with the scene. We will begin
making occlusion available in
Scene Viewer,
the developer tool that powers AR in Search, to an initial set of over 200 million
ARCore-enabled Android devices today.
A virtual cat with occlusion off and with occlusion on.
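Conceptually, occlusion is a per-pixel depth test: wherever the real scene is closer to the camera than the virtual object, the camera pixel should win. A minimal Java sketch of that test (this is not ARCore's renderer, which performs the comparison on the GPU; the depth values are in metres and purely illustrative):

```java
public class OcclusionTest {
    // For each pixel, draw the virtual object only where its surface is
    // closer to the camera than the real scene at that pixel; otherwise
    // the real camera pixel shows through (true = draw the virtual pixel).
    static boolean[] occlusionMask(float[] realDepth, float[] virtualDepth) {
        boolean[] drawVirtual = new boolean[realDepth.length];
        for (int i = 0; i < realDepth.length; i++) {
            drawVirtual[i] = virtualDepth[i] < realDepth[i];
        }
        return drawVirtual;
    }

    public static void main(String[] args) {
        // A virtual cat 1.5 m away, hidden behind a real table edge 1.2 m
        // away in the first two pixels, but visible in front of a 3 m wall
        // in the remaining pixels:
        float[] real = {1.2f, 1.2f, 3.0f, 3.0f};
        float[] virt = {1.5f, 1.5f, 1.5f, 1.5f};
        boolean[] mask = occlusionMask(real, virt);
        System.out.println(java.util.Arrays.toString(mask)); // [false, false, true, true]
    }
}
```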
We’ve also been working with Houzz, a company that focuses on home renovation and design, to
bring the Depth API to the “View in My Room 3D” experience in their app. “Using the ARCore
Depth API, people can see a more realistic preview of the products they’re about to buy,
visualizing our 3D models right next to the existing furniture in a room,” says Sally Huang,
Visual Technologies Lead at Houzz. “Doing this gives our users much more confidence in their
purchasing decisions.”
The Houzz app with occlusion is available today.
In addition to enabling occlusion, having a 3D understanding of the world on your device
unlocks a myriad of other possibilities. Our team has been exploring some of these, playing
with realistic physics, path planning, surface interaction, and more.
Physics, path planning, and surface interaction examples.
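Experiences like physics and path planning need 3D points rather than a flat depth image. Given the camera intrinsics, each depth pixel can be unprojected into a camera-space 3D point using the standard pinhole model; a hedged Java sketch (the intrinsic values below are illustrative, not from any real device):

```java
public class Unproject {
    // Pinhole-camera back-projection: pixel (u, v) with depth z maps to the
    // camera-space point (x, y, z), where x = (u - cx) * z / fx and
    // y = (v - cy) * z / fy. fx, fy are focal lengths in pixels and
    // (cx, cy) is the principal point.
    static double[] unproject(double u, double v, double z,
                              double fx, double fy, double cx, double cy) {
        return new double[] { (u - cx) * z / fx, (v - cy) * z / fy, z };
    }

    public static void main(String[] args) {
        // Illustrative intrinsics for a 640x480 depth image.
        double fx = 500, fy = 500, cx = 320, cy = 240;
        // A pixel 100 columns right of centre, 2 m deep, lands 0.4 m to
        // the camera's right:
        double[] p = unproject(420.0, 240.0, 2.0, fx, fy, cx, cy);
        System.out.printf("%.2f %.2f %.2f%n", p[0], p[1], p[2]); // 0.40 0.00 2.00
    }
}
```

Running this over every pixel yields a point cloud, which is the kind of 3D representation a physics engine or path planner can collide against.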
When applications of the Depth API are combined, you can also create experiences in
which objects accurately bounce and splash across surfaces and textures, as well as new
interactive game mechanics that enable players to duck and hide behind real-world
objects.
A demo experience we created where you have to dodge and throw food at a robot
chef.
The Depth API does not depend on specialized cameras and sensors, and it will only get better
as hardware improves. For example, adding depth sensors, such as time-of-flight (ToF)
sensors, to new devices will help create more detailed depth maps, improving existing
capabilities like occlusion and unlocking new ones such as dynamic occlusion: the ability
to occlude behind moving objects.
We’ve only begun to scratch the surface of what’s possible with the Depth API and we want to
see how you will innovate with this feature. If you are interested in trying the new Depth
API, please fill out our call for collaborators
form.