Posted by Evan Hardesty Parker, Software Engineer
ARCore and Sceneform give developers simple yet powerful tools for creating augmented reality (AR) experiences. In our last update (version 1.6) we focused on making virtual objects appear more realistic within a scene. In version 1.7, we're focusing on creative elements like AR selfies and animation as well as helping you improve the core user experience in your apps.
[Image: example of a 3D face mesh application]
ARCore's new Augmented Faces API (available on the front-facing camera) offers a high-quality, 468-point 3D mesh that lets users attach fun effects to their faces. From animated masks, glasses, and virtual hats to skin retouching, the mesh provides coordinates and region-specific anchors that make it possible to add these delightful effects.
You can get started in Unity or Sceneform by creating an ARCore session with the front-facing camera and Augmented Faces "mesh" mode enabled. Note that other AR features, such as plane detection, aren't currently available when using the front-facing camera. AugmentedFace extends Trackable, so faces are detected and updated just like planes, Augmented Images, and other trackables.
```java
// Create an ARCore session that supports Augmented Faces for use in Sceneform.
public Session createAugmentedFacesSession(Activity activity) throws UnavailableException {
  // Use the front-facing (selfie) camera.
  Session session = new Session(activity, EnumSet.of(Session.Feature.FRONT_CAMERA));
  // Enable Augmented Faces.
  Config config = session.getConfig();
  config.setAugmentedFaceMode(Config.AugmentedFaceMode.MESH3D);
  session.configure(config);
  return session;
}
```
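Once the session is configured, detected faces can be queried each frame like any other trackable. The sketch below, based on the ARCore Java API, shows one way to read the center pose, region anchors, and mesh vertices; the method name `onUpdateFrame` and the placement comments are illustrative, not part of the API.

```java
// Each frame, query ARCore for updated faces and read their geometry.
void onUpdateFrame(Frame frame) {
  for (AugmentedFace face : frame.getUpdatedTrackables(AugmentedFace.class)) {
    if (face.getTrackingState() != TrackingState.TRACKING) {
      continue;
    }
    // Center of the face mesh; useful for anchoring full-face effects.
    Pose centerPose = face.getCenterPose();
    // Region-specific poses, e.g. for placing virtual glasses or a hat.
    Pose noseTip = face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP);
    Pose foreheadLeft = face.getRegionPose(AugmentedFace.RegionType.FOREHEAD_LEFT);
    // The 468-point mesh itself, packed as x/y/z triples relative to the
    // center pose.
    FloatBuffer vertices = face.getMeshVertices();
    // ... attach or update your renderables here ...
  }
}
```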
Another way version 1.7 expands the AR creative canvas is by letting your objects dance, jump, spin, and move around with support for animation in Sceneform. To start an animation, initialize a ModelAnimator (an extension of the existing Android animation support) with animation data from your ModelRenderable.
```java
// Keep a reference to the animator so it isn't garbage collected mid-playback.
private ModelAnimator animator;

void startDancing(ModelRenderable andyRenderable) {
  AnimationData data = andyRenderable.getAnimationData("andy_dancing");
  animator = new ModelAnimator(data, andyRenderable);
  animator.start();
}
```
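Because ModelAnimator extends the standard android.animation.Animator, the familiar lifecycle controls apply once playback has begun. The sketch below enumerates the animations baked into a renderable and toggles playback; the method name `listAndToggle` is illustrative, and it assumes the `animator` field from the example above.

```java
// Enumerate bundled animations and pause/resume the current one.
void listAndToggle(ModelRenderable renderable) {
  for (int i = 0; i < renderable.getAnimationDataCount(); i++) {
    AnimationData data = renderable.getAnimationData(i);
    Log.d("Animations", "Found animation: " + data.getName());
  }
  // Standard Animator lifecycle controls work on a ModelAnimator.
  if (animator != null && animator.isPaused()) {
    animator.resume();
  } else if (animator != null && animator.isRunning()) {
    animator.pause();
  }
}
```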
In ARCore version 1.7 we also focused on helping you improve your user experience with a simplified workflow. We've integrated "ARCore Elements" -- a set of common AR UI components that have been validated with user testing -- into the ARCore SDK for Unity. You can use ARCore Elements to insert AR interactive patterns in your apps without having to reinvent the wheel. ARCore Elements also makes it easier to follow Google's recommended AR UX guidelines.
ARCore Elements includes two AR UI components that are especially useful, and we plan to add more over time. You can download the ARCore Elements app from the Google Play Store to learn more.
ARCore version 1.7 also includes UX enhancements for the smartphone camera -- specifically, the experience of switching in and out of AR mode. Shared Camera access in the ARCore SDK for Java lets users pause an AR experience, access the camera, and jump back in. This can be particularly helpful if users want to take a picture of the action in your app.
More details are available in the Shared Camera developer documentation and Java sample.
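Under the hood, a shared-camera session is created much like the Augmented Faces session above, but with the SHARED_CAMERA feature flag. The sketch below shows the session setup only; the surrounding Camera2 capture flow and error handling (covered in the Shared Camera documentation and Java sample) are omitted.

```java
// Create a session that can hand the camera back and forth with a
// standard Camera2 capture flow (setup sketch only).
public Session createSharedCameraSession(Activity activity) throws UnavailableException {
  // Ask ARCore to cooperate with app-level camera access.
  Session session = new Session(activity, EnumSet.of(Session.Feature.SHARED_CAMERA));
  // The SharedCamera object brokers access between ARCore and Camera2.
  SharedCamera sharedCamera = session.getSharedCamera();
  // This is the camera ID to open via CameraManager when your app takes
  // over capture, e.g. to let the user snap a photo mid-experience.
  String cameraId = session.getCameraConfig().getCameraId();
  return session;
}
```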
For AR experiences to capture users' imaginations, they need to be both immersive and easily accessible. With tools for adding AR selfies, animation, and UI enhancements, ARCore version 1.7 can help with both of these objectives.
You can learn more about these new updates on our ARCore developer website.