The new iPhone X front-facing camera depth and face-tracking mesh API


Just watched the new iPhone X announcement. Is the sensing and tracking technology of the front camera open to developers? A Snapchat face mask was demoed on stage; I'm not sure if it's using ARKit.


There are 2 answers

rickster (accepted answer)

Yes, it's open to developers.

If you look at the ARKit docs page now, you'll see that it's split into World Tracking and Face Tracking sections (plus some bits common to both). World Tracking is what was announced back at WWDC — looking "through" your device with the back camera at AR content in the world around you.

Face Tracking AR is specific to iPhone X and the TrueDepth camera. As you can see in those docs, it uses ARFaceTrackingConfiguration instead of the other configuration classes. And it gives you info about the face in real time through ARFaceAnchor objects.
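
Running a face-tracking session is a one-configuration swap from the world-tracking setup. A minimal sketch (untested; sceneView is assumed to be an ARSCNView already in your view hierarchy):

    import ARKit

    func beginFaceTracking(in sceneView: ARSCNView) {
        // Face tracking needs the TrueDepth camera, so check support first.
        guard ARFaceTrackingConfiguration.isSupported else { return }

        let configuration = ARFaceTrackingConfiguration()
        configuration.isLightEstimationEnabled = true

        // Once running, the session delivers ARFaceAnchor updates to delegates.
        sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    }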

In the face anchor docs, it looks like there are two ways to get face info. The geometry gives you a 3D mesh you can display, or use to map textures onto the face — that's presumably what Snapchat used for the wrestling masks in the keynote demo. The blendShapes give you a bunch of animation parameters, like how far the jaw is open and how squinty the left eye is (and about 50 other, more subtle things)... they talk about using that to animate puppets or avatars, so that's probably how Animoji works.
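
Reading a couple of those blend shape coefficients looks something like this (a rough sketch; each coefficient runs from 0.0, neutral, to 1.0, fully engaged):

    import ARKit

    func readExpression(from faceAnchor: ARFaceAnchor) {
        // blendShapes is a dictionary of ~50 named coefficients.
        let blendShapes = faceAnchor.blendShapes
        if let jawOpen = blendShapes[.jawOpen]?.floatValue,
           let eyeSquintLeft = blendShapes[.eyeSquintLeft]?.floatValue {
            // Values like these can drive a puppet or avatar rig, Animoji-style.
            print("jawOpen: \(jawOpen), eyeSquintLeft: \(eyeSquintLeft)")
        }
    }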

Apple also posted a sample code project showing how to do all of these, so you can look at the code to get an idea how to do it yourself. (Even if you can't run the code without an iPhone X.)

nathan

Here's an example using the TrueDepth camera on iPhone X: https://developer.apple.com/documentation/arkit/creating_face_based_ar_experiences

And a brand new session: Face Tracking with ARKit #601

Creating Face-Based AR Experiences

Place and animate 3D content that follows the user’s face and matches facial expressions, using the TrueDepth camera on iPhone X.

This sample app presents a simple interface allowing you to choose between four augmented reality (AR) visualizations on devices with a TrueDepth front-facing camera (see iOS Device Compatibility Reference).

  • The camera view alone, without any AR content.
  • The face mesh provided by ARKit, with automatic estimation of the real-world directional lighting environment (see the SceneKit sketch after this list).
  • Virtual 3D content that appears to attach to (and be obscured by parts of) the user’s real face.
  • A simple robot character whose facial expression is animated to match that of the user.
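
For the face mesh visualization in that list, the SceneKit route is ARSCNFaceGeometry. A rough sketch of the delegate methods involved (assumptions: these live in a class conforming to ARSCNViewDelegate with a sceneView property):

    import ARKit
    import SceneKit

    // Provide a wireframe mesh node whenever ARKit adds a face anchor.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard anchor is ARFaceAnchor,
              let device = sceneView.device,
              let faceGeometry = ARSCNFaceGeometry(device: device) else { return nil }
        faceGeometry.firstMaterial?.fillMode = .lines  // wireframe look
        return SCNNode(geometry: faceGeometry)
    }

    // Keep the mesh matched to the user's expression on every frame.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor,
              let faceGeometry = node.geometry as? ARSCNFaceGeometry else { return }
        faceGeometry.update(from: faceAnchor.geometry)
    }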

Check the section titled "Place 3D Content on the User’s Face" for your second use case.
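
The "obscured by parts of the user's real face" effect presumably comes down to an occlusion mesh: render the face geometry invisibly so it fills the depth buffer and hides virtual content behind it. A hedged sketch of that idea:

    import ARKit
    import SceneKit

    func makeOcclusionNode(in sceneView: ARSCNView) -> SCNNode? {
        guard let device = sceneView.device,
              let occlusionGeometry = ARSCNFaceGeometry(device: device) else { return nil }
        // Write depth but no color: invisible, yet it still hides content behind it.
        occlusionGeometry.firstMaterial?.colorBufferWriteMask = []
        let node = SCNNode(geometry: occlusionGeometry)
        node.renderingOrder = -1  // draw before other virtual content
        return node
    }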
