Face Tracking Best Practices

The following best practices will help you implement Face Tracking effectively in your experiences.

Creating an effective .zcomp

Creating an effective .zcomp will allow you to get the most out of your Mattercraft project. The following advice goes into more detail on how you may want to set up your .zcomp (scene) for maximum benefit.

Ensure the camera is user facing

When creating face-tracked experiences, remember to switch to the front-facing camera.

This can be done by clicking on the ZapparCamera node, finding its Camera Direction property and selecting User. You may also set this via States or from a script.

Alternatively, you can include functionality within your experience that asks the user to manually switch to the front-facing camera using an interactive node (such as an image or button) in your .zcomp.
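As a minimal sketch of the scripted approach: Mattercraft scripts are written in TypeScript, but the `CameraDirection` type and the `cameraDirection` property below are assumptions for illustration only. Check your project's ZapparCamera node for the exact property names it exposes.

```typescript
// The camera can face the user (front camera) or the world (rear camera).
// These value names are assumptions for illustration.
type CameraDirection = 'user' | 'rear';

// Pure helper: given the current direction, return the one to switch to.
function toggleDirection(current: CameraDirection): CameraDirection {
  return current === 'user' ? 'rear' : 'user';
}

// In a Mattercraft script you might wire this to a button's tap event,
// e.g. (property name is illustrative, not a confirmed API):
//   camera.cameraDirection = toggleDirection(camera.cameraDirection);
```

For a face-tracked experience you would normally just start in the `user` direction; the toggle is useful when you also offer rear-camera content.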

Make sure nothing irrelevant is obscuring the face

When you are developing a full experience, it is likely that you will not only have dedicated face-tracked content, but also other elements to enhance your project. For example, a face filter may also have:

  • A border
  • A button to take the user's photograph
  • A button to switch the camera's view

Confirm that these elements do not block the face and that it is easy for the user to view the face-tracked content.
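One way to sanity-check this during development is to treat the central area of the screen as reserved for the face and verify that overlay elements stay clear of it. The sketch below is illustrative logic, not a Mattercraft API; the face region chosen is an assumption and should be tuned for your layout.

```typescript
// Rectangles in normalized screen coordinates (0..1 on each axis).
interface Rect { x: number; y: number; width: number; height: number; }

// Standard axis-aligned rectangle intersection test.
function intersects(a: Rect, b: Rect): boolean {
  return a.x < b.x + b.width && b.x < a.x + a.width &&
         a.y < b.y + b.height && b.y < a.y + a.height;
}

// Assumed region where the user's face typically appears on screen.
const FACE_REGION: Rect = { x: 0.25, y: 0.2, width: 0.5, height: 0.5 };

// True if a UI overlay (border, button, etc.) encroaches on the face.
function obscuresFace(overlay: Rect): boolean {
  return intersects(overlay, FACE_REGION);
}
```

For example, a snapshot button anchored along the bottom edge stays clear of the face region, whereas a full-width banner across the middle of the screen would obscure it.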

Guiding the user

It is important to guide the user when they are using Augmented Reality experiences, as this may be a new technology for them. The factors below will help users get the maximum value from their time in your experience:

Provide clear instructions

Clearly communicate to users how they can interact with the AR experience and how to position their face correctly in the camera's view. This can be done via physical print materials (if applicable), on-screen instructions, tooltips, or an onboarding tutorial.

Handle when a face is not visible

Clearly convey what the user should do if a face is no longer in view. This can be done by making tracked content screen-relative, or by hiding content and asking the user to return their face to the camera view, for example.
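The fallback logic can be sketched as a simple state decision driven by the number of faces currently tracked. How Mattercraft surfaces face-visibility events is not shown here; the function below is illustrative and should be wired to whatever tracking callback or property your scene exposes.

```typescript
// Two illustrative presentation modes for tracked content: anchored to
// the face while tracking, or hidden with a prompt when no face is seen.
type ContentMode = 'face-anchored' | 'hidden-with-prompt';

// Decide how to present content from the current number of visible faces.
// Call this whenever the tracker reports a change (event name will vary).
function onFacesChanged(visibleFaces: number): ContentMode {
  return visibleFaces > 0 ? 'face-anchored' : 'hidden-with-prompt';
}
```

When the mode switches to `hidden-with-prompt`, show a short message such as "Point the camera at your face" rather than leaving the screen unexplained.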

Consider the user’s environment

Different environmental factors can affect the experience. Try to control (or influence your user to control) factors such as the lighting on a user's face, or how far away the user is holding the phone from them.

For example, a well and evenly lit face is likely to track more stably than a face in a dark room.
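If you want to nudge users toward better lighting automatically, one simple heuristic is to sample the camera frame's average luminance and show a "find better light" prompt below a threshold. This is an illustrative sketch, not part of any Zappar SDK, and the threshold is an assumption to tune against real devices.

```typescript
// Average luminance of an RGBA pixel buffer (e.g. from a canvas
// ImageData), using Rec. 709 luma weights. Returns 0 (black)..255 (white).
function averageLuminance(rgba: Uint8ClampedArray): number {
  let sum = 0;
  const pixels = rgba.length / 4;
  for (let i = 0; i < rgba.length; i += 4) {
    sum += 0.2126 * rgba[i] + 0.7152 * rgba[i + 1] + 0.0722 * rgba[i + 2];
  }
  return sum / pixels;
}

// Threshold of 60 is an illustrative guess; calibrate for your content.
function isTooDark(rgba: Uint8ClampedArray, threshold = 60): boolean {
  return averageLuminance(rgba) < threshold;
}
```

Sampling a downscaled frame every second or so is usually enough; there is no need to analyse every frame at full resolution.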
