
Headset Overview

Studio is being deprecated; please head over to the documentation page for Mattercraft, our most advanced 3D tool for the web, where you can find the most recent information and tutorials.

The Zappar app’s headset mode allows users to experience immersive augmented and virtual reality content using a Google Cardboard (or compatible) headset. This article outlines the tools and best practices for building headset experiences using ZapWorks.

Experiences built using ZapWorks Studio can activate Zappar’s headset mode using a Z.HeadsetManager node. For more information, see the Using HeadsetManager article.
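A minimal sketch of referencing this node from a Studio script. The node name `HeadsetManager0` is an assumption for illustration; it is whatever name the node has in your Hierarchy.

```typescript
// Sketch only: `HeadsetManager0` is an assumed node name from the Hierarchy.
const headsetManager = symbol.nodes.HeadsetManager0;

// Headset-mode events (headsetenter, headsetleave, headsetbutton) can be
// attached to this node with .on().
```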

Most VR headsets come with limited mechanisms for user interaction. The Zappar app currently supports two primary ways for users to interact with experiences:

  1. The headsetbutton event is fired by Z.HeadsetManager when the user taps anywhere on the screen (other than the exit-headset button) during headset mode. Many cardboard headsets provide a physical window that allows finger access to the screen for this purpose.

  2. The Z.Raycaster node can be used to react to changes in the direction that the user is looking. A Z.Raycaster node placed in a group relative to Z.camera will emit events when the center of the camera points at objects registered to react to raycasters. For more information see the Raycaster Overview article.
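As a sketch, both interaction routes might be handled in a Studio script like this. The node names (`HeadsetManager0`, `Plane0`) and the raycaster event names are assumptions for illustration; see the Raycaster Overview article for how targets are registered and the exact events emitted.

```typescript
// Assumed Hierarchy: a HeadsetManager node `HeadsetManager0`, a Z.Raycaster
// node grouped relative to Z.camera, and a target object `Plane0` that has
// been registered to react to raycasters.
const headsetManager = symbol.nodes.HeadsetManager0;
const plane = symbol.nodes.Plane0;

// 1. Screen tap anywhere (other than the exit-headset button) in headset mode.
headsetManager.on("headsetbutton", () => {
    // e.g. confirm the currently gazed-at choice or advance the experience.
});

// 2. Gaze-based interaction: these event names are assumptions, not
// confirmed by this article.
plane.on("raycasterenter", () => {
    // Highlight the object while the center of the camera points at it.
});
plane.on("raycasterleave", () => {
    // Remove the highlight when the user's gaze moves away.
});
```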

Your use case will determine whether your users are likely to have a compatible headset to hand while experiencing your zap, so consider building your content to function well outside of headset mode too. The headsetenter and headsetleave events of Z.HeadsetManager can be used to tweak your user interface and behavior in either case.
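For example, a Studio script might toggle a touchscreen-oriented UI element depending on whether headset mode is active. The node names here (`HeadsetManager0`, `TapPrompt`) are assumptions for illustration.

```typescript
const headsetManager = symbol.nodes.HeadsetManager0;
const tapPrompt = symbol.nodes.TapPrompt; // assumed screen-space UI node

headsetManager.on("headsetenter", () => {
    // Hide touch-oriented UI; gaze and the headset button take over.
    tapPrompt.visible(false);
});

headsetManager.on("headsetleave", () => {
    // Restore the touchscreen UI outside headset mode.
    tapPrompt.visible(true);
});
```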

Excessive motion of the camera inside headset mode that isn’t initiated by the user may lead to nausea or dizziness. Try to avoid moving the user inside the experience without their explicit action.