SpatialStories uses its own camera that automatically handles Unity's XR features and the toolkit's functionality. To benefit from the SpatialStories Camera, all you have to do is convert the standard Unity Camera into an Interactive Camera:
For most platforms (Oculus Rift, HTC Vive, ARKit, ARCore), right-click on the Camera in the Hierarchy Tab and select SpatialStories -> Convert into Interactive Camera.
If you want a specific rig for HTC or Oculus with controllers instead of hands, we provide prefabs under the Resources folder.
For 3DoF mobile development (e.g., Samsung Gear VR), also navigate to the Resources folder of SpatialStories. There you will find a prefab made for a 3DoF configuration, fully tested with the Gear VR. This camera rig simulates a 3DoF controller.
Camera (IO): Try not to modify its structure.
- Left/Right Hand Active: defines which hand(s) to use in your experience. Note: a 3DoF camera only has the right hand activated, for simulation purposes when using an Oculus or a Vive.
- Track Position/Orientation: defines if you want to track your hands’ position and/or orientation. Note: The 3DoF camera has its position tracking disabled to simulate and test the Gear VR controller effect using an Oculus or a Vive.
- Haptic Audio Clip Min/Max: defines which audio clip to use for haptic feedback on controllers that support this functionality. Important: haptics are currently not supported for VR.
Head (IO): The Head contains the camera (see inside Visuals); it is an Interactive Object (IO) that represents the user in any experience. You can change the size of the proximity collider of the Head (IO), which defines the proximity zone of your user. It also contains a Player Torso object. The Torso represents the approximate position of the user's torso below the head.
You can change the distance between the head and the torso in the inspector.
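As a rough illustration of the head/torso relationship described above, the torso can be pictured as a point directly below the head, offset by the distance you set in the inspector. This is a minimal sketch (the function name is hypothetical, not part of the toolkit's API):

```python
def torso_position(head_position, torso_distance):
    """Approximate the torso as a point directly below the head.

    head_position: (x, y, z) tuple in world space, in meters.
    torso_distance: vertical head-to-torso offset in meters
    (the value set in the inspector).
    """
    x, y, z = head_position
    return (x, y - torso_distance, z)
```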
Left Hand (IO): Represents the left hand of the player. Here you have access to the Teleportation and levitation options where you can tweak the settings for the left hand.
Right Hand (IO): Represents the right hand of the player. Here you have access to the Teleportation and levitation options where you can tweak the settings for the right hand.
Teleport: To move around in a virtual environment, the user has the ability to teleport. This is especially useful if your experience contains large spaces or if you are using a device with no positional tracking.
By default, the Teleport button is the joystick of the Oculus Touch controller or the trackpad of any other controller. You first need to press it for the Teleport Beam to show up. When teleporting is possible, the Beam is blue and a circular target appears at its end. Move the joystick or trackpad to choose the desired rotation, then simply release to teleport to that target.
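The press-aim-release flow above can be sketched as a small state machine. This is an illustrative sketch only; the state names and method are assumptions, not the toolkit's actual implementation:

```python
class TeleportInput:
    """Illustrative press-aim-release teleport state machine."""

    IDLE, AIMING = "idle", "aiming"

    def __init__(self):
        self.state = self.IDLE

    def on_button(self, pressed, target_valid):
        """Feed the current button state each frame.

        Returns 'show_beam' on press, 'teleport' on release over a
        valid target, 'cancel' on release over an invalid one.
        """
        if self.state == self.IDLE and pressed:
            self.state = self.AIMING      # beam shows up
            return "show_beam"
        if self.state == self.AIMING and not pressed:
            self.state = self.IDLE        # beam hides again
            return "teleport" if target_valid else "cancel"
        return None
```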
To set up your Teleport zones (and avoid being able to teleport onto every collider), select the teleport layer in the Teleporter Settings, then set your teleportable GameObject (with a collider) to be on that Teleport layer.
The Teleport Settings are located in the Interactive Camera (Camera IO) rig, more precisely in Left Hand (IO) and Right Hand (IO). If you want, you can choose a specific set of Teleport parameters for each hand and change its design.
Here are the Teleport Details:
- Orient On Teleport: allows the user to orient on teleport by rotating the joystick. The rotating cone shape inside the circle represents the direction the user will look once teleported.
- Target Destination: mandatory! The object at the tip of the arc marking the target destination (the circle)
- Allowed Destination: defines the color when teleport is allowed
- Not allowed Destination: defines the color when teleport is NOT allowed
- Arc Line Width: sets the width of the arc
- Arc Material: defines the material of the arc
- Allowed Layers: the layers the teleport is allowed on
- Max Distance: to limit the teleport distance [meters] on a horizontal surface
- Max Slope: to limit the allowed slope between zones placed at different heights
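The Max Distance and Max Slope limits can be pictured as a validity check on the candidate landing point relative to the player. The sketch below assumes Max Slope is an angle in degrees between the player's position and the target, which may differ from the toolkit's actual semantics:

```python
import math

def teleport_allowed(player_pos, target_pos, max_distance, max_slope_deg):
    """Sketch of a Max Distance / Max Slope check (assumed semantics).

    player_pos, target_pos: (x, y, z) tuples in meters.
    max_distance: horizontal range limit in meters.
    max_slope_deg: assumed maximum slope angle, in degrees, between
    the player's ground position and the target.
    """
    dx = target_pos[0] - player_pos[0]
    dz = target_pos[2] - player_pos[2]
    horizontal = math.hypot(dx, dz)
    if horizontal > max_distance:
        return False                      # too far away horizontally
    dy = abs(target_pos[1] - player_pos[1])
    slope_deg = math.degrees(math.atan2(dy, horizontal))
    return slope_deg <= max_slope_deg     # reject overly steep targets
```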
Hotspots are specific positions your user snaps to when teleporting.
Size: the number of hotspot positions in the list (add as many as you want)
Minimum detection distance: defines the distance under which the teleport beam snaps to a hotspot position.
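The snapping behavior can be sketched as follows: if the beam's target lands within the detection distance of one or more hotspots, it snaps to the nearest one; otherwise it stays where it is. A minimal sketch under those assumptions (the function name is hypothetical):

```python
def snap_to_hotspot(beam_target, hotspots, min_detection_distance):
    """Snap the beam target to the nearest hotspot within range.

    beam_target: (x, y, z) where the teleport beam currently points.
    hotspots: list of (x, y, z) hotspot positions.
    min_detection_distance: snap radius in meters.
    Returns the nearest hotspot position if one is close enough,
    otherwise the original beam target.
    """
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

    in_range = [h for h in hotspots
                if dist(h, beam_target) <= min_detection_distance]
    if not in_range:
        return beam_target                # no hotspot close enough
    return min(in_range, key=lambda h: dist(h, beam_target))
```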
To prevent common input errors, we provide the following settings.
Hold duration to activate [s]: minimum time the user must hold the joystick/trackpad for the teleport to activate
Input sensitivity [0-1]: threshold to detect the input from the user
Cooldown [s]: time to wait between teleport actions (input is not taken into account during this time after a teleport).
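How the three settings above interact can be sketched as an input filter: input below the sensitivity threshold is ignored, activation only fires after the hold duration, and everything is ignored during the cooldown window. This is an illustrative sketch; the class and its structure are assumptions, not the toolkit's code:

```python
class TeleportInputFilter:
    """Sketch of hold-duration / sensitivity / cooldown filtering."""

    def __init__(self, hold_duration, sensitivity, cooldown):
        self.hold_duration = hold_duration  # seconds to hold before activation
        self.sensitivity = sensitivity      # [0-1] input magnitude threshold
        self.cooldown = cooldown            # seconds of ignored input after a teleport
        self.held_since = None
        self.last_teleport = float("-inf")

    def update(self, now, axis_magnitude):
        """Feed the current time (s) and input magnitude each frame.

        Returns True once the input has been held long enough and the
        cooldown since the last teleport has elapsed.
        """
        if now - self.last_teleport < self.cooldown:
            self.held_since = None          # still cooling down
            return False
        if axis_magnitude < self.sensitivity:
            self.held_since = None          # input too weak, reset the hold
            return False
        if self.held_since is None:
            self.held_since = now           # hold just started
        if now - self.held_since >= self.hold_duration:
            self.last_teleport = now
            self.held_since = None
            return True                     # teleport activates
        return False
```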
Once your camera is converted, you will have two elements in your scene that are mandatory for the system to work properly:
Camera (IO): Try not to modify its structure. Inside this object you will find the Head (IO).
Head (IO): an Interactive Object (IO) that represents the user in any experience; the camera sits inside its Visuals. You can change the size of the proximity collider of the Head (IO), which defines the proximity zone of your user.
Warning: the camera position and rotation should always be (0, 0, 0); otherwise, offsets can occur in your experience.
AR Utilities: This object handles the platform on which you are building. It enables the corresponding elements you need in the scene for ARKit or ARCore to work.
This object is mandatory: do not remove it from your scene. It also works automatically, detecting the build target platform from your settings, so do not modify it manually.