This User Manual is for v0.1.0 of the SDK. If you are using the newer v0.2.b4, click here.

User Manual (v0.1.0)


Getting Started

Download the SpatialStories SDK

When your subscription has been approved, log in and go to this page to download the SDK. Unzip it to obtain a file with the standard .unitypackage extension.

Download Unity

Make sure you have the recommended version of Unity installed for the SDK. SpatialStories has been successfully tested with Unity 5.6.2f1, which can be downloaded here. Although Unity 2017 may also be compatible with the SDK, we can't guarantee it at this point.

Import the SDK in Unity

Open Unity and create a new project (or open an existing one). Go to Assets -> Import Package -> Custom Package…, pick the .unitypackage file you've just downloaded and click Import. This will take a few minutes. Once it is imported into your project, you can delete the downloaded .unitypackage file from your hard drive.

Input Manager

To access the inputs of the different manufacturers' controllers (HTC Vive, Oculus Touch), you need to configure the InputManager. There are two ways to do it:

  1. Right click anywhere in the Project Tab and select Repair Input Manager.
  2. Once you have converted the camera into an Interactive Camera (IO) and hit Play, the system will check whether you have the configuration needed to use the input system. If the proper inputs are not set up, you will be asked to configure them; simply follow the instructions.

Note that your original InputManager file will be kept as a backup in the same location!



The Camera

SpatialStories uses its own camera with the following functionalities:

  • Hand Tracking

    Allows you to use your hands

  • Teleportation

    Allows you to teleport within the virtual world

  • Player's Torso

    Allows the SDK to differentiate your head from your torso

To benefit from the SpatialStories Camera, all you have to do is convert the standard Unity Camera into an Interactive Camera. Right-click on the Camera in the Hierarchy Tab and select SpatialStories -> Convert into Interactive Camera.

Once converted, your new camera will have the following structure:

The Camera (IO) gives you access to the Gaze_InputManager component (in the Inspector).

Left/Right Hand Active: defines which hand(s) to use in your experience.

Track Position/Orientation: defines whether you want to track your hands' position and/or orientation (it may be useful to deactivate these options when using a controller without positional tracking).

Haptic Audio Clip Min/Max: defines which audio clip to use for haptic feedback on controllers that support this functionality.
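If you prefer to adjust these settings from a script rather than the Inspector, a minimal sketch could look like the following. Note that the field names used here (LeftHandActive, RightHandActive, TrackPosition, TrackOrientation) are assumptions inferred from the Inspector labels above and may differ in the actual component.

    using UnityEngine;

    // Illustrative sketch only. The Gaze_InputManager component comes from the
    // SpatialStories SDK; the field names used below are assumptions inferred
    // from the Inspector labels and may differ in the actual SDK.
    public class InputSetupSketch : MonoBehaviour
    {
        void Start()
        {
            // The Gaze_InputManager sits on the converted Camera (IO).
            var inputManager = GetComponent<Gaze_InputManager>();
            if (inputManager == null)
                return;

            // Mirrors "Left/Right Hand Active": use both hands.
            inputManager.LeftHandActive = true;
            inputManager.RightHandActive = true;

            // Mirrors "Track Position/Orientation": for a controller without
            // positional tracking, keep orientation only.
            inputManager.TrackPosition = false;
            inputManager.TrackOrientation = true;
        }
    }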



Teleport

To move around in the virtual environment, the user can teleport. This is especially useful if your experience contains large spaces or if you are using a device with no positional tracking.

By default, the Teleport button is the joystick of the Oculus Touch controller. You first need to move the joystick in a direction for the Teleport Beam to show up. When teleporting is possible, the Beam is blue and you can see a circular target at the end of it. Simply release the joystick to teleport to that target.

To set up your teleport zones (and avoid being able to teleport onto every collider), you can create a Teleport Layer and select it in the Gaze_Teleporter Settings (see the Details section below). To allow teleporting onto a surface, that surface's collider has to be on the Teleport Layer.

The Teleport Settings are located in the Interactive Camera (Camera IO) rig, more precisely in Left Hand (IO) and Right Hand (IO). If you want, you can choose a specific set of Teleport parameters for each hand.

Camera (IO) -> Left Hand (IO) -> Gaze_Teleporter
Camera (IO) -> Right Hand (IO) -> Gaze_Teleporter

When the Teleport Beam is blue, you can teleport to the targeted destination. The color can be customized with the ‘Good Spot Col‘ property.

When the Teleport Beam is red, teleporting is not allowed. The color can be customized with the ‘Bad Spot Col‘ property.
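If you would rather configure this from a script, a hedged sketch is shown below. Moving a collider onto the Teleport Layer uses standard Unity calls; the "Teleport" layer name must exist in your project's Tags & Layers settings, and the GoodSpotCol / BadSpotCol field names are assumptions based on the properties described above.

    using UnityEngine;

    // Illustrative sketch: puts a floor collider on the Teleport Layer and
    // tints the Teleport Beam. GoodSpotCol / BadSpotCol are assumed field
    // names mirroring the 'Good Spot Col' / 'Bad Spot Col' properties.
    public class TeleportSetupSketch : MonoBehaviour
    {
        public GameObject floor;           // a surface the user may teleport onto
        public Gaze_Teleporter teleporter; // e.g. Camera (IO) -> Left Hand (IO) -> Gaze_Teleporter

        void Start()
        {
            // Allow teleporting onto the floor by moving it to the Teleport layer.
            floor.layer = LayerMask.NameToLayer("Teleport");

            // Hypothetical color fields for the beam states described above.
            teleporter.GoodSpotCol = Color.blue;
            teleporter.BadSpotCol = Color.red;
        }
    }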



Create your first Interactive Object (IO)

SpatialStories uses what we call ‘Interactive Objects’, hereafter shortened to IO(s).

IOs are any objects you want to add life and interactions to. For instance, you can create a simple cube within Unity and convert it into an IO, or you can import a model from your favorite 3D software into Unity and convert it into an IO. You can also use 2D assets or anything else you want. As soon as an object has been converted into an IO, you will be able to grab it, for example.

Converting an object into an IO

To convert your object into an IO, there are two options:

    • Right-click on your object in the Hierarchy view and choose: SpatialStories -> Convert into an interactive object
    • Select your object in the Hierarchy or Scene view and, in the menu bar, select SpatialStories -> Convert -> Into an interactive object

IO’s Timeframe

An Interactive Object has several lifecycle phases, also called its timeframe, described below:

Conditions are checked, and an interaction can occur, only during the IO's ACTIVE phase.



Interactive Object (Details)



Add Interactions to your IO(s)

An Interaction is defined by two elements:

  1. A set of conditions
  2. A set of actions

When all conditions are met, the actions are triggered.

All interactions are defined under the Interactions node of an IO’s structure. Let’s see how it works!
Default Interaction
In the Hierarchy view, if you select the default interaction (the only child of Interactions), you will see an interaction script attached to it in the Inspector. Here you have two checkboxes: one for setting conditions, called Conditions, and one for actions, called Actions.
Interaction script
The principle is simple: for any interaction, once its conditions are validated, the associated actions will take place (if you define only actions and no conditions, they will occur instantly).
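Conceptually, an interaction behaves like the sketch below: it evaluates a list of conditions and fires its actions once they all pass. This is a purely illustrative pattern, not the SDK's own interaction script, and both the example condition and the example action are hypothetical.

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using UnityEngine;

    // Conceptual sketch of the conditions/actions principle. This is NOT the
    // SDK's interaction script; it only illustrates how validated conditions
    // trigger the associated actions.
    public class InteractionSketch : MonoBehaviour
    {
        readonly List<Func<bool>> conditions = new List<Func<bool>>();
        readonly List<Action> actions = new List<Action>();
        bool triggered;

        void Start()
        {
            // Hypothetical condition: the user is within two meters of this IO.
            conditions.Add(() =>
                Vector3.Distance(transform.position, Camera.main.transform.position) < 2f);

            // Hypothetical action: log a message (this is where an action would
            // play audio, animate, grab, etc.).
            actions.Add(() => Debug.Log("All conditions met: actions fire."));
        }

        void Update()
        {
            if (triggered)
                return;

            // With no conditions defined, All() returns true and the actions
            // occur instantly, matching the behaviour described above.
            if (conditions.All(condition => condition()))
            {
                foreach (var action in actions)
                    action();
                triggered = true;
            }
        }
    }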



Conditions



Actions



Drag and Drop

Note: with this first release, you have the ability to use the Drag and Drop functionality. The process is quite simple but will be further simplified in the next releases of the SDK.

Drag and Drop is the ability to set a target place for an object to go to (in this example, the hat is dragged and dropped on the character’s head).

Set up Drag and Drop

Follow the steps below to set up a Drag and Drop interaction:

  1. Create an IO for your Drop object (see the Create your first Interactive Object (IO) section to know how to do it);
  2. Configure the Drop target where you drop the object (see explanations below);
  3. Configure the Drop object (see explanations below).
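
For reference, the idea behind a drop target can be illustrated in plain Unity with a trigger collider, as sketched below. This is only a conceptual illustration, not the SDK's implementation: the SDK configures the equivalent behaviour for you through its Drop settings, and the "Hat" tag used here is hypothetical.

    using UnityEngine;

    // Conceptual illustration only (not the SDK's implementation): a drop
    // target detects the dragged object entering its trigger volume and
    // snaps it into place.
    public class DropTargetSketch : MonoBehaviour
    {
        public string dropObjectTag = "Hat"; // hypothetical tag on the drop object

        void OnTriggerEnter(Collider other)
        {
            if (!other.CompareTag(dropObjectTag))
                return;

            // Snap the dropped object onto the target and stop its physics.
            other.transform.position = transform.position;
            other.transform.rotation = transform.rotation;

            var body = other.attachedRigidbody;
            if (body != null)
                body.isKinematic = true;
        }
    }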



Audio

Activate: plays an audio clip when the interaction’s conditions are met.

    • Audio Source: the audio source used to play the audio
    • Immediate play: if enabled, the audio will stop and start again when triggered
    • Cumulate audios (only available if Immediate play is disabled): if enabled, multiple audio clips can be played at the same time from the current interaction
      • Max concurrent audios: the maximum number of audio clips that can be played at the same time from the current interaction
    • Randomize pitch: if enabled, the pitch is randomly chosen between minPitch and maxPitch each time the audio is played

Audio Section

 

    • Loop: looping option
      • None: looping is disabled.
      • Single: loops a single audio clip
      • Playlist: loops over the full playlist (only useful when multiple audio clips are specified).
        • Fade in between: if enabled, the audio will fade between the different clips in the playlist
    • Sequence: only available when there are multiple audio clips in the playlist. Specifies the order in which the clips are played.
      • In Order: the playlist is played in the order specified in the UI
      • Random: the playlist is played in random order
    • Fade In: enables a fade-in. The fade duration can be specified, and the fade curve can also be customized.
    • Fade Out: enables a fade-out at the end of the audio. The fade duration can be specified, and the fade curve can also be customized.

Deactivate: stops all audio coming from the audio source.

    • Audio Source: the audio source that needs to be stopped
    • Fade Out: enables a fade-out when deactivating an audio source. The fade duration can be specified, and the fade curve can also be customized.
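
To make these options more concrete, the sketch below shows how randomized pitch, immediate play and a fade-out map onto a plain Unity AudioSource. It only illustrates the behaviour described above; it is not the SDK's audio code, and the pitch bounds and fade duration are arbitrary example values.

    using System.Collections;
    using UnityEngine;

    // Illustration of the audio options above using a plain Unity AudioSource
    // (not the SDK's audio code).
    public class AudioActionSketch : MonoBehaviour
    {
        public AudioSource audioSource;
        public float minPitch = 0.9f;        // mirrors the "Randomize pitch" bounds
        public float maxPitch = 1.1f;
        public float fadeOutDuration = 1.5f; // mirrors the "Fade Out" duration

        public void Activate()
        {
            // "Randomize pitch": pick a pitch between minPitch and maxPitch each play.
            audioSource.pitch = Random.Range(minPitch, maxPitch);

            // "Immediate play": stop and start again when triggered.
            audioSource.Stop();
            audioSource.Play();
        }

        public void Deactivate()
        {
            // "Deactivate" with "Fade Out": fade the volume down, then stop.
            StartCoroutine(FadeOutAndStop());
        }

        IEnumerator FadeOutAndStop()
        {
            float startVolume = audioSource.volume;
            for (float t = 0f; t < fadeOutDuration; t += Time.deltaTime)
            {
                audioSource.volume = Mathf.Lerp(startVolume, 0f, t / fadeOutDuration);
                yield return null;
            }
            audioSource.Stop();
            audioSource.volume = startVolume;
        }
    }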

 

Fade