Adding Redirection to a Scene
This page describes how to set up a redirection illusion in Unity using the prefabs we provide and how to make it work with different VR SDKs.
Go to the prefabs folder of the Toolkit (`/Assets/Visuo-haptic Toolkit/Prefabs`). From the Redirection folder, drag the appropriate redirection prefab onto your scene. Then import the camera rig prefab from your VR provider; in our case, we use the OVRCameraRig from Meta XR with hand tracking. You should end up with a hierarchy like the following.
From there, drag and drop the user limb data (always necessary, even if you are using a world redirection technique), then select a redirection technique and a numerical parameters object (either the provided one or your own).
You should now see additional variables below the parameters: the relevant object references (for example, the physical and virtual targets for hand redirection) and the parameters of the selected technique. Here is an example using the "Toolkit Body Redirection" prefab and the Meta XR OVRCameraRig.
In this state, the user's virtual hand is redirected linearly ([Han et al., 2018], see Hand Redirection) from the origin to the virtual target as the user's physical hand reaches for the physical target.
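As a rough illustration of what this linear redirection does (this is not the toolkit's actual code, and all field names below are illustrative), the virtual hand can be offset by a fraction of the physical-to-virtual target difference, with the fraction growing as the physical hand progresses from its starting position towards the physical target:

```csharp
using UnityEngine;

// Illustrative sketch of linear hand redirection in the spirit of [Han et al., 2018]:
// the virtual hand drifts away from the physical hand towards the virtual target as
// the physical hand approaches the physical target. Field names are examples only.
public class LinearHandRedirectionSketch : MonoBehaviour {
    public Transform physicalHand;   // tracked hand, e.g. RightHandAnchor
    public Transform virtualHand;    // rendered hand model
    public Transform physicalTarget; // real object the user reaches for
    public Transform virtualTarget;  // where that object appears in the virtual scene

    private Vector3 origin;          // physical hand position when the reach starts

    void Start() {
        origin = physicalHand.position;
    }

    void LateUpdate() {
        float totalDistance = Vector3.Distance(origin, physicalTarget.position);
        float remaining = Vector3.Distance(physicalHand.position, physicalTarget.position);
        // Progress ratio in [0, 1]: 0 at the starting position, 1 at the physical target.
        float progress = Mathf.Clamp01(1f - remaining / Mathf.Max(totalDistance, 1e-5f));
        // Shift the virtual hand by a growing fraction of the target offset.
        Vector3 offset = progress * (virtualTarget.position - physicalTarget.position);
        virtualHand.position = physicalHand.position + offset;
    }
}
```

With this interpolation, the physical and virtual hands coincide at the start of the reach, and the virtual hand lands exactly on the virtual target when the physical hand reaches the physical target.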
You can also redirect more than one limb. This is useful if you want the user to use both hands, or if you track more than the hand position, such as the wrist or elbow. Simply add a physical limb row in the "User Limbs" variable of the main component, assign the tracked object to it, and set the avatar accordingly in the virtual limb array.
We cover two SDKs here: Meta XR and OpenXR. You should be able to adapt these instructions to other SDKs.
We set up all our examples using Meta XR's hand tracking functionality. For hand redirection, to properly redirect the user's hands and get both visual redirection and collisions, you need to assign two GameObjects for each physical hand (see the script sketch after this list):
- The physical hand is tracked by `OVRCameraRig/TrackingSpace/RightHandAnchor`,
- The visual hand location is set by `OVRCameraRig/OVRInteraction/OVRHands/(Left|Right)Hand/HandVisuals(Left|Right)/OVR(Left|Right)HandVisual/OculusHand_(L|R)`,
- The collisions use `OVRCameraRig/TrackingSpace/(Left|Right)HandAnchor/OVRHandPrefab`.
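If you prefer to wire these references from a script rather than through the Inspector, the sketch below locates the three right-hand GameObjects by the paths listed above. This is only an illustration: the hierarchy names must match your own scene, and `Transform.Find` returns null if they do not.

```csharp
using UnityEngine;

// Hedged sketch: resolving the three hand-related GameObjects under the OVRCameraRig
// hierarchy by path, for the right hand, assuming the default Meta XR naming shown
// in the list above. Normally you would assign these references in the Inspector.
public class MetaXRHandReferences : MonoBehaviour {
    public Transform cameraRig;     // root OVRCameraRig transform

    public Transform physicalHand;  // tracked anchor
    public Transform visualHand;    // hand mesh whose position is redirected
    public Transform collisionHand; // hand used for physics collisions

    void Awake() {
        physicalHand  = cameraRig.Find("TrackingSpace/RightHandAnchor");
        visualHand    = cameraRig.Find("OVRInteraction/OVRHands/RightHand/HandVisualsRight/OVRRightHandVisual/OculusHand_R");
        collisionHand = cameraRig.Find("TrackingSpace/RightHandAnchor/OVRHandPrefab");
    }
}
```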
For world redirection, setting up the hands is essential, but you also need to set the physical and virtual heads. The physical head is `OVRCameraRig/TrackingSpace/CenterEyeAnchor`, and the virtual head can be any regular Unity camera. Note that you cannot disable the camera component attached to `CenterEyeAnchor`, since the SDK re-enables it when entering Play Mode. A quick workaround is to change the display output of this camera, for example from Display 1 to Display 8 (sketched below).
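If you prefer to script this workaround rather than change the Inspector field by hand, a minimal sketch might look as follows, assuming the component is attached to `CenterEyeAnchor`. Note that `Camera.targetDisplay` is zero-based, so the value 7 corresponds to Display 8 in the Inspector.

```csharp
using UnityEngine;

// Minimal sketch of the workaround described above: the camera on CenterEyeAnchor
// cannot stay disabled (the SDK re-enables it), so its output is sent to an unused
// display instead. Attach this component to CenterEyeAnchor.
public class CenterEyeCameraWorkaround : MonoBehaviour {
    void Start() {
        Camera centerEyeCamera = GetComponent<Camera>();
        // Camera.targetDisplay is zero-based: 7 selects "Display 8" in the Inspector.
        centerEyeCamera.targetDisplay = 7;
    }
}
```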
OpenXR: TODO
Authors: Benoît Geslain, Bruno Jartoux