timojohn77/VRI (forked from dezelyu/VRI)

VRInteraction – A Unity 6 OpenXR toolkit for VR input, haptic feedback, and real-time controller data streaming to the Editor.

Overview

VRInteraction is a minimal toolkit for Unity 6 that handles VR input processing and haptic feedback. It also enables real-time input synchronization between a standalone VR headset and the Unity Editor. This allows developers to test and debug VR interactions directly in the editor, without building the project or relying on platform-specific solutions. This project was developed by Deze Lyu.

License

This project is licensed under the Creative Commons Attribution 4.0 International License (CC BY 4.0). It allows sharing, modification, and commercial use, provided that clear and appropriate credit is given to the original author, Deze Lyu. Redistribution or derivative works must not misrepresent the original source or imply authorship by anyone other than the original creator.

Quick Setup

Open the project using Unity 6. This is a Built-in Render Pipeline project with a minimal setup, designed for fast building and deployment to standalone VR headsets. To test the project:

  1. Open the Build Profiles window by going to File > Build Profiles in the top menu.
  2. In the Build Profiles window, select your target platform, such as Meta Quest.
  3. Click Switch Platform to make the selected platform active.
  4. For standalone headsets such as Meta Quest, this activates the Android platform. In the Build Profiles window, click the currently active platform to view its settings.
  5. Under the Run Device section, select the connected headset device from the list.
  6. Click Build and Run to deploy the project to the headset.
  7. Once launched in the VR headset, you should see:
    • A floating box in front of your view, representing the headset’s orientation.
    • Two gun-shaped meshes attached to your controllers, indicating their position and rotation.

To test input synchronization between the headset and the Unity Editor:

  1. In the Unity Hierarchy, select the VRI GameObject.
  2. In the Inspector, locate the VRI (Script) component.
  3. Under the Mode dropdown, you'll find three available options:
    • Default: Uses the headset and controller input directly from the local device. This is the mode to use for in-headset testing and final builds.
    • Headset: Streams real-time tracking data (headset and controller positions, rotations, and input states) to the Unity Editor, and receives haptic feedback commands from the Editor.
    • Editor: Receives tracking data from the headset and controllers, and streams haptic feedback commands back to the VR headset.
  4. Select Headset mode in the VRI component. This sets the headset to act as the data sender.
  5. Ensure both your VR headset and the machine running the Unity Editor are connected to the same local network.
  6. Find the IP addresses of both devices and fill in the fields accordingly:
    • Address H: The IP address of the VR headset.
    • Address E: The IP address of the machine running the Unity Editor.
    • Port H and Port E: Specify two open UDP ports for communication.
  7. Build and deploy the modified project to your VR headset. The experience should look identical, but the system is now:
    • Streaming real-time headset and controller data (position, rotation, input values) to the Editor.
    • Listening for haptic feedback commands from the Editor, which will trigger vibrations on the controllers.
  8. After confirming the updated project builds and runs correctly on the headset, leave the headset project running throughout the session. You may need to disable the proximity sensor to prevent the headset from automatically going to sleep.
  9. In the Unity Editor, change the Mode in the VRI component to Editor. Leave all other fields (IP addresses and ports) unchanged.
  10. Click Play in the Unity Editor. If everything is set up correctly, moving the headset and controllers should result in corresponding movement within the editor scene.
    Congratulations — the system is working!

If you encounter issues:

  • Ensure Unity Editor has network permissions (check the system firewall settings).
  • Double-check that both the headset and the Editor machine are connected to the same local network.
  • Verify that the IP addresses and ports are entered correctly and consistently on both ends.

Integrating VRI

You can either use VRI as your primary interaction module—accessing input data and applying haptic feedback directly—or create a wrapper that bridges VRI with your own or third-party interaction logic. Integration is simple:

  1. Import the entire VRI folder into your project without altering its internal structure.
  2. In a scene with an XR Origin already set up, add an empty GameObject and attach the VRI script.
  3. To visualize controller input in the scene, create two separate GameObjects under XR Origin (VR) > Camera Offset to represent the left and right controllers. Then, assign these objects to the Controller L and Controller R fields of the VRI component.
  4. Similarly, create a separate Headset GameObject under Camera Offset, and assign it to the Headset field in the VRI component. Do not assign the XR camera itself. While this node will always follow the position and rotation of the primary XR camera, having a dedicated transform allows for easier referencing, debugging, and attaching objects to the player's view in the scene.

This configuration allows the camera, headset, and controller nodes in your scene to move in real time based on the physical movements and inputs from the VR headset and controllers.
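If you prefer to keep VRI behind your own interaction layer, a wrapper can poll VRI once per frame and translate raw values into events for your interaction logic. The sketch below is illustrative rather than part of VRI: the class name, grip threshold, and event hooks are hypothetical, and it assumes the static VRI properties and feedback methods described in the following sections.

```csharp
using UnityEngine;

// Hypothetical wrapper bridging VRI to custom interaction logic.
public class GrabWrapper : MonoBehaviour {

    // illustrative threshold above which the grip counts as "held"
    private const float GripThreshold = 0.5f;

    private bool Grabbing = false;

    private void Update() {
        bool held = VRI.GripR > GripThreshold;
        if (held && !this.Grabbing) {
            this.Grabbing = true;

            // forward a grab event to your own system here,
            // e.g. pick up the object nearest to VRI.PositionR,
            // and confirm with a short haptic pulse
            VRI.FeedbackR(0.5f, 0.1f);
        } else if (!held && this.Grabbing) {
            this.Grabbing = false;

            // release logic goes here
        }
    }
}
```

Because the wrapper reads only static properties, it works unchanged in Default and Editor modes.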

In addition to syncing transforms, the VRI class provides direct access to various input values through static properties:

  • VRI.PositionH / VRI.RotationH — Headset position and rotation
  • VRI.PositionL / VRI.RotationL — Left controller position and rotation
  • VRI.PositionR / VRI.RotationR — Right controller position and rotation
  • VRI.StickL / VRI.StickR — Left and right thumbstick (joystick) input as Vector2
  • VRI.TriggerL / VRI.TriggerR — Left and right trigger pressure (float, 0.0f to 1.0f)
  • VRI.GripL / VRI.GripR — Left and right grip pressure (float, 0.0f to 1.0f)
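As a minimal sketch, these properties can be polled from any MonoBehaviour's Update method; the class name and logging condition below are illustrative:

```csharp
using UnityEngine;

public class InputLogger : MonoBehaviour {

    private void Update() {

        // read the left thumbstick and trigger each frame
        Vector2 stick = VRI.StickL;
        float trigger = VRI.TriggerL;

        // log only while the trigger is pulled to avoid console spam
        if (trigger > 0.0f) {
            Debug.Log("Stick: " + stick + ", Trigger: " + trigger + ", Head: " + VRI.PositionH);
        }
    }
}
```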

Note: VRI does not handle platform-specific inputs like primary/secondary buttons, which can vary across devices. To add support for such inputs, create new input actions and bindings in VRI/Resources/VRI.inputactions, and update the VRI/VRI.cs script by referencing how existing inputs are handled and streamed.

In addition to input processing, VRI supports haptic impulse (controller vibration) feedback. To trigger a vibration on the controllers, simply call:

  • VRI.FeedbackL(float Intensity, float Duration) for the left controller
  • VRI.FeedbackR(float Intensity, float Duration) for the right controller

The Intensity parameter should be between 0.0f and 1.0f, and Duration specifies how long the vibration lasts, in seconds.

Note: For continuous or long-duration feedback, either call the function repeatedly with short intervals and shorter durations, or call it once with a long duration and later override it by calling the same method with an intensity of 0.0f to stop the vibration.
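The repeated-pulse approach could look like the sketch below; the pulse length and re-issue interval are illustrative values, not requirements of VRI.

```csharp
using UnityEngine;

// Keeps the left controller vibrating while Active is true by re-issuing
// short impulses before the previous one ends.
public class ContinuousFeedback : MonoBehaviour {

    public bool Active = false;

    private float Timer = 0.0f;

    private void Update() {
        if (this.Active) {

            // re-issue a 0.1s pulse every 0.05s so pulses overlap seamlessly
            this.Timer -= Time.deltaTime;
            if (this.Timer <= 0.0f) {
                VRI.FeedbackL(1.0f, 0.1f);
                this.Timer = 0.05f;
            }
        } else if (this.Timer > 0.0f) {

            // stop any pending vibration by overriding with zero intensity
            this.Timer = 0.0f;
            VRI.FeedbackL(0.0f, 0.0f);
        }
    }
}
```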

VRI behaves consistently across all modes. In both Default and Editor modes, it provides the same spatial and input data from the headset and controllers, and applies haptic feedback in the same manner. This ensures a unified interaction experience regardless of the deployment or testing environment.

Example

To demonstrate how to read input data and apply haptic feedback using VRI, you can follow this quick example:

  1. Create a new C# script in your project and name it Example.cs.
  2. Copy and paste the following code into the script file:
using UnityEngine;
public class Example : MonoBehaviour {
    
    // declare the indicators for whether the left and right triggers have been activated
    private bool TriggerL = false;
    private bool TriggerR = false;
    
    private void Update() {
        
        // reset the indicator when the left trigger was activated but has been fully released
        if (this.TriggerL) {
            if (VRI.TriggerL == 0.0f) {
                this.TriggerL = false;
            }
        } else {
            
            // apply a haptic impulse to the left controller upon new trigger activation
            if (VRI.TriggerL > 0.5f) {
                this.TriggerL = true;
                VRI.FeedbackL(1.0f, 1.0f);
            }
        }
        
        // reset the indicator when the right trigger was activated but has been fully released
        if (this.TriggerR) {
            if (VRI.TriggerR == 0.0f) {
                this.TriggerR = false;
            }
        } else {
            
            // apply a haptic impulse to the right controller upon new trigger activation
            if (VRI.TriggerR > 0.5f) {
                this.TriggerR = true;
                VRI.FeedbackR(1.0f, 1.0f);
            }
        }
        
        // check if both grips are held down and compare the heights of the controllers
        if (VRI.GripL > 0.5f && VRI.GripR > 0.5f) {
            if (VRI.PositionL.y > VRI.PositionR.y) {
                
                // apply a haptic impulse to the left controller
                VRI.FeedbackL(1.0f, 0.0f);
            } else {
                
                // apply a haptic impulse to the right controller
                VRI.FeedbackR(1.0f, 0.0f);
            }
        }
    }
}
  3. In a scene that includes an XR Origin and the VRI component, create an empty GameObject and attach the Example script to it. You can also use the sample scene provided in the project to test this example.
  4. When the project is deployed to your headset with the VRI component’s Mode set to Default, pressing a controller’s trigger will cause it to vibrate briefly for one second. If both grips are held simultaneously, the controller positioned higher will continuously vibrate at full intensity.
  5. To test the example in the Unity Editor, ensure that a VRI instance is already running on your headset, configured as described in the Quick Setup section. You should not build and deploy the project to the headset again with the example script included and VRI set to Headset mode, as this would cause both the headset and the Unity Editor to execute the same logic simultaneously, making it unclear where input or haptic feedback originated. The headset should run VRI in Headset mode only, acting as a clean server without any additional logic.
  6. Finally, switch the VRI component’s Mode to Editor and press the Play button in Unity. If everything is configured correctly, pulling a trigger will generate vibration on the corresponding controller, and when both grips are held, the higher controller will vibrate continuously—demonstrating consistent behavior.
