A WebXR starter template built on three.js and Spark for web / VR experiences that use Gaussian splats as the background.
You can see a hosted version here
- WebXR support with hand tracking
- Gaussian Splat rendering via Spark with LoD to support truly massive scenes
- Stereoscopic audio and spatial audio triggers
- HUD overlay with position/FPS display
- Mesh object/character insertion with lighting placement
- Physics and Collisions via Rapier
- Dynamic object support: kick and throw objects using the keyboard or VR hand tracking
```bash
# Install dependencies
npm install

# Start development server
npm run dev

# Build for production
npm run build
```

```
├── index.html            # Main HTML entry point
├── main.js               # Main file
├── scene.js              # Scene creation and animation loop
├── audio.js              # Background audio management
├── spatial-audio.js      # 3D positional audio system
├── lighting.js           # Lighting system and configuration
├── objects.js            # Mesh object loading from config
├── collisions.js         # Rapier physics world and collision mesh handling
├── object-actions.js     # Object interactions (kick, throw)
├── throw-hand.js         # VR hand tracking for grabbing/throwing objects
├── robot.js              # Robot/drone mesh loading and waypoint navigation
├── hud.js                # HUD overlay display for debugging
├── progress.js           # Loading progress overlay
├── sdf-hand.js           # SDF hand tracking visualization
├── path.js               # Path markers with SDF ground highlights and toggle UI
├── assets.js             # Asset URL resolution (local/CDN fallback)
├── config.js             # Configuration parameters
├── vite.config.js        # Vite configuration
├── netlify.toml          # Netlify deployment configuration
├── scenes/               # Scene-specific configs (per scene)
│   └── <scene-name>/
│       ├── config.js             # Scene settings and flags
│       ├── audio-config.json     # Spatial audio sources for the scene
│       ├── lighting-config.json  # Lighting setup for the scene
│       ├── objects-config.json   # Dynamic/static objects for the scene
│       ├── robot-config.json     # Robot/waypoint config (optional)
│       └── path-config.json      # Path waypoints & highlight offsets (optional)
└── public/
    └── scenes/           # Scene-local assets served statically
        └── <scene-name>/
            ├── assets/        # Scene-local binaries/textures/audio
            ├── *.mp3          # Scene audio
            ├── *.glb / *.fbx  # Scene meshes
            └── *.spz          # Scene splat files
```
Assets are loaded using a fallback system:
- First checks `public/assets/` for local files
- Falls back to the CDN if the file is not found locally (currently configured to use my public bucket on Tigris)

This allows local development with custom assets while using hosted assets in production.
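The local-first lookup can be sketched as follows. This is an illustration, not the actual code in `assets.js`: the function name, the `localExists` predicate (a HEAD-request check in a real app), and the CDN base URL are all assumptions.

```javascript
// Placeholder CDN base; the real project points at a Tigris bucket.
const CDN_BASE = "https://example-cdn.example.com";

// Resolve an asset path: prefer the local copy, fall back to the CDN.
// `localExists` is a caller-supplied predicate so this stays testable.
function resolveAssetUrl(relativePath, localExists) {
  const localUrl = `/assets/${relativePath}`;
  if (localExists(localUrl)) return localUrl;
  return `${CDN_BASE}/assets/${relativePath}`;
}
```

In the browser the existence check would typically be an async `fetch(url, { method: "HEAD" })`; the predicate form above keeps the fallback logic itself synchronous and easy to test.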
The spatial audio system places 3D positional audio sources in the scene. Sources can play continuously or be triggered once when the user comes nearby. Configure sources in `public/assets/audio-config.json`:
```json
[
  {
    "audio_url": "my-sound.mp3",
    "audio_position": [1.0, 2.0, -3.0],
    "falloff": {
      "refDistance": 5,
      "rolloffFactor": 1,
      "maxDistance": 50,
      "volume": 0.8,
      "loop": true
    }
  },
  {
    "audio_url": "trigger-sound.mp3",
    "audio_position": [0, 1, 0],
    "triggerRadius": 2,
    "falloff": {
      "loop": false
    }
  }
]
```

| Property | Description | Default |
|---|---|---|
| `audio_url` | Path to audio file (resolved via the asset system) | required |
| `audio_position` | `[x, y, z]` position in the scene | required |
| `falloff.refDistance` | Distance at which volume is 100% | 5 |
| `falloff.rolloffFactor` | How quickly sound fades with distance | 1 |
| `falloff.maxDistance` | Maximum audible distance | 50 |
| `falloff.volume` | Base volume (0-1) | 1 |
| `falloff.loop` | Whether the audio loops | true |
| `triggerRadius` | Proximity radius that triggers non-looping audio | null |
- Looping sources play automatically when audio is enabled
- Triggered sources (with `triggerRadius`) play once when the user enters the radius
- Debug visualization: enable the HUD to see red wireframe spheres at audio source locations
- Audio sources respond to the global audio toggle (on/off)
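To build intuition for the falloff fields, here is the inverse-distance gain model that Web Audio (and hence three.js's `PositionalAudio`) uses by default. The formula is standard; treating it as exactly what this project configures is an assumption.

```javascript
// Perceived gain under the Web Audio "inverse" distance model.
// Full volume inside refDistance, then a 1/distance-style falloff.
function inverseDistanceGain(distance, { refDistance = 5, rolloffFactor = 1, volume = 1 } = {}) {
  const d = Math.max(distance, refDistance); // clamp: no boost closer than refDistance
  const gain = refDistance / (refDistance + rolloffFactor * (d - refDistance));
  return volume * gain;
}
```

With the table's defaults (`refDistance: 5`, `rolloffFactor: 1`), a listener 10 m from a source hears it at half volume; inside 5 m it plays at full volume.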
Load mesh objects (FBX, GLTF, GLB) from the config file `public/assets/objects-config.json`:
```json
[
  {
    "name": "soccerball",
    "model": "soccer_ball/soccer_ball.glb",
    "position": [-2.5, 8, -6],
    "scale": 0.3
  }
]
```

All objects are turned into dynamic physics objects and can be kicked or thrown.
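A loader for this config would typically validate and default each entry before spawning meshes. This normalizer is a hypothetical sketch; the default position/scale values are assumptions, not taken from `objects.js`.

```javascript
// Validate one objects-config.json entry and fill in assumed defaults.
function normalizeObjectEntry(entry) {
  if (!entry.name || !entry.model) {
    throw new Error("object entry needs a name and a model path");
  }
  return {
    name: entry.name,
    model: entry.model,                 // resolved via the asset fallback system
    position: entry.position ?? [0, 0, 0],
    scale: entry.scale ?? 1.0,
  };
}
```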
The code includes a floating drone as an example of mesh integration. The drone's waypoints are loaded from the config file `public/assets/robot-config.json`.
The lighting system places three.js lights in the scene based on the config file `public/assets/lighting-config.json`:
```json
[
  {
    "name": "ambient",
    "type": "ambient",
    "color": "#ffffff",
    "intensity": 0.5
  },
  {
    "name": "sun",
    "type": "directional",
    "color": "#ffffff",
    "intensity": 1.0,
    "position": [5, 10, 5],
    "target": [0, 0, 0],
    "castShadow": true,
    "shadowMapSize": 1024
  },
  {
    "name": "lamp",
    "type": "point",
    "color": "#ffffaa",
    "intensity": 2.0,
    "position": [-7, 7, -10],
    "distance": 0,
    "decay": 2
  }
]
```

| Type | Description | Required Properties | Optional Properties |
|---|---|---|---|
| `ambient` | Overall scene illumination (no position) | `color`, `intensity` | - |
| `directional` | Sun-like parallel rays | `color`, `intensity` | `position`, `target`, `castShadow`, `shadowMapSize` |
| `point` | Light bulb, radiates in all directions | `color`, `intensity`, `position` | `distance`, `decay`, `castShadow` |
| `spot` | Focused cone of light | `color`, `intensity`, `position` | `target`, `distance`, `angle`, `penumbra`, `decay`, `castShadow` |
| `hemisphere` | Sky/ground gradient lighting | `color`, `intensity` | `groundColor`, `position` |
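The required-property rules in the table can be encoded as a small validator. This is a sketch mirroring the table, not the project's actual validation code in `lighting.js`.

```javascript
// Required config fields per light type, matching the table above.
const REQUIRED_BY_TYPE = {
  ambient: ["color", "intensity"],
  directional: ["color", "intensity"],
  point: ["color", "intensity", "position"],
  spot: ["color", "intensity", "position"],
  hemisphere: ["color", "intensity"],
};

// Throw a descriptive error if a light config entry is missing required fields.
function validateLight(cfg) {
  const required = REQUIRED_BY_TYPE[cfg.type];
  if (!required) throw new Error(`unknown light type: ${cfg.type}`);
  const missing = required.filter((key) => cfg[key] === undefined);
  if (missing.length > 0) {
    throw new Error(`${cfg.type} light missing: ${missing.join(", ")}`);
  }
  return cfg;
}
```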
The project uses Rapier for physics simulation.
All objects from `objects-config.json` automatically become dynamic physics objects with:
- Sphere colliders based on bounding box
- Configurable mass, restitution (bounciness), and friction
- Real-time physics simulation (gravity, bouncing, rolling)
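One plausible way to derive a sphere collider from a mesh's axis-aligned bounding box is to take half of the largest extent, so the sphere covers the object's widest dimension. The project may compute the radius differently (e.g. half the box diagonal); treat this as an illustration.

```javascript
// Sphere collider radius from an AABB given as [minX,minY,minZ], [maxX,maxY,maxZ].
function sphereRadiusFromBounds(min, max) {
  const extents = [max[0] - min[0], max[1] - min[1], max[2] - min[2]];
  return Math.max(...extents) / 2; // half the largest dimension
}
```

In Rapier, this radius would then be passed to a ball collider (e.g. `RAPIER.ColliderDesc.ball(radius)`) attached to a dynamic rigid body.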
Keyboard Controls:

| Key | Action |
|---|---|
| `K` | Kick - push nearby objects away from the camera with a low trajectory |
| `T` | Throw - launch nearby objects in the direction you're looking with a high arc |
Both actions only affect objects within 5 meters of the camera. Force scales with distance (closer = stronger).
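The "closer = stronger" scaling could be as simple as a linear falloff over the 5 m range. The 5 m cutoff comes from the text above; the linear shape and the `MAX_FORCE` value are assumptions for illustration.

```javascript
const MAX_RANGE = 5;   // metres, per the note above
const MAX_FORCE = 10;  // arbitrary impulse magnitude for this sketch

// Impulse magnitude for a kick/throw: strongest at point-blank, zero at 5 m.
function actionImpulse(distance) {
  if (distance > MAX_RANGE) return 0;            // out of range: no effect
  return MAX_FORCE * (1 - distance / MAX_RANGE); // linear falloff with distance
}
```

The resulting magnitude would be combined with a direction vector (low trajectory for kick, high arc for throw) and applied as an impulse to the Rapier rigid body.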
VR Hand Tracking:
- Pinch gesture (thumb + index finger) near an object to grab it
- Move your hand while pinching to carry the object
- Release the pinch to throw - velocity is calculated from hand movement
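Release velocity can be estimated by finite differences over the last few tracked hand positions. The sampling window and sample shape here are assumptions, not the actual implementation in `throw-hand.js`.

```javascript
// Estimate throw velocity from recent hand samples, oldest first:
// samples = [{ position: [x, y, z], time: seconds }, ...]
function estimateThrowVelocity(samples) {
  if (samples.length < 2) return [0, 0, 0];
  const first = samples[0];
  const last = samples[samples.length - 1];
  const dt = last.time - first.time;
  if (dt <= 0) return [0, 0, 0]; // guard against bad timestamps
  // Average velocity over the window: (Δposition) / (Δtime)
  return last.position.map((p, i) => (p - first.position[i]) / dt);
}
```

Averaging over a short window (a handful of frames) rather than a single frame pair smooths out hand-tracking jitter at release time.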
This project is easily deployable to any static hosting site (e.g. Netlify). I prefer to deploy directly rather than going through git, so the workflow I use is as follows:

1. Install the Netlify CLI (if you haven't already):

   ```bash
   npm install -g netlify-cli
   ```

2. Log in to Netlify:

   ```bash
   netlify login
   ```

   This opens your browser to authenticate with your Netlify account.

3. Deploy to production:

   ```bash
   netlify deploy --prod
   ```

   On your first deployment, Netlify will ask whether you want to create a new site or link to an existing one:

   - Create a new site: choose this if you're deploying for the first time
   - Link to an existing site: choose this to update a site you've already deployed

That's it! After the deployment completes, you'll see your live URL (e.g., `https://your-site-name.netlify.app`).

The `netlify.toml` file in this project is already configured with the correct build settings, so Netlify will automatically:

- Run `npm run build`
- Deploy the `dist` folder
- Set up proper redirects for single-page applications

Note: each time you make changes and want to update your live site, just run `netlify deploy --prod` again from your project directory.
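A minimal `netlify.toml` consistent with the behavior described above (build command, `dist` publish directory, SPA redirect) would look like this. The project's actual file may contain additional settings.

```toml
[build]
  command = "npm run build"
  publish = "dist"

# Serve index.html for all routes, as single-page apps require
[[redirects]]
  from = "/*"
  to = "/index.html"
  status = 200
```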
- Three.js - 3D rendering
- Spark - Gaussian Splat rendering and WebXR utilities
- Rapier - Physics engine (via @dimforge/rapier3d-compat)
- Vite - Build tooling
- Lucide - UI icons
Based on code written by Winnie Lin
MIT
