Sandcastle was created with the intent of enabling easy content creation for the Immersive Web. It does not, and will never aim to, replace threejs or AFrame - rather, it's a set of useful components and opinionated design choices that lend themselves to quick multi-app WebXR bootstrapping with powerful features. It aims to let creators jump into building immersive content on the web without having to worry about things like XR input, event subscription, transmitting WebRTC packets, manually implementing collision detection, and so forth. Sandcastle is also inspired by the fearless, empowering creative spirit of OpenFrameworks and owes it and its creators a debt of gratitude.
Several design choices are made with the Webaverse ecosystem in mind; for example, the renderer defaults to a transparent background for easy AR-in-VR app creation. You can certainly use Sandcastle for any other purpose as well, but familiarizing yourself with the Webaverse initiative is a good idea if you want to understand where Sandcastle comes from and the directions in which it intends to grow.
Sandcastle is firmly rooted in threejs and does not aim to reinvent any wheels. Therefore, if you're unfamiliar with threejs, learning how that framework works will serve you greatly in onboarding onto Sandcastle, too.
Additionally, if you have experience with a realtime 3D engine like Unity, UE4 or Godot, you'll find that much of your thinking translates well to Sandcastle. Indeed, it was created with those paradigms in mind and aims to ease the pains of onboarding to WebXR (and, indeed, JavaScript) by providing familiar mental models as much as possible.
Sandcastle is very much a work in progress and while best efforts will be made to keep this wiki up to date, some information here may change over time.
As of the time of writing, WebXR is a promising new standard for the immersive web that isn't without its growing pains. While eventual mainstream browser adoption is all but guaranteed, the current state of the art presents a few pitfalls and gotchas around experiencing WebXR content on the web. (This problem isn't unique to Sandcastle or any other library for that matter.) It's important to understand that hardware type, operating system and browser choice can all affect the chances of success and finding a path forward can sometimes be nontrivial.
- As a good sanity check before any WebXR work, always verify you can run the WebXR Samples on your chosen platforms. These are the authoritative code samples of the spec - written by its authors - and if they don't work for you, chances are nothing else will, either.
- As of the time of writing, Firefox only supports WebVR, the previous standard for spatial computing on the web, and will still seem to support many WebXR experiences through polyfills that "translate" the code to WebVR. This isn't a safe, future-proof practice, and as of the time of writing you're probably better off using another browser.
- Chrome is fairly advanced in its WebXR support, but not without hiccups. For example, support for the Oculus runtime has recently been dropped in favor of the OpenXR runtime, which Oculus will roll out soon (but hasn't yet), and it is currently unclear whether it's possible to run WebXR on Chrome via OpenVR (i.e. SteamVR) as well. As a result, using WebXR on latest Chrome/Desktop/Oculus currently requires launching Chrome with specific flags.
- Metachromium is a new spatial-first browser which enables not just WebXR but actual AR-in-VR (which Sandcastle supports by default by rendering a transparent background). Ecosystem-wise, it's part of the Webaverse initiative. Iterating on Sandcastle projects with Metachromium is currently the most robust, hassle-free and effective way of working on desktop WebXR I know of. You can download it here or join the Exokit Discord to request a free Steam key.
- Finally, Firefox and Chrome have a useful WebXR browser extension that emulates the basic features of many of the popular headsets currently on the market. Note that the implementation is partial; in particular, Gamepad API input data (controller joysticks and buttons) is nonexistent. While you'll get headset and controller position, rotation and some events, you won't be able to simulate the Gamepad API (buttons/joysticks), and you'll probably get false negatives if you poll for those.
- Currently, WebXR works pretty much out of the box on the Oculus Browser and Firefox Reality. It may work well on other mobile XR browsers as well - if you're interested in any particular combo of hardware, environment and browser, you're advised to look up recent information about it as the state of WebXR is constantly in flux.
If you're familiar with threejs, you're basically already familiar with Sandcastle. Aside from simple, intuitive modularization, a Sandcastle app is essentially a WebXR-ready threejs app with a few extra features.
Sandcastle's guts fundamentally live in `src/components/engine/`. A good place to start grokking Sandcastle is `./src/components/engine/engine.js`. This is where the main render loop lives, updating rendering, physics and XR input every frame, as well as calling any function named `Update()` on any scene object (note the capital U in `Update()`, which disambiguates your custom update methods from the several update methods already built into threejs). Engine also exports the Camera, which is the one you'll want to think of as your "main XR Camera". Note: despite known difficulties, Sandcastle lets you access and manipulate the camera's position and rotation directly, just as you would any other object.
Modifying `engine.js` directly is inadvisable - you can create your own `Update()` methods wherever you want and they'll be called every frame.
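The per-frame `Update()` convention can be sketched in plain JavaScript. Everything below is an illustrative stand-in (the class, the traversal helper and the spinning cube are hypothetical, not Sandcastle's actual engine code); it only shows the shape of the contract: each frame, the engine walks the scene and calls `Update()` on any object that defines it.

```javascript
// A minimal stand-in for a scene graph node.
class SceneObject {
  constructor(name) {
    this.name = name;
    this.children = [];
  }
  add(child) {
    this.children.push(child);
    return this;
  }
}

// Walk the graph and invoke Update() wherever it exists -
// conceptually what the engine's render loop does every frame.
function runUpdates(root) {
  if (typeof root.Update === "function") root.Update();
  root.children.forEach(runUpdates);
}

// Example: a cube that spins a little every frame.
const scene = new SceneObject("scene");
const cube = new SceneObject("cube");
cube.rotationY = 0;
cube.Update = function () {
  this.rotationY += 0.01; // runs once per frame
};
scene.add(cube);

// Simulate three rendered frames.
for (let i = 0; i < 3; i++) runUpdates(scene);
console.log(cube.rotationY.toFixed(2)); // "0.03"
```

In a real Sandcastle app, you'd simply define `Update()` on your own scene objects and let the engine handle the traversal.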
Beyond that, things live where you'd expect them: physics in `physics.js`, XR input in `xrinput.js`, and a state manager and custom event handler in `state.js`. If you ever want to modify any inherent Sandcastle behavior (like rendering transparent backgrounds), the `./engine/` folder is the first place to look.
`engine.js` also implements an "Engine Editor Camera" of the kind you'd expect in a game engine (for easy scene traversal when working on your app), and decides which scene is loaded into the app. It defaults to using `./src/scenes/defaultScene`.
`./src/components/engine/util/webxr/SessionHandler.js` is a slight modification of threejs's "VR Button" that handles graceful entry into and exit from WebXR, as well as custom State event dispatching - namely `xrsessionstarted`/`xrsessionended` and `inputsourceschange`. This allows you to register listeners for those events anywhere in Sandcastle simply by importing the `State` singleton (see `xrinput.js` to understand how the input component utilizes these events).
The main motivation behind this component is providing out-of-the-box XR input functionality via an `XRInput` singleton that you can subscribe to for events and poll for gamepad (joystick/button) data. You'll note that when `State.debug` is true, it logs all input data to the console.
The following events can currently be subscribed to:
- `XRInput.onSelectStart()`
- `XRInput.onSelectEnd()`
- `XRInput.onSelect()`
- `XRInput.onSqueezeStart()`
- `XRInput.onSqueezeEnd()`
- `XRInput.onSqueeze()`
- `XRInput.connected()`
- `XRInput.disconnected()`
See the WebXR Input Explainer for an overview of these events.
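As a rough sketch of how such an input singleton might be wired up: the mock `XRInput` object and the `simulateNativeEvent` helper below are hypothetical stand-ins (not Sandcastle's real implementation), but they illustrate the pattern of assigning callbacks that the input component invokes when the corresponding native WebXR events fire.

```javascript
// Hypothetical mock mirroring the event names listed above.
const XRInput = {
  onSelectStart: null,
  onSelectEnd: null,
  onSelect: null,
};

// Your app assigns callbacks for the events it cares about...
let grabbing = false;
XRInput.onSelectStart = () => { grabbing = true; };
XRInput.onSelectEnd = () => { grabbing = false; };

// ...and the input component invokes them when the XR session
// fires the corresponding native event.
function simulateNativeEvent(name) {
  const handler = XRInput["on" + name];
  if (typeof handler === "function") handler();
}

simulateNativeEvent("SelectStart");
const grabbedOnStart = grabbing; // true at this point
simulateNativeEvent("SelectEnd");
console.log(grabbedOnStart, grabbing); // true false
```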
On `inputsourceschange` (i.e. when an XR session is initiated), `XRInput.inputSources` is populated with the session's input sources. You have direct access to the controllers and their controllerGrips (see the threejs docs to learn about the difference) via:
- `XRInput.leftController`
- `XRInput.rightController`
- `XRInput.leftControllerGrip`
- `XRInput.rightControllerGrip`
You can find an example of gamepad data polling in `XRInput.js`'s `debugOutput` method. Again, it is recommended to work in your scene (or another component) and subscribe to events / poll data from there, rather than in the engine's innards - see the `pongxr` example for a reference.
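The polling pattern itself can be sketched as follows. The mock input sources below are hypothetical stand-ins for real WebXR `XRInputSource` objects, but the `gamepad.axes`/`gamepad.buttons` shape matches the Gamepad API the text refers to:

```javascript
// Hypothetical stand-in for XRInput.inputSources after inputsourceschange.
const XRInput = {
  inputSources: [
    {
      handedness: "left",
      gamepad: { axes: [0, 0, 0.5, -0.25], buttons: [{ pressed: true, value: 1 }] },
    },
    {
      handedness: "right",
      gamepad: { axes: [0, 0, -0.1, 0.9], buttons: [{ pressed: false, value: 0 }] },
    },
  ],
};

// Poll every frame (e.g. inside an Update() method): read thumbstick
// axes and button state for each connected controller.
function pollGamepads() {
  const out = {};
  for (const source of XRInput.inputSources) {
    if (!source.gamepad) continue; // some sources (e.g. gaze) carry no gamepad
    out[source.handedness] = {
      // on most XR controllers the thumbstick lives at axes[2]/axes[3]
      stickX: source.gamepad.axes[2],
      stickY: source.gamepad.axes[3],
      triggerPressed: source.gamepad.buttons[0].pressed,
    };
  }
  return out;
}

const state = pollGamepads();
console.log(state.left.triggerPressed); // true
```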
Sandcastle ships with `SCRaycaster`, a raycasting helper which abstracts away some of the nitty-gritty of threejs's Raycaster and accommodates common WebXR use cases. Grab it by importing it:
import { SCRaycaster } from "../engine/util/webxr/raycaster.ts";
and declaring an instance with the necessary params:
const _SCRaycaster = new SCRaycaster(originObject, target, direction, isRecursive, near, far)

- `originObject` - the object to raycast from.
- `target` - the object(s) to raycast to. Can be a mesh, a mesh array or a Box3.
- `direction` - the normalized direction vector of the ray. Defaults to `new Vector3(0, 0, -1)`.
- `isRecursive` - if true, also checks all descendants; otherwise it only checks intersection with the object itself. Default is true.
- `near` - all results returned are further away than `near`. Can't be negative. Default is 0.1.
- `far` - all results returned are closer than `far`. Can't be lower than `near`. Default is 10.
You then typically use it by calling `_SCRaycaster.getIntersections()`, which gets the intersection of the origin object with the target object or array. It is usually run within the update loop or as the result of an event. It returns a bool when intersecting against a Box3, and an array when intersecting against one or more scene objects.
Finally, you may visualize the rays by calling `_SCRaycaster.visualize(color = 0xffffff, onlyWhenHit = false)`. You may pass a different color as a hex value, and a bool determining whether the ray should be visualized only when a raycast hits the target or always.
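Under the hood, a Box3 intersection check of this kind boils down to a ray-vs-AABB "slab" test returning a bool. The following is a self-contained sketch of that test (plain objects stand in for three.js's `Vector3`/`Box3`; this is illustrative, not SCRaycaster's actual code):

```javascript
// Ray/AABB slab test: intersect the ray's parameter range against each
// axis-aligned slab of the box; if the ranges stop overlapping, no hit.
function rayIntersectsBox(origin, direction, box, near = 0.1, far = 10) {
  let tMin = near;
  let tMax = far;
  for (const axis of ["x", "y", "z"]) {
    const inv = 1 / direction[axis];
    let t0 = (box.min[axis] - origin[axis]) * inv;
    let t1 = (box.max[axis] - origin[axis]) * inv;
    if (inv < 0) [t0, t1] = [t1, t0];
    tMin = Math.max(tMin, t0);
    tMax = Math.min(tMax, t1);
    if (tMax < tMin) return false; // slabs don't overlap: no hit
  }
  return true;
}

// A ray from the origin down -Z (the SCRaycaster default direction)
// against a unit box centered 5 units ahead:
const origin = { x: 0, y: 0, z: 0 };
const direction = { x: 0, y: 0, z: -1 };
const box = {
  min: { x: -0.5, y: -0.5, z: -5.5 },
  max: { x: 0.5, y: 0.5, z: -4.5 },
};
console.log(rayIntersectsBox(origin, direction, box)); // true
```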
As mentioned, `engine.js` creates an editor camera component to make your life easier. If you're familiar with Unity/UE4 controls, you already know the drill:
- Right click + WASDQE to move around the scene (Q/E move you up/down, respectively)
- Right click + shift + WASDQE move you twice as fast
- While moving, scroll wheel determines your movement speed
The camera position and rotation persist over refreshes and hot reloads, to simulate the "game engine feel" and comfort as much as possible.
You can maintain state however you want, but a global `State` singleton is available for you to import from `./src/components/engine/state.js`. Out of the box, you can query the following:
// a globals object for convenience. See the `pongxr` example for sample usage.
State.globals
// in a networked session, is this player's instance the master?
State.isMaster
// is an XR session currently in progress?
State.isXRSession
// an easy "pause" toggle to pause physics updates and/or anything else you'd want to implement
State.isPaused
// a reference to the current XR session
State.currentSession
// debug mode provides extensive logging for physics, networking and more, as well as the physics debug view. Defaults to toggling by shift+` (tilde)
State.debugMode
Additionally, `state.js` contains a basic event creation and handling system:
- `state.eventHandler.registerEvent("EVENTNAME")` to register a new event called `EVENTNAME`
- `state.eventHandler.dispatchEvent("EVENTNAME")` to fire the event
- `state.eventHandler.addEventListener("EVENTNAME", callback)` to subscribe to `EVENTNAME` with a callback function.
The following events are provided and can be subscribed to from anywhere:
// XR SESSION HANDLING
'xrsessionstarted'
'xrsessionended'
// INPUT
'inputsourceschange'
'selectend'
'selectstart'
'select'
'squeezeend'
'squeezestart'
'squeeze'
// NETWORKING
'peerconnected'
'peerdisconnected'
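The register/subscribe/dispatch flow can be sketched minimally as follows. This is an illustrative stand-in, not Sandcastle's actual `state.js` implementation:

```javascript
// Minimal event system of the kind state.js describes.
class EventHandler {
  constructor() {
    this.events = {};
  }
  registerEvent(name) {
    if (!this.events[name]) this.events[name] = [];
  }
  addEventListener(name, callback) {
    this.registerEvent(name); // auto-register for convenience
    this.events[name].push(callback);
  }
  dispatchEvent(name, payload) {
    (this.events[name] || []).forEach((cb) => cb(payload));
  }
}

const state = { eventHandler: new EventHandler() };

// Register and subscribe from anywhere that can import the singleton...
state.eventHandler.registerEvent("xrsessionstarted");
let sessionActive = false;
state.eventHandler.addEventListener("xrsessionstarted", () => {
  sessionActive = true;
});

// ...and fire the event when the session begins.
state.eventHandler.dispatchEvent("xrsessionstarted");
console.log(sessionActive); // true
```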
Sandcastle implements an abstraction layer around `cannonJS`, which lives in `src/components/engine/physics.js`. It's currently an early work in progress - for fancier stuff, you can always utilize or mod cannon directly. The main current call is:
Physics.addBody(mesh, rbShape, bodyType, mass = 1)
- `mesh` is a threejs mesh to compute and create a rigidbody for.
- `rbShape` is a faux-enum called `Physics.RigidBodyType`, which can be:
  - `Physics.RigidBodyType.Box`
  - `Physics.RigidBodyType.Sphere`
  - `Physics.RigidBodyType.Plane`
  - `Physics.RigidBodyType.Cylinder`
- `bodyType` is the `Physics.Body` faux-enum. It can be `Body.DYNAMIC` (a simulated object that moves around), `Body.STATIC` (a simulated object that stays in place) or `Body.KINEMATIC` (an externally controlled object, such as a controller or a programmatically moving platform). See the cannonJS docs for further information.
- `mass` is the desired object mass, which defaults to 1.
- Rigidbody dimensions will be autocomputed based on the mesh's bounding box.
Note that for shared (networked) objects, you'll want a single physics source of truth - essentially emulating the server/client model. Knowing which player is `master` is helpful here. See the `pongxr` example, and particularly `ball.js`, for an example of conditional rigidbody addition.
The Physics component ships with a useful physics debugger that lets you see the rigidbodies in your scene. To use it, call `Physics.enableDebugger(scene)`, passing your app's scene. Remember that you can switch to debug mode (which includes this debugger) using Shift+` at any time. (You can change this binding in `state.js`.)
The physics implementation is still very much a work in progress and contributions are welcome.
Sandcastle implements Takahiro's ThreeNetwork, an easy-to-use networking library for threejs. It is not intended for production or anything beyond simple shared user experiences, as it trades robust scalability and sound infrastructure for out-of-the-box ease of use. In its current implementation it relies on WebRTC via a dedicated Firebase-based signaling server, and enables the instant networking of threejs objects. The API relies fundamentally on `addLocalObject()`, `addRemoteObject()` and `addSharedObject()` - see Takahiro's original repo for more information, the `pongxr` example for a simple implementation, and `voicestreaming` for a very simple example of streaming audio.
As a general concept, you'll want to think in terms of `shared` objects and `local/remote` objects. The `pongxr` example provides instances of both: the "placement cube" is a shared object, while the controller-associated paddles are local/remote since they are user-specific. The ball could have been shared were it not for the conditional rigidbody logic mentioned in the physics section, and thus it is local/remote too.
import { PeerConnection } from '../../engine/networking/PeerConnection'
...
const scene = new Scene();
const networking = new PeerConnection(scene);
Then, when you wish to add a shared object - for example, `MyObject`:
networking.remoteSync.addSharedObject(MyObject);
// don't forget to also call
scene.add(MyObject);
Note that due to the nature of the networking, networked objects can't be created until an initial connection has been made to the signaling server. Therefore, you'll want to listen for the `open` event before initializing any shared component generation, i.e.:
networking.remoteSync.addEventListener("open", (e) =>
{
initPlacement();
});
You can look at `src/components/engine/networking/PeerConnection.js` to see what happens under the hood. For production-grade apps, it's highly recommended that you substitute the provided Firebase credentials, which are shared among all who clone Sandcastle.
On launch, a link to the shared experience appears at the bottom of the page. Anyone who opens the link has access to the same experience, with any components registered via `addSharedObject()` or `addLocalObject()`/`addRemoteObject()` being shared.
Note that you get the networked experience out of the box simply by creating a new PeerConnection object and passing your scene as an argument:
const networking = new PeerConnection(scene);
(of course, you can call it whatever you want).
You can then enter the website from any other tab or browser using the same 4-digit code suffixed to your URI to enter the shared experience. Easy as pie! (See below for testing networked experiences while developing on localhost.)
See Takahiro's original repo for more information and examples of `sharedObject` and `localObject`, and `src/scenes/pongxr/scene.js` for a simple example of using networked controllers with a networked ball.
Further API clarification and docs: still very much a work in progress.
If you wish to test a networked WebXR app, you'll need to serve it over HTTPS, since that is part of the spec's requirements (`navigator.xr` simply won't exist without it). The quickest path to get up and running is to install ngrok, a tunneling service (with a usable free tier) that can expose a locally served website to the external world over HTTP as well as HTTPS. (Note that this poses several security risks; you should only use it for testing.)
Simply run `npm i -g ngrok` to install ngrok globally, after which you'll be able to run `ngrok http 1234` to open a tunnel to port 1234, the local webpack dev server's port (make sure it is running, of course!). You'll see output along the lines of:
ngrok by @inconshreveable (Ctrl+C to quit)
Session Status online
Session Expires 7 hours, 59 minutes
Version 2.3.35
Region United States (us)
Web Interface http://127.0.0.1:4040
Forwarding http://15511a1da059.ngrok.io -> http://localhost:1234
Forwarding https://15511a1da059.ngrok.io -> http://localhost:1234
Connections ttl opn rt1 rt5 p50 p90
0 0 0.00 0.00 0.00 0.00
You can then test your work using the `https` link (in this case, `https://15511a1da059.ngrok.io`) from anywhere, have others join networked sessions, etc.
Sandcastle aims to simplify the asset integration pipeline. It automatically compresses image textures and will eventually further enable usage of compression standards like Draco and BASIS.
Your assets are automatically moved to relevant folders like `models` and `images` during development. Most importantly, use `require` wherever you need to reference an asset, for example:
const cloud1 = require("./assets/textures/1.jpg");
The webpack build process will take care of the rest come build time.
GLSL shaders deserve a special mention: Sandcastle can digest them easily without fancy string concats or other older workarounds. Simply `require` or `import` your `.glsl`, `.vs` or `.fs` file (make sure that the suffix is one of the above) and use it however you want.
This is a project in its very early stages. You're highly encouraged to experiment with it, take it for a spin, build some cool WebXR with it and help make it better for everyone. Please submit issues for anything you come across that is less than ideal. A current, partial list of TODOs:
- XR Package Integration
- Expose more of CannonJS in the Physics API
- Relocate physics to a web worker
- More examples
- Video tutorials
- More thorough documentation with implementation examples