Software for the NYU Future Reality Lab's WebXR-based XR experience.
Install Node.js and npm if you haven't. Then, in the command line, run:

```
npm install
cd server
npm install
source patch
```
- At the root folder, run `./startserver`
- Go to chrome://flags/ in your Google Chrome browser
- Search: "Insecure origins treated as secure" and enable the flag
- Add http://[your-computer's-ip-address]:2024 to the text box (for example, http://10.19.127.1:2024)
- Relaunch Chrome on your computer and go to http://localhost:2024
- Run the program locally on your computer
- Open the browser on your VR headset
- Go to chrome://flags/
- Search: "Insecure origins treated as secure" and enable the flag
- Add http://[your-computer's-ip-address]:2024 to the text box (for example, http://10.19.127.1:2024)
- Relaunch the browser on your VR headset and go to http://[your-computer's-ip-address]:2024
- In your Oculus app, go to Devices, select your headset from the device list, and wait for it to connect. Then select Developer Mode and turn it on.
- Connect your Quest to your computer using your Oculus Quest cable.
- Go to chrome://inspect#devices on your computer
- On your VR headset, accept the "Allow USB Debugging" prompt when it appears
- On chrome://inspect#devices on your computer, you should see your device under Remote Target along with its active pages. You can then inspect the NYU-FRL-XR window from your computer.
- Go to the scenes folder and create a .js file based on the template of demoSimplest.js
- Change the name and the content of the demo to whatever you like!
- Go to scenes.js and add the name of your demo and its path to the return value of `scenes`
- Note that `enableSceneReloading` is set to true so that you can hot-reload the changes in your demo.
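Registering a new demo might look like the following sketch; the exact shape of the return value of `scenes`, and the `demoMine` name and path, are assumptions for illustration, not the repo's actual code:

```javascript
// scenes.js (sketch): map demo names to their module paths.
// The "demoMine" entry is a hypothetical example of a newly added scene.
function scenes() {
  return {
    demoSimplest: "./scenes/demoSimplest.js",
    // Add your own demo here:
    demoMine: "./scenes/demoMine.js",
  };
}
```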
- Enable the experimental features in the browser (Oculus Browser 11)
- Visit chrome://flags/
- Enable WebXR experiences with joint tracking (#webxr-hands)
- Enable WebXR Layers depth sorting (#webxr-depth-sorting)
- Enable WebXR Layers (#webxr-layers)
- Enable phase sync support (#webxr-phase-sync)
- Enable "Auto Enable Hands or Controllers" (Quest Settings (Gear Icon) -> Device -> Hands and Controllers)
- Enter the VR experience
- To change the initial position of your avatar: go to js/util/inline-viewer-helper.js and change the values of `this.lookYaw`, `this.walkPosition`, and `this.lookPitch`. Notice that `this.viewerHeight` (the avatar's height) is set to 1.6 m by `inlineViewerHelper.setHeight(1.6)` in js/immersive-pre.js. You can change this if you like.
- To customize your own avatar: go to js/primitive/avatar.js. You can change the GLTF models used in the `Headset` and `Controller` classes. You can add additional arguments to the `initAvatar` function to specify the avatar's look, and pass those values from the `addPawn` function.
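As a sketch of that wiring, passing a look option through might resemble the following; the argument names, model paths, and function signatures here are hypothetical, not the repo's actual code:

```javascript
// js/primitive/avatar.js (sketch): a hypothetical extra "look" argument
// that selects which GLTF model the avatar's headset uses.
function initAvatar(id, look) {
  // Hypothetical model paths; the real repo keeps its models elsewhere.
  const models = {
    default: "./media/gltf/headset/headset.gltf",
    robot: "./media/gltf/robot/robot.gltf",
  };
  return { id, headsetModel: models[look] || models.default };
}

// addPawn forwards the look chosen for this player (hypothetical wiring).
function addPawn(id, look) {
  return initAvatar(id, look);
}
```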
- Get the URL you use to stream the camera feed (for example, `http://192.168.1.189/640x480.jpg`), and make sure the computer hosting the server is on the same LAN as the Arducam.
- Set up the environment by opening a terminal in the folder and running `pip install -r requirements.txt` (or `pip3 install -r requirements.txt`, depending on your version of pip)
- Replace the URL on line 19 of track.py (inside the `urlopen()` function) with your own URL from the Arducam.
- After starting the server, open a terminal in the same folder and run `python track.py` (or `python3 track.py`)
- By default, the information from the camera contains the positional data and contents of any QR code in the camera feed, plus the radius and positional data of the largest green object. To engineer your own CV method, look at the variable `frame` inside track.py; it holds the pixel data of your current camera feed. To communicate the new information to the server, wrap it into a JSON-formatted dictionary and send it with `requests.post('http://localhost:2024/your-own-function', json=your_data_dictionary)`. Then, under server/main.js, open up a new `app.route("/your-own-function").post(function (req, res) {...})`; make sure the route name matches the one you defined in the Python code, and read the information from `req.body`.
- If you're using a different computer for camera-feed fetching and processing, make sure to change the URL from http://localhost:2024/your-own-function to that of your own server (or the public server).
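The posting step above can be sketched as follows; the `/your-own-function` route and the payload fields are the placeholders from the steps above, not real endpoints in the repo:

```python
# track.py (sketch): wrap custom CV results and post them to the server.
import requests  # pip install requests


def build_payload(x, y, radius):
    """Bundle CV results into a JSON-serializable dictionary."""
    return {"x": x, "y": y, "radius": radius}


def send_detection(payload, server="http://localhost:2024"):
    # The Express route in server/main.js reads this dictionary from req.body.
    return requests.post(server + "/your-own-function", json=payload)
```

If the camera is processed on another machine, pass that machine's server URL as the `server` argument instead of the localhost default.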