Demo video: HoloCapture2.mov
Our motivation for this project is rooted in the potential of mixed reality to benefit productivity, entertainment, connectivity, and beyond. Mixed reality remains largely untapped, and our team wants to help push this technology forward. This work gives us hands-on experience with collaborative mixed reality scenarios and, in turn, provides the research community with valuable datasets.
- Enable a collaborative MR environment with two HoloLens devices
- Mimic typical collaborative scenarios for mixed reality in a local environment
- Design experiments over the collaborative mixed reality application to best collect a robust dataset
- Research and implement system setup for collaborative MR
- Identify the type of application for experiments
- Design experiments for two subjects wearing headsets and interacting with a common scene
- Collect and organize the data
- Windows Laptop with Universal Windows Platform support
- Windows 10 [OS build 19045.3693]
- Setup of Unity Editor on a Windows laptop
- Unity Editor [2020.3.48f1]. Ensure the following packages are installed:
  - Azure Spatial Anchors SDK Core [2.12.0]
  - Azure Spatial Anchors SDK for Windows [2.12.0]
  - Mixed Reality OpenXR Plugin [1.8.1]
  - Mixed Reality Toolkit Examples [2.7.0]
  - Mixed Reality Toolkit Extensions [2.7.0]
  - Mixed Reality Toolkit Foundation [2.7.0]
  - Mixed Reality Toolkit Standard Assets [2.7.0]
  - Mixed Reality Toolkit Tools [2.7.0]
- Visual Studio with Required Components
- Visual Studio [16.11.31]
- ASP.NET Web Development Tools Version [4.7.2]
- Universal Windows Platform Development Component
- Desktop Development with C++ Component
- Game Development with Unity Toolset
- Setup of HoloLens with Developer Mode for accessing the Windows Device Portal
- Setup of HoloLens with Research Mode for collecting sensor output
While building the application, incorrect setup of the MRTK profiles, the OpenXR profiles, and the project settings was the main cause of non-functional features. Developers should be aware of the following changes. To find these settings in the Unity Editor, go to File → Build Settings → Player Settings.
- OpenXR Settings and Profiles:
- The following Feature Groups should be enabled in XR Plug-in Management → OpenXR:
- Hand Tracking: Enables the tracking of hand movements and gestures, allowing for natural interaction within virtual environments.
- Hand Interaction Poses: Provides predefined hand poses for common interactions, simplifying the development of hand-based controls.
- Motion Controller Model: Integrates physical controller models into the virtual environment, enhancing the realism and interactivity of user inputs.
- Ensure that “Runtime Debugger” is not selected on the Features page. In our builds, this debugger caused memory access exceptions at runtime and crashed the application.
- The following Interaction Profiles should be added:
- Eye Gaze Interaction Profile: Facilitates eye tracking for interaction, allowing applications to respond to where the user is looking.
- Hand Interaction Profile: Defines standard interactions for hand tracking, ensuring consistent and intuitive user experiences.
- Microsoft Hand Interaction Profile: A specialized profile tailored for Microsoft devices, optimizing hand tracking and interactions in the context of Microsoft's ecosystem and hardware.
- Project Settings:
- The following Capabilities should be enabled in Player → Publishing Settings:
- InternetClient: Allows the app to access the internet
- InternetClientServer: Enables network communication capabilities
- PrivateNetworkClientServer: Permits communication on private networks
- WebCam: Needed for accessing the device's webcam, crucial for AR experiences
- Microphone: Enables voice input and audio recording functionalities
- Spatial Perception: Allows the app to understand and interact with the physical space around the user
- GazeInput: Enables eye tracking, allowing users to interact with the app using their gaze
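The capabilities checked in Unity are written into the `Package.appxmanifest` of the Visual Studio solution that Unity generates. As a sanity check after building, a fragment like the following should appear there (shown with the standard UWP element forms; exact namespace prefixes depend on the generated project):

```xml
<!-- Illustrative Package.appxmanifest fragment, not copied from our project -->
<Capabilities>
  <Capability Name="internetClient" />
  <Capability Name="internetClientServer" />
  <Capability Name="privateNetworkClientServer" />
  <uap2:Capability Name="spatialPerception" />
  <DeviceCapability Name="webcam" />
  <DeviceCapability Name="microphone" />
  <DeviceCapability Name="gazeInput" />
</Capabilities>
```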
- MRTK Profiles:
- Our application uses an imported MixedRealityToolkit configuration profile with some further changes. The profile, “DefaultMixedRealityToolkitConfigurationProfile,” comes from the Mixed Reality Toolkit Foundation package. Adapt it to our use case as follows:
  - Input → Pointers → Is Eye Tracking Enabled (make sure this box is checked)
  - Spatial Awareness → OpenXR Spatial Mesh Observer → Display Settings → Display Option → set to Occlusion (this prevents the spatial mesh from being constantly displayed)
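A quick way to confirm that the eye-tracking profile change took effect is to query the MRTK gaze provider at runtime. This is a minimal sketch (the class name is ours, not part of the project); it assumes MRTK 2.7:

```csharp
// Sketch: log whether eye tracking is enabled in the active MRTK profile.
// Attach to any GameObject in the scene for a one-time check at startup.
using Microsoft.MixedReality.Toolkit;
using UnityEngine;

public class EyeTrackingCheckSketch : MonoBehaviour
{
    void Start()
    {
        var gaze = CoreServices.InputSystem?.EyeGazeProvider;
        Debug.Log(gaze != null && gaze.IsEyeTrackingEnabled
            ? "Eye tracking is enabled in the active profile"
            : "Eye tracking is NOT enabled - check Input -> Pointers");
    }
}
```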
These settings and features enable and include the necessary systems in the application build. Developers must follow the correct build and deployment steps to ensure compatibility with the HoloLens 2 platform.
- Unity Editor: File → Build Settings
- Visual Studio:
  - Open the project .sln file
  - Right-click the solution in Solution Explorer → Properties
  - Begin deployment to the HoloLens with the green arrow
- PhotonRoomWordPuzzle.cs:
- Script built upon the PhotonRoom.cs file included from the MRTK.Tutorials.MultiUserCapabilities asset.
- Receives a configurable number of prefab elements for instantiation (15 in our application)
- Receives a matching set of anchor locations as object spawn points
- Facilitates creation of shared virtual environment and objects
- Handles networking events
- Attached to the NetworkRoom gameObject
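The core spawning pattern can be sketched as follows. This is not the actual `PhotonRoomWordPuzzle.cs`; the class and field names are illustrative, and it assumes PUN 2's standard callbacks:

```csharp
// Sketch: spawn shared objects at anchor locations once a room is joined.
// PUN replicates PhotonNetwork.Instantiate calls to every client in the room.
using Photon.Pun;
using UnityEngine;

public class SharedSpawnerSketch : MonoBehaviourPunCallbacks
{
    public GameObject[] prefabs;      // assigned in the Inspector (15 in our app)
    public Transform[] anchorPoints;  // empty GameObjects marking spawn locations

    public override void OnJoinedRoom()
    {
        // Only the master client spawns, so each object exists exactly once.
        if (!PhotonNetwork.IsMasterClient) return;

        for (int i = 0; i < prefabs.Length && i < anchorPoints.Length; i++)
        {
            // Prefabs must live in a Resources folder for PUN to find them by name.
            PhotonNetwork.Instantiate(prefabs[i].name,
                                      anchorPoints[i].position,
                                      anchorPoints[i].rotation);
        }
    }
}
```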
- CollectData.cs
- Script for logging user interaction and object positions to a .txt file
- Receives a print delay (in seconds)
- Ensures input systems are configured for data collection
- Timed data: collected every 100ms
- Poster positions: recorded once the objects are initialized (initialization takes ~2 seconds)
- Head position: the global position of the user's head in space (provides data instantly)
- Hand joint data: position and rotation of each joint in the hand
- Provides data instantly (provided the hands are visible)
- Eye gaze origin and direction: returns the hit object if the eye gaze intersects a gameObject (takes ~10 seconds to initialize)
- Event data: collected for touch events (max 1 every 100ms)
- Poster touchpoints: global position of index finger when it touches a poster (max 1 event per 100 ms)
- Provides data instantly
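The timed sampling loop can be sketched with the MRTK 2.7 APIs used for these signals. This is not the actual `CollectData.cs`; the class name, log path, and line format are illustrative:

```csharp
// Sketch: sample head pose, one hand joint, and eye gaze on a fixed
// interval and append each sample to a .txt file.
using System.IO;
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

public class DataLoggerSketch : MonoBehaviour
{
    public float printDelay = 0.1f;  // seconds between samples (100 ms)
    private StreamWriter writer;
    private float nextSample;

    void Start()
    {
        writer = new StreamWriter(
            Path.Combine(Application.persistentDataPath, "log.txt"));
    }

    void Update()
    {
        if (Time.time < nextSample) return;
        nextSample = Time.time + printDelay;

        // Head: on HoloLens the main camera tracks the user's head.
        writer.WriteLine($"{Time.time:F3} head {CameraCache.Main.transform.position}");

        // One hand joint (right index tip), only while the hand is tracked.
        if (HandJointUtils.TryGetJointPose(TrackedHandJoint.IndexTip,
                Handedness.Right, out MixedRealityPose pose))
            writer.WriteLine($"{Time.time:F3} indexTip {pose.Position} {pose.Rotation}");

        // Eye gaze origin/direction plus the object currently gazed at, if any.
        var gaze = CoreServices.InputSystem?.EyeGazeProvider;
        if (gaze != null && gaze.IsEyeTrackingEnabledAndValid)
            writer.WriteLine($"{Time.time:F3} gaze {gaze.GazeOrigin} " +
                             $"{gaze.GazeDirection} {gaze.GazeTarget?.name}");
    }

    void OnDestroy() => writer?.Close();
}
```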
This section describes the main project hierarchy for the HoloLens Collaborative Mixed Reality application.
The network lobby object is used for managing the lobby space in the networked application.
This object is used for setting up networked rooms within the application.
The shared playground object is a key component for collaborative interactions.
This object acts as the parent for anchor points in the project.
The table object serves as an anchor point for a poster in the application.
The poster is instantiated at the set anchor locations. The attached scripts provide networking and interactability.
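The poster touchpoints described above rely on MRTK's near-interaction touch events. A minimal sketch of such a handler (not the project's actual script; it assumes the poster also carries a `NearInteractionTouchable` component):

```csharp
// Sketch: record the global fingertip position when a poster is touched.
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class PosterTouchSketch : MonoBehaviour, IMixedRealityTouchHandler
{
    public void OnTouchStarted(HandTrackingInputEventData eventData)
    {
        // InputData is the global position of the touching finger.
        Debug.Log($"Poster touched at {eventData.InputData}");
    }

    public void OnTouchUpdated(HandTrackingInputEventData eventData) { }
    public void OnTouchCompleted(HandTrackingInputEventData eventData) { }
}
```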
We provided two custom scripts. All other scripts in the images can be obtained through the listed packages.
- John Dale: Setup, Unity development, Research, Software
- Dani Kasti: Experiment design, Research, Writing
- Shared Experiences in Mixed Reality
- Research Mode in Mixed Reality
- Mixed Reality Toolkit (MRTK)
- YouTube Video - Development Tutorial
- YouTube Video - Unity Tutorial
- MixedRealityToolkit-Utilities
- MixedRealityToolkit-TrackedHandJoint
- MixedRealityToolkit-Pose
- MixedRealityToolkit-eyeGaze
- Unity3D
- StreamWriter
- Hololens2-Tutorial
- MRTK3
- Photon-Realtime
- Photon-Pun
- Visual Studio with HoloLens 2
- Windows Device Portal