This repository contains my final project for the Game Development bachelor's degree at the British Columbia Institute of Technology.
It was developed solely by me between January and October 2020.
The project is an application for streaming as a virtual avatar, with support for the following tracking hardware:
- SteamVR devices/HTC Vive Trackers
- Webcam
- Tobii EyeTracker
- Leap Motion
The project has two main parts:
- VirtualStream application
- VirtualStream SDK (for creating custom avatars, environments, and props)
Assets used:
- SteamVR Plugin
- FinalIK (Paid asset, not included in this repository)
- OpenCV for Unity (Paid asset, not included in this repository)
- Dlib Face Landmark Detector (Paid asset, not included in this repository)
- Tobii SDK
- Leap Motion SDK
- Unity Standalone File Browser
- Unity3DRuntimeTransformGizmo
Webcam tracking was done using OpenCV with a Faster RCNN Inception V2 model trained using the TensorFlow Object Detection API.
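As a rough sketch of how detections from such a model are consumed: when a TensorFlow Object Detection API model is run through OpenCV's `cv2.dnn` module, the output is a `[1, 1, N, 7]` tensor where each row is `[batch_id, class_id, score, x1, y1, x2, y2]` with normalized box coordinates. The function below is an illustrative reconstruction of that post-processing step, not the project's actual code; the threshold and names are assumptions, and a fake numpy array stands in for real network output so the example is self-contained.

```python
import numpy as np

def parse_detections(raw, frame_w, frame_h, score_threshold=0.5):
    """Filter a raw [1, 1, N, 7] DNN output tensor into pixel-space boxes.

    Each row is [batch_id, class_id, score, x1, y1, x2, y2], with box
    coordinates normalized to [0, 1] (the TF Object Detection API layout
    as exposed by OpenCV's dnn module).
    """
    boxes = []
    for _, class_id, score, x1, y1, x2, y2 in raw.reshape(-1, 7):
        if score < score_threshold:
            continue  # drop low-confidence detections
        boxes.append((int(class_id),
                      float(score),
                      (int(x1 * frame_w), int(y1 * frame_h),
                       int(x2 * frame_w), int(y2 * frame_h))))
    return boxes

# Simulated network output: one confident detection, one below threshold.
raw = np.array([[[[0, 1, 0.92, 0.25, 0.25, 0.75, 0.5],
                  [0, 1, 0.10, 0.00, 0.00, 0.10, 0.10]]]], dtype=np.float32)
print(parse_detections(raw, 640, 480))
```

With a real model the `raw` tensor would come from `net.forward()` after `cv2.dnn.readNetFromTensorflow(...)`; the filtered pixel boxes can then drive the avatar's face/hand tracking.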
The following were used for reference and development:
- TensorFlow Model Detection Zoo
- Face Detection with OpenCV and deep learning
- Facial landmarks with dlib, OpenCV, and Python
- Head Pose Estimation using OpenCV and Dlib
- Handy, hand detection with OpenCV
- Eye blink detection with OpenCV, Python, and dlib
- OpenCV rotation (Rodrigues) and translation vectors for position 3D object in Unity3D
- Real-time Hand-Detection using Neural Networks (SSD) on Tensorflow
- EgoHands: A Dataset for Hands in Complex Egocentric Interactions
- How to train a Tensorflow face object detection model
- WIDER FACE: A Face Detection Benchmark
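The pose-conversion step referenced above (taking OpenCV's Rodrigues rotation and translation vectors into Unity) can be sketched as follows. This is an illustrative reconstruction, not the project's code: OpenCV's camera frame is right-handed (x right, y down, z forward) while Unity's is left-handed (x right, y up, z forward), and flipping the y axis with `S = diag(1, -1, 1)` is one common convention for mapping between them. Rodrigues' formula is implemented in numpy here (a real project would call `cv2.Rodrigues`) so the sketch runs without OpenCV.

```python
import numpy as np

def rodrigues(rvec):
    """Axis-angle rotation vector -> 3x3 rotation matrix (Rodrigues' formula)."""
    rvec = np.asarray(rvec, dtype=float).reshape(3)
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)  # zero rotation
    k = rvec / theta  # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])  # cross-product matrix of k
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

# Assumed handedness flip: negate the y axis (OpenCV y-down -> Unity y-up).
S = np.diag([1.0, -1.0, 1.0])

def opencv_pose_to_unity(rvec, tvec):
    """Convert an OpenCV solvePnP pose into Unity's coordinate convention."""
    R = rodrigues(rvec)
    R_unity = S @ R @ S  # similarity transform flips the handedness
    t_unity = S @ np.asarray(tvec, dtype=float).reshape(3)
    return R_unity, t_unity

# A head 1.5 m in front of the camera, slightly right and below center,
# maps to the same position with y negated in Unity's frame.
R, t = opencv_pose_to_unity([0.0, 0.0, 0.0], [0.1, 0.2, 1.5])
print(t)
```

In Unity the resulting matrix would typically be turned into a `Quaternion` and assigned to the avatar head bone's transform; the y-flip convention shown here is an assumption and other mappings (e.g. flipping z instead) appear in practice depending on how the camera is oriented.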