VirtualStream

This repository contains my final project for the Game Development bachelor's degree at the British Columbia Institute of Technology.
It was developed by me from January to October 2020.

The project is an application for streaming with an avatar, with support for the following tracking hardware:

  • SteamVR devices/HTC Vive Trackers
  • Webcam
  • Tobii EyeTracker
  • Leap Motion

The project had two main parts:

  • VirtualStream application
  • VirtualStream SDK (for custom avatars, environments, and props)

Assets used:

Webcam tracking was done using OpenCV with a Faster R-CNN Inception V2 model trained with the TensorFlow Object Detection API.
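
The sketch below shows how a frozen graph exported from the TensorFlow Object Detection API can be run on webcam frames through OpenCV's DNN module, which is one common way to wire this kind of model into a tracking loop. It is a minimal illustration, not code from this repository: the file names, input size, and confidence threshold are assumptions.

```python
# Minimal sketch: run a Faster R-CNN Inception V2 detection graph on webcam frames
# with OpenCV's DNN module. File names and thresholds are illustrative only.
import cv2

# Frozen graph plus the matching text graph description (hypothetical file names;
# the .pbtxt is typically generated with OpenCV's tf_text_graph_faster_rcnn.py helper).
net = cv2.dnn.readNetFromTensorflow("frozen_inference_graph.pb", "graph.pbtxt")

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]

    # swapRB=True converts OpenCV's BGR frame to RGB; the input size is illustrative.
    blob = cv2.dnn.blobFromImage(frame, size=(300, 300), swapRB=True, crop=False)
    net.setInput(blob)
    detections = net.forward()  # shape: [1, 1, N, 7]

    for det in detections[0, 0]:
        _, class_id, confidence, x1, y1, x2, y2 = det
        if confidence < 0.6:  # illustrative threshold
            continue
        # Detection coordinates are normalized; scale them back to pixel space.
        p1 = (int(x1 * w), int(y1 * h))
        p2 = (int(x2 * w), int(y2 * h))
        cv2.rectangle(frame, p1, p2, (0, 255, 0), 2)
        cv2.putText(frame, f"{confidence:.2f}", p1,
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)

    cv2.imshow("Webcam Tracking (debug)", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```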

Screenshots: Main Menu · Webcam Tracking with Debug Info · VirtualStream SDK Avatar Exporter

References

The following were used for reference and development:

  1. TensorFlow Detection Model Zoo
  2. Face Detection with OpenCV and deep learning
  3. Facial landmarks with dlib, OpenCV, and Python
  4. Head Pose Estimation using OpenCV and Dlib
  5. Handy, hand detection with OpenCV
  6. Eye blink detection with OpenCV, Python, and dlib
  7. OpenCV rotation (Rodrigues) and translation vectors for positioning a 3D object in Unity3D
  8. Real-time Hand-Detection using Neural Networks (SSD) on Tensorflow
  9. EgoHands: A Dataset for Hands in Complex Egocentric Interactions
  10. How to train a Tensorflow face object detection model
  11. WIDER FACE: A Face Detection Benchmark
