Using machine learning to power intuitive human-computer interfaces.
Hand Gesture Interface requires Python 3 and a working webcam. Currently, only macOS is supported, but support for Windows and Linux is coming soon.
Install the requirements specified in `requirements.txt` (e.g. with `pip install -r requirements.txt`), then run `python main.py`.
Once the run script displays "Started inference loop", you can begin using hand gestures.
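For orientation, here is a minimal, hypothetical sketch of what an inference loop like this typically does: read frames from the webcam, classify each frame into a gesture, and trigger the matching desktop action. The `classify_gesture` and `perform_action` helpers are placeholders, not the project's actual functions.

```python
# Illustrative sketch only; not the project's actual code.
import cv2


def classify_gesture(frame):
    """Placeholder for the hand-gesture model; returns a gesture name or None."""
    return None


def perform_action(gesture):
    """Placeholder that maps a recognized gesture to a desktop action."""
    print(f"Recognized gesture: {gesture}")


def inference_loop():
    cap = cv2.VideoCapture(0)  # open the default webcam
    print("Started inference loop")
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gesture = classify_gesture(frame)
        if gesture is not None:
            perform_action(gesture)
    cap.release()


if __name__ == "__main__":
    inference_loop()
```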
So far, only 12 gestures are supported.
Four of these are whole-hand swipes: left or right (for switching desktops), and up or down. You can also swipe up, down, left, or right with one or two fingers to scroll.
Zooming in or out with your full hand zooms in or out on the application you're currently using.
Pushing your hand toward or away from the camera will either reveal the desktop or show your installed apps.
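To make the gesture-to-action mapping above concrete, here is a hedged, illustrative sketch of how a dispatcher (like the `perform_action` placeholder earlier) might bind gesture names to macOS shortcuts using `pyautogui`. The gesture names and the use of `pyautogui` are assumptions for illustration, not the project's actual implementation.

```python
# Hypothetical gesture-to-action mapping; names and bindings are illustrative.
import pyautogui

GESTURE_ACTIONS = {
    # Whole-hand swipes switch desktops (macOS Ctrl+Arrow shortcuts).
    "swipe_left": lambda: pyautogui.hotkey("ctrl", "left"),
    "swipe_right": lambda: pyautogui.hotkey("ctrl", "right"),
    # One- or two-finger swipes scroll the focused window.
    "scroll_up": lambda: pyautogui.scroll(5),
    "scroll_down": lambda: pyautogui.scroll(-5),
    # Full-hand zoom maps to Cmd+Plus / Cmd+Minus in the active app.
    "zoom_in": lambda: pyautogui.hotkey("command", "+"),
    "zoom_out": lambda: pyautogui.hotkey("command", "-"),
}


def dispatch(gesture: str) -> None:
    """Run the action bound to a recognized gesture, if any."""
    action = GESTURE_ACTIONS.get(gesture)
    if action:
        action()
```

Keeping the bindings in a single table like this makes it easy to remap gestures to different shortcuts without touching the recognition code.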
Contributions are welcome! Feel free to fork the repository and open a pull request with your changes.