Thank you for your interest in and support of the "Hot Key With Hands Recognition" project. After careful consideration, I have decided to mark this project as deprecated. As technology has advanced, more mature and feature-rich solutions are now available that better meet users' needs.
Please note that this project will no longer receive updates or maintenance. I encourage everyone to explore more advanced alternatives for a better experience and support. Thank you once again for your past support and understanding.
🌐 Supported Languages: English | 中文
This gesture-recognition-based software lets you operate and browse applications on your Windows computer without a keyboard or mouse, for example effortlessly swiping through TikTok. Whether you're enjoying a meal, have messy hands, or are sitting a bit far from your computer, you can rely on gesture recognition to control the app and enjoy touch-free navigation. The camera detects your hand gestures, and each recognized gesture triggers a corresponding keyboard shortcut.
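As a rough illustration of that pipeline, the sketch below wires together OpenCV, the MediaPipe Hands solution, and the keyboard library (both libraries are linked at the end of this README). The `classify_gesture` helper and the gesture-to-shortcut mapping are placeholders for illustration, not the model or key map this project actually ships.

```python
import cv2
import keyboard
import mediapipe as mp


def classify_gesture(hand_landmarks):
    """Placeholder: map the 21 detected hand landmarks to a gesture name."""
    # A real implementation would run the trained model from saved_ai_model here.
    return None


# Hypothetical gesture-name -> keyboard-shortcut mapping for illustration.
GESTURE_TO_SHORTCUT = {"swipe_up": "page up", "swipe_down": "page down"}

cap = cv2.VideoCapture(0)
with mp.solutions.hands.Hands(max_num_hands=1) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB images; OpenCV captures BGR frames.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            gesture = classify_gesture(results.multi_hand_landmarks[0])
            shortcut = GESTURE_TO_SHORTCUT.get(gesture)
            if shortcut:
                keyboard.send(shortcut)  # fire the mapped shortcut
        cv2.imshow("preview", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```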
You can download the pre-packaged `.exe` file from the releases page, or run the app from the source code. You can also train your own gesture recognition model on the training branch, or plug in a custom model by implementing the abstract interface.
- **Run using the pre-packaged executable file**: Go to releases, download `gesture2shortcuts.exe`, and click to run. Note: currently only the Windows version is packaged; the Linux and macOS versions need to be built from the source code.
- **Build from source code**:

  ```bash
  # download the source and install the environment
  git clone https://github.com/LiRunJi/Hot-Key-With-Hands-Recognition.git
  cd Hot-Key-With-Hands-Recognition
  pip3 install -r requirements.py
  # run the app
  python3 main.py
  ```
- **Gesture recognition models**: Commit changes to the `saved_ai_model` or `saved_ai_lite_model` directories.
- **Gesture training scripts**: Commit changes to the `training` branch.
- **Software architecture and abstraction**: Commit changes to the `abstract_layer.py` file (see the interface sketch after this list).
- **User interface**: Commit changes to the `assets/styles` directory.
- **Icons**: Commit changes to the `assets/gestures_icons` directory or any other directory.
- **Translations**: Commit changes to the `assets/translations.yml` file:

  ```yaml
  supported_languages:
    - en
    - zh-CN
    - zh-TW
    # add a new supported language here, such as - JP
  translations:
    Language:
      en: Language
      zh-CN: 语言
      zh-TW: 語言
      # then translate the key and add the new value, such as JP: 言語
    Start:
      en: Start
      zh-CN: 开始
      zh-TW: 開始
    Stop:
      en: Stop
      zh-CN: 停止
      zh-TW: 停止
  ```

- **Demo videos**: Commit changes to the `README_DEMO_VIDEOS.md` file.
- **Other content**: Commit changes to any other directory.
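For the architecture item above, the sketch below shows what implementing a custom recognizer against an abstract interface could look like. The class and method names (`GestureRecognizer`, `predict`) are illustrative assumptions, not the project's actual API; the real interface is defined in `abstract_layer.py`.

```python
from abc import ABC, abstractmethod
from typing import Optional, Sequence


class GestureRecognizer(ABC):
    """Assumed shape of the abstract interface: landmarks in, gesture label out."""

    @abstractmethod
    def predict(self, landmarks: Sequence[float]) -> Optional[str]:
        """Return a gesture name such as 'swipe_up', or None when unsure."""


class MyCustomRecognizer(GestureRecognizer):
    """Example of plugging in your own trained model."""

    def __init__(self, model_path: str):
        # e.g. a model file stored under saved_ai_lite_model/
        self.model_path = model_path

    def predict(self, landmarks):
        # Load and run your own model here, then map its output to a gesture name.
        return None
```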
- Fork this project on GitHub.
- Clone the forked repository locally using the following command:

  ```bash
  git clone https://github.com/LiRunJi/Hot-Key-With-Hands-Recognition.git
  ```

- Create a new branch and start developing on it. You can use the following command:

  ```bash
  git checkout -b your_feature_branch
  ```

- Test the changes locally and make sure the code quality is good. Ensure that your code meets the quality standards of the project, including the coding style, variable naming conventions, and comment conventions specified below.
- Commit the changes to the forked repository using the following commands:

  ```bash
  git add .
  git commit -m "Your commit message"
  git push origin your_feature_branch
  ```

- Create a pull request on GitHub. In the pull request, briefly describe the changes you made and provide relevant screenshots or examples.
- **Coding style**: We recommend following the PEP 8 coding style guide.
- **Variable and function naming**: Use standard, common English to name variables wherever possible, preferably in a style similar to the rest of this repository.
- **Commit messages**: We recommend the following format for commit messages:

  ```
  Title
  The purpose of the commit and the changes made
  Author: your name
  Date: 0000-00-00
  ```
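For example, a commit adding a new gesture icon might be described like this (the icon, author name, and date are purely illustrative):

```
Add thumbs-up gesture icon
Adds a new icon to assets/gestures_icons for the thumbs-up gesture
Author: Jane Doe
Date: 2024-01-15
```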
A future update, if one is ever released, may cover the following:
- Currently, recognized gestures can only trigger their corresponding keyboard shortcuts. Before this software was released, there was a draft that used gestures to control lighting remotely, but that draft had almost no software structure and many security issues. A further idea is to build an IoT system and integrate it with this one.
- Provide multiple sets of UI schemes as `.qss` files so you can choose an appropriate theme (see the sketch after this list).
- Provide more comprehensive layering and abstraction.
- Consider using C++ or a lighter model framework to reduce the packaged size.
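For the theming item above, the sketch below shows one way a `.qss` theme could be applied application-wide in a PyQt5 program. The theme file path (`assets/styles/dark.qss`) and the choice of PyQt5 are assumptions for illustration only.

```python
import sys

from PyQt5.QtWidgets import QApplication, QMainWindow

app = QApplication(sys.argv)
window = QMainWindow()

# Pick one of the theme files and apply it to the whole application.
# "assets/styles/dark.qss" is an assumed file name for illustration.
with open("assets/styles/dark.qss", "r", encoding="utf-8") as f:
    app.setStyleSheet(f.read())

window.show()
sys.exit(app.exec_())
```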
- Home | mediapipe (google.github.io)
- PyCharm: the Python IDE for Professional Developers by JetBrains
- GitHub - boppreh/keyboard: Hook and simulate global keyboard events on Windows and Linux.
- Qt-Material — Qt Material documentation