Arabic Sign Language gesture detection using MediaPipe and OpenCV
This system extends Kazuhito00's Hand Gesture Recognition by adding support for:
- The complete 28-letter Arabic alphabet
- 3 functional gestures (Space, Delete, Clear)
- Real-time text output with Arabic script rendering
- 28 Arabic Letters: Custom-trained gesture models for all Arabic characters
- Utility Gestures:
  - Space: Insert space between words
  - Delete: Remove last character
  - Clear: Reset entire text
- Arabic Text Rendering: Proper RTL display with glyph shaping (see the rendering sketch below)
- Adjustable Sensitivity: Control detection speed via frame threshold
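OpenCV's built-in `cv2.putText` cannot shape or reorder Arabic script, so a common way to get proper RTL rendering is to reshape the text with `arabic_reshaper`, reorder it with `python-bidi`, and draw it with Pillow before converting back to an OpenCV frame. The sketch below shows that idea only; the helper name, overlay position, and font path are placeholders, not code from this repository.

```python
import cv2
import numpy as np
import arabic_reshaper                      # joins Arabic letters into connected glyph forms
from bidi.algorithm import get_display      # reorders the string for right-to-left display
from PIL import Image, ImageDraw, ImageFont


def draw_arabic_text(frame, text, position=(10, 30), font_size=32):
    """Render shaped, RTL-ordered Arabic text onto a BGR OpenCV frame.

    The font path below is a placeholder; use any TTF font on your system
    that contains Arabic glyphs.
    """
    reshaped = arabic_reshaper.reshape(text)   # contextual glyph shaping
    bidi_text = get_display(reshaped)          # visual (right-to-left) ordering

    # Draw with Pillow, since cv2.putText has no Arabic glyph support.
    pil_img = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    draw = ImageDraw.Draw(pil_img)
    font = ImageFont.truetype("fonts/NotoNaskhArabic-Regular.ttf", font_size)  # placeholder path
    draw.text(position, bidi_text, font=font, fill=(255, 255, 255))

    return cv2.cvtColor(np.array(pil_img), cv2.COLOR_RGB2BGR)
```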
- MediaPipe – Hand tracking
- OpenCV – Camera processing & visualization
- NumPy – Data handling
- Model: CNN – Static gesture classification
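As a rough illustration of how these pieces fit together, the sketch below opens the camera with OpenCV, runs MediaPipe Hands on each frame, and flattens the 21 hand landmarks into the kind of keypoint vector a gesture classifier consumes. The variable names and the classifier call are illustrative assumptions, not the actual code of this repository.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1,
                    min_detection_confidence=0.7,
                    min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break

        # MediaPipe expects RGB input; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

        if results.multi_hand_landmarks:
            landmarks = results.multi_hand_landmarks[0].landmark
            # 21 landmarks * (x, y) -> flat feature vector for the classifier.
            keypoints = [coord for lm in landmarks for coord in (lm.x, lm.y)]
            # prediction = keypoint_classifier(keypoints)  # hypothetical classifier call

        cv2.imshow("Arabic Sign Language", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break

cap.release()
cv2.destroyAllWindows()
```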
Example: to train the letter at index 23 in "keypoint_classifier_label.csv":
- Search the code for "number +" (e.g. Shift + F / Find in Files in your editor)
- Add 20 next to the "+" so the logged class ID becomes number + 20 (for target indices 0 to 9 add 0, for 10 to 19 add 10); see the sketch after these steps
- Run the app
- Press k to enter training mode
- Press 2 (for 20 + 2 = 22). Note: the CSV uses 0-based class IDs while the labels in the file are counted from 1, so label 23 maps to class ID 22
- Make the gesture 20+ times
- Close the app
- Open "keypoint_classification_EN.ipynb" in Jupyter Notebook and run all cells
- Training done
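For reference, this is roughly what the relevant logging code looks like in the upstream project (the logging_csv function in app.py), with the + 20 offset applied so that pressing 2 writes class ID 22. The exact function signature, CSV path, and file name are assumptions taken from the upstream repo; check your local copy before editing.

```python
import csv


def logging_csv(number, mode, landmark_list):
    """Append one training sample while in keypoint-logging mode (mode == 1).

    Based on the upstream logging_csv() in app.py; adjust to match your copy.
    The '+ 20' offset shifts the pressed key (0-9) into the 20-27 range so
    letters beyond index 9 can be logged. Use '+ 10' for target indices
    10-19 and no offset for 0-9.
    """
    if mode == 1 and (0 <= number <= 9):
        csv_path = 'model/keypoint_classifier/keypoint.csv'
        with open(csv_path, 'a', newline='') as f:
            writer = csv.writer(f)
            writer.writerow([number + 20, *landmark_list])  # pressing 2 logs class ID 22
```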
Kazuhito00's Hand Gesture Recognition Repo: https://github.com/kinivi/hand-gesture-recognition-mediapipe
