A low-cost, intelligent prosthetic hand that replicates natural hand gestures using EMG signals and a lightweight machine learning model. This system enables users to open and close a prosthetic hand using muscle signals from their forearm, processed in real time.
This project combines:
- EMG signal acquisition
- Arduino-based motor control
- Raspberry Pi-based gesture prediction using a TFLite model
The final output is a responsive prosthetic hand capable of mimicking basic gestures like open and close using six servo motors.
```
EMG Sensor (V3 Electrodes)
        │
        ▼
Arduino (Reads EMG analog values)
        │
Serial Communication (USB)
        ▼
Raspberry Pi (Predicts gesture using ML)
        │
Gesture command (Open/Close)
        ▼
Arduino (Controls 6 Servo Motors)
        │
        ▼
Prosthetic Hand Movement
```
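The Arduino-to-Pi link above is plain text over USB serial. Assuming the Arduino prints one raw ADC reading (0–1023) per line — this exact line format is an assumption, not taken from the project code — a small, defensive parser on the Pi side might look like:

```python
from typing import Optional

def parse_emg_line(line: bytes) -> Optional[int]:
    """Parse one serial line from the Arduino into an ADC reading.

    Assumes the Arduino prints one integer reading (0-1023) per line,
    e.g. b"512\r\n". Returns None for malformed or out-of-range lines
    so the reader loop can simply skip them.
    """
    text = line.strip().decode("ascii", errors="ignore")
    if not text.isdigit():
        return None
    value = int(text)
    return value if 0 <= value <= 1023 else None
```

With pySerial, each `ser.readline()` result would be fed through this function, and `None` results dropped before the samples reach the model.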
- EMG Sensor (Myoware or V3 electrodes)
- Arduino Uno
- Raspberry Pi 5
- 6x Servo Motors
- 3D-Printed or mechanical prosthetic hand frame
- Jumper wires, breadboard
- Arduino IDE
- Python 3
- TensorFlow Lite
- NumPy
- tflite-runtime
- Serial communication (pySerial)
```
prosthetic-hand-emg-ml/
├── Arduino_Code/
│   └── prosthetic_hand.ino
├── RaspberryPi_Code/
│   ├── predict_gesture.py
│   └── model.tflite
├── images/
│   ├── circuit_diagram.png
│   ├── workflow_chart.png
│   └── prosthetic_hand_photo.jpg
├── report/
│   └── report_PDF.pdf
├── README.md
└── LICENSE
```
- Arduino reads EMG values from an analog pin and sends them to the Raspberry Pi.
- Raspberry Pi preprocesses the signal into a 50-length input vector.
- TensorFlow Lite model predicts the gesture ("open" or "close").
- Gesture command is sent back to Arduino over serial.
- Arduino controls servo motors to perform the hand movement.
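The preprocessing step above (step 2) can be sketched as a small helper. The exact normalization the project uses is an assumption here — raw 10-bit ADC values scaled to [0, 1] as float32 — but the output shape matches the model's expected (1, 50, 1) input:

```python
import numpy as np

def make_model_input(samples, window=50):
    """Build the (1, 50, 1) input tensor from recent EMG readings.

    `samples` is an iterable of raw ADC values (0-1023). Keeps the
    most recent `window` samples, scales them to [0, 1] (assumed
    normalization), and reshapes to the model's input shape.
    """
    window_vals = list(samples)[-window:]
    if len(window_vals) < window:
        raise ValueError(f"need at least {window} samples")
    x = np.asarray(window_vals, dtype=np.float32) / 1023.0
    return x.reshape(1, window, 1)
```

The resulting array can be passed straight to the TFLite interpreter's input tensor before invoking inference.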
- Model Type: Sequential 1D CNN
- Input Shape: (1, 50, 1) EMG time-series
- Output: Binary (0 = Close, 1 = Open)
- Framework: TensorFlow Lite (for edge deployment)
- Real-time prediction latency: < 1 second
- Prediction accuracy: ~98% on test samples
- Smooth servo actuation for open/close gestures
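One way to get smooth servo actuation from a per-window binary classifier is to debounce the predictions so a single noisy window cannot flip the hand. The majority-vote scheme and the history size of 5 below are illustrative assumptions, not taken from the project code:

```python
from collections import deque

class GestureSmoother:
    """Majority-vote over the last N predictions to avoid servo chatter.

    The model emits 0 (close) or 1 (open) per 50-sample window; the
    smoothed command only changes when most recent windows agree.
    Ties resolve to 0 (close) as the safer default.
    """

    def __init__(self, history=5):
        self.recent = deque(maxlen=history)

    def update(self, prediction: int) -> int:
        """Record one raw prediction and return the smoothed command."""
        self.recent.append(prediction)
        return int(sum(self.recent) * 2 > len(self.recent))
```

The smoothed 0/1 output is what would be sent back to the Arduino, so the servos only move on a sustained change of gesture.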
![Workflow Diagram](images/workflow_chart.png)

![Final Setup](images/prosthetic_hand_photo.jpg)