# 🤖 Smart Prosthetic Hand Controlled by EMG Signals and Machine Learning

A low-cost, intelligent prosthetic hand that replicates natural hand gestures using EMG signals and a lightweight machine learning model. This system enables users to open and close a prosthetic hand using muscle signals from their forearm, processed in real time.


## 📌 Project Overview

This project combines:

- EMG signal acquisition
- Arduino-based motor control
- Raspberry Pi-based gesture prediction using a TFLite model

The final output is a responsive prosthetic hand capable of mimicking basic gestures like open and close using six servo motors.


## 🧠 System Architecture

```
EMG Sensor (V3 electrodes)
        │
        ▼
Arduino (reads EMG analog values)
        │
        ▼
Serial communication (USB)
        │
        ▼
Raspberry Pi (predicts gesture using ML)
        │
        ▼
Gesture command (open/close)
        │
        ▼
Arduino (controls 6 servo motors)
        │
        ▼
Prosthetic hand movement
```
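Both serial hops in this chain carry simple text. The exact framing is not fixed by the repo, so the helpers below assume one hypothetical convention: the Arduino streams one raw 10-bit ADC reading per line (e.g. `512\n`), and the Pi replies with a single command byte, `O` for open and `C` for close.

```python
def parse_emg_line(line):
    """Parse one newline-terminated ADC reading sent by the Arduino.

    Returns the integer value (0-1023 on a 10-bit ADC), or None when the
    line is empty or corrupted (partial reads are common right after the
    port is opened).
    """
    text = line.strip()
    if not text.isdigit():
        return None
    value = int(text)
    return value if 0 <= value <= 1023 else None


def command_byte(prediction):
    """Map the model's binary output (0 = close, 1 = open) to the
    single command byte the Arduino sketch is assumed to listen for."""
    return b"O" if prediction == 1 else b"C"
```

Validating each line before use matters here because a half-transmitted reading (say, `5` from a truncated `512`) would otherwise silently corrupt the EMG window fed to the model.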

## 🧰 Hardware Used

- EMG sensor (MyoWare or V3 electrodes)
- Arduino Uno
- Raspberry Pi 5
- 6x servo motors
- 3D-printed or mechanical prosthetic hand frame
- Jumper wires, breadboard

πŸ› οΈ Software & Libraries

- Arduino IDE
- Python 3
- TensorFlow Lite
- NumPy
- tflite-runtime
- pySerial (serial communication)

## 🧾 File Structure

```
prosthetic-hand-emg-ml/
├── Arduino_Code/
│   └── prosthetic_hand.ino
├── RaspberryPi_Code/
│   ├── predict_gesture.py
│   └── model.tflite
├── images/
│   ├── circuit_diagram.png
│   ├── workflow_chart.png
│   └── prosthetic_hand_photo.jpg
├── report/
│   └── report_PDF.pdf
├── README.md
└── LICENSE
```

## 🚀 How It Works

1. The Arduino reads EMG values from an analog pin and streams them to the Raspberry Pi over USB serial.
2. The Raspberry Pi preprocesses the signal into a 50-sample input vector.
3. The TensorFlow Lite model predicts the gesture ("open" or "close").
4. The gesture command is sent back to the Arduino over serial.
5. The Arduino drives the servo motors to perform the hand movement.
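The steps above can be sketched as a single Raspberry Pi loop. This is a minimal sketch, not the repo's actual `predict_gesture.py`: the port name, baud rate, buffering strategy, and `O`/`C` command bytes are all assumptions.

```python
import numpy as np

WINDOW = 50  # the model expects a 50-sample EMG window


def make_input(samples):
    """Shape the most recent WINDOW readings into the model's (1, 50, 1) input."""
    x = np.asarray(samples[-WINDOW:], dtype=np.float32)
    return x.reshape(1, WINDOW, 1)


def run_loop(port="/dev/ttyACM0", model_path="model.tflite"):
    """Hardware-facing loop: read EMG over serial, predict, reply with a command.

    Requires pySerial and tflite-runtime, so those imports live here.
    """
    import serial
    from tflite_runtime.interpreter import Interpreter

    ser = serial.Serial(port, 115200, timeout=1)
    interp = Interpreter(model_path=model_path)
    interp.allocate_tensors()
    inp = interp.get_input_details()[0]["index"]
    out = interp.get_output_details()[0]["index"]

    buf = []
    while True:
        line = ser.readline().strip()
        if not line.isdigit():
            continue  # skip empty or partial reads
        buf.append(int(line))
        if len(buf) < WINDOW:
            continue  # keep collecting until a full window is buffered
        interp.set_tensor(inp, make_input(buf))
        interp.invoke()
        prob = float(interp.get_tensor(out)[0][0])
        ser.write(b"O" if prob >= 0.5 else b"C")  # 1 = open, 0 = close
        buf.clear()
```

Clearing the buffer after each prediction gives non-overlapping windows; a sliding window (dropping only the oldest sample) would trade more compute for lower reaction latency.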

## 🤖 Machine Learning Model

- Model type: Sequential 1D CNN
- Input shape: (1, 50, 1) EMG time series
- Output: binary (0 = close, 1 = open)
- Framework: TensorFlow Lite (for edge deployment)
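Raw ADC readings (0-1023) are usually conditioned before they reach a CNN input of shape (1, 50, 1). The repo does not document its preprocessing, so the scaling and threshold below are assumptions to illustrate the shape and label contract, not the project's actual pipeline.

```python
import numpy as np


def preprocess(window, adc_max=1023.0):
    """Scale a raw 0-1023 ADC window to [0, 1] floats shaped (1, len, 1).

    The scaling scheme is an assumption; mirror whatever
    predict_gesture.py actually does before deploying.
    """
    x = np.asarray(window, dtype=np.float32) / adc_max
    return x.reshape(1, len(window), 1)


def to_label(prob, threshold=0.5):
    """Map the model's sigmoid output to the documented classes
    (0 = close, 1 = open)."""
    return "open" if prob >= threshold else "close"
```

A higher threshold biases the hand toward staying closed, which can be the safer default when spurious muscle activity would otherwise trigger an unwanted open.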

## 🎯 Results

- Real-time prediction latency: under 1 second
- Prediction accuracy: ~98% on test samples
- Smooth servo actuation for open/close gestures

## 📸 Images

![Workflow Diagram](images/workflow_chart.png)

![Final Setup](images/prosthetic_hand_photo.jpg)


## 🧪 Demo


## 👥 Authors

Amrutha V, Snigdhatanu Acharya
