This repository is for constructing a robot that can detect and respond to sign language symbols. It implements a basic neural network to detect gestures from a camera, allowing the Raspberry Pi robot to move forwards and backwards based on the commanded action.

Gesture Detection Autonomous Robot

This repository was used for a class at Monash University (FIT3146 - Maker Lab) to develop a small rover that moves forwards and backwards in response to hand gestures. For example, a thumbs-up makes the rover move forwards, while a thumbs-down makes it move backwards. The rover also had a distance sensor and LEDs for indicating various actions. The rover code ran on a Raspberry Pi 3 and streamed a camera feed (via GStreamer) to a laptop, which performed the gesture detection using OpenCV. The laptop then responded to the Pi with the currently detected symbol, and the rover handled the rest of the electronic functionality.
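The laptop-to-rover reply described above can be sketched as a simple mapping from a detected gesture label to a movement command. The label and command strings below are assumptions for illustration, not the repository's actual identifiers:

```python
# Minimal sketch of the laptop-side reply logic: map a detected
# gesture label to the movement command sent back to the rover.
# Label and command names are hypothetical, not taken from the repo.

GESTURE_COMMANDS = {
    "thumbs_up": "FORWARD",     # rover drives forwards
    "thumbs_down": "BACKWARD",  # rover drives backwards
}


def gesture_to_command(label: str) -> str:
    """Return the command for a detected gesture; unknown gestures stop the rover."""
    return GESTURE_COMMANDS.get(label, "STOP")
```

In the real system this command would be sent over the network to the Pi, which drives the motors accordingly.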




This rover was built in four weeks and presented at the unit's final MakerLab expo. The following links show the expo video and the process video created alongside the robot.

Showcase Video

Process Video



Starting Up

Starting up the rover is simple. On the laptop, run the following two scripts:

. /src/cameras/gst_sink.sh
python3 /src/network/main.py opencv

Alternatively, to use a hard-coded terminal-based control system instead of the neural network, run the following line:

python3 /src/network/command.py
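A terminal-based fallback like command.py presumably reads typed input and sends fixed commands in place of the network's predictions. A minimal sketch, with hypothetical key bindings and command names:

```python
# Hypothetical sketch of a terminal control loop: translate typed
# keys into the same commands the gesture detector would produce.

KEY_COMMANDS = {"w": "FORWARD", "s": "BACKWARD", "q": "QUIT"}


def parse_key(key: str) -> str:
    """Map a single keypress to a rover command; unknown keys stop the rover."""
    return KEY_COMMANDS.get(key.lower(), "STOP")


def control_loop(send):
    """Read keys from stdin and forward commands until 'q' is pressed."""
    while True:
        cmd = parse_key(input("command (w/s/q): "))
        if cmd == "QUIT":
            break
        send(cmd)
```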

Similarly, to start the scripts on the rover, run the following two lines:

. /src/cameras/gst_source.sh
python3 /src/controller/main.py

However, on the Raspberry Pi these two scripts start automatically when the rover is turned on. One thing to check is the IP addresses: adjust them in the controller file, the network file, and the command file if needed.
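Since the addresses are hard-coded in several files, one way to keep them consistent is a shared settings module. The addresses and names below are placeholders, not values from the repository:

```python
# Hypothetical shared network settings; in the actual project the real
# values live in the controller, network, and command scripts and must
# match the addresses on your LAN.
LAPTOP_IP = "192.168.1.10"   # machine running the gesture detection
ROVER_IP = "192.168.1.11"    # Raspberry Pi on the rover
COMMAND_PORT = 5005          # port the rover listens on for commands


def rover_address() -> tuple:
    """Address the laptop sends detected-gesture commands to."""
    return (ROVER_IP, COMMAND_PORT)
```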
