American Sign Language

Table of Contents

  • Introduction
  • Motivation
  • Installation
  • Directory Tree
  • Technologies Used
  • Result
  • Future Scope
  • About

Introduction

American Sign Language (ASL) is a complete, complex language that employs signs made with the hands and other movements, including facial expressions and postures of the body. It is the first language of many deaf North Americans, and one of several communication options available to deaf people. ASL is said to be the fourth most commonly used language in the United States.

Motivation

Sign language is based on the idea that sight is the most useful tool a deaf person has to communicate and receive information. Thus, American Sign Language uses hand shape, position, and movement; body movements; gestures; facial expressions; and other visual cues to form its words.

Installation

The code is written in Python 3.6.10. If you don't have Python installed, you can download it from python.org. If you are using a lower version of Python, upgrade it and make sure you have the latest version of pip. To install the required packages and libraries, run this command in the project directory after cloning the repository:

pip install -r requirements.txt

Directory Tree

├── images
│   ├── amer_sign2.png
│   ├── amer_sign3.png
│   ├── american_sign_language.PNG
│   ├── test1.jpeg
│   ├── test2.jpeg
├── input
│   ├── sign_mnist_test.csv
│   ├── sign_mnist_train.csv
├── model
│   ├── SVM_model
│   ├── forest_model
│   ├── kneighbors_model
│   ├── logistic_model
│   ├── naive_bayes_model
│   ├── tree_model
├── README.md
├── Sign Language MNIST.ipynb
├── Testing.ipynb
├── requirements.txt
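
The two CSVs under input follow the Sign Language MNIST layout: each row is expected to hold a numeric label followed by 784 pixel values for one 28x28 grayscale image. Below is a minimal loading sketch, assuming that column layout and the paths shown above (the column names are an assumption, not taken from the notebooks):

import pandas as pd

# Load the train/test splits from the paths in the directory tree above.
train = pd.read_csv("input/sign_mnist_train.csv")
test = pd.read_csv("input/sign_mnist_test.csv")

# Assumed layout: a 'label' column followed by 784 pixel columns (28x28 images).
X_train = train.drop(columns=["label"]).values / 255.0  # scale pixels to [0, 1]
y_train = train["label"].values
X_test = test.drop(columns=["label"]).values / 255.0
y_test = test["label"].values

print(X_train.shape, X_test.shape)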

Technologies Used

Result


   ML Model                  Train Accuracy   Test Accuracy
1  Support Vector Machine    1.000            0.854
2  K-Nearest Neighbors       0.999            0.815
3  Random Forest             1.000            0.805
4  Logistic Regression       0.999            0.685
5  Naive Bayes               0.797            0.630
6  Decision Tree             0.995            0.431
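
The actual training code behind these numbers lives in Sign Language MNIST.ipynb; the sketch below only illustrates how one row (the Support Vector Machine) could be reproduced with scikit-learn, reusing the arrays loaded above. The SVC settings are assumptions and may differ from the notebook's hyper-parameters.

from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Fit a support vector classifier on the flattened 784-pixel vectors.
svm = SVC()  # default RBF kernel; an assumption, not the notebook's exact settings
svm.fit(X_train, y_train)

# Report the same two columns as the table above.
train_acc = accuracy_score(y_train, svm.predict(X_train))
test_acc = accuracy_score(y_test, svm.predict(X_test))
print(f"Train accuracy: {train_acc:.3f}, Test accuracy: {test_acc:.3f}")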

For Training Sample

Actual Label : D

For Testing Sample

Actual Label : C
Predicted Label : C

Actual Label : A
Predicted Label : A
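
Testing.ipynb produces predictions like the ones above from the serialized models in the model directory. A minimal sketch, assuming the models were saved with pickle and that the labels index the alphabet directly (both assumptions; the saved files carry no extension):

import pickle
import string

# Assumed label mapping: 0-25 index A-Z, with J and Z unused (they require motion).
letters = string.ascii_uppercase

# Assumption: the files under model/ are pickled scikit-learn estimators.
with open("model/SVM_model", "rb") as f:
    model = pickle.load(f)

# Predict one test sample, reusing X_test/y_test from the loading sketch above.
# Note: the sample must be preprocessed the same way the model was trained.
sample = X_test[0].reshape(1, -1)
pred = model.predict(sample)[0]
print("Actual Label :", letters[y_test[0]])
print("Predicted Label :", letters[pred])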

Future Scope

  • Build a front-end interface
  • Try additional algorithms
  • Develop a complete product that helps speech- and hearing-impaired people, thereby reducing the communication gap

About

The original MNIST (Modified National Institute of Standards and Technology) image dataset of handwritten digits is a popular benchmark for image-based machine learning methods. The American Sign Language letter database of hand gestures represents a multi-class problem with 24 classes of letters (excluding J and Z, which require motion).
