- Flex-sensor-based detection of signs, with a custom Android app (built in MIT App Inventor) for text and audio output
- Employs fingerspelling as the mode of communication; the entire device is operated with signs, including START and STOP
  a. 3 modes, each with 32 options to select
  b. Mode 1 - common phrases
  c. Mode 2 - alphabets
  d. Mode 3 - numbers and special characters
  e. START and STOP
- Custom dataset of ADC values collected from 100 individuals - 88,000 sets of values in total, 1000 per class
- ML algorithms embedded to increase recognition accuracy from 79% to 91%
- Report submitted for the final year project
- Image of signs
- Dataset analysis and application of ML models to the dataset (a training and evaluation sketch appears after this list)
- Code for quickly shortlisting which ML models to use, out of many candidates, with the Lazy Predict library (see the Lazy Predict sketch after this list)
- Data collection from the Arduino (ADC values); a serial-read sketch appears after this list
- Sending the output to the app
- Read data from the Arduino
- Flex sensor ADC value to decimal number converter (a conversion sketch appears after this list)
- Main loop (a sketch of a comparable read-convert-classify-send loop appears after this list)
- Dataset file for training and testing ML models
- Models file
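The repository's own scripts are not reproduced below; the following is a minimal sketch of how flex-sensor ADC values could be read from the Arduino over serial with pyserial. The port name `/dev/ttyUSB0`, the 9600 baud rate, and the comma-separated line format are assumptions, not details taken from the project code.

```python
# Minimal sketch: read comma-separated flex-sensor ADC values from the Arduino.
# Port name, baud rate, and line format are assumptions.
import serial

def read_adc_values(port="/dev/ttyUSB0", baud=9600):
    """Yield one list of integer ADC readings per line received from the Arduino."""
    with serial.Serial(port, baud, timeout=1) as ser:
        while True:
            line = ser.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            try:
                # e.g. "512,730,401,288,655" - one reading per flex sensor
                yield [int(v) for v in line.split(",")]
            except ValueError:
                continue  # skip malformed or partial lines

if __name__ == "__main__":
    for reading in read_adc_values():
        print(reading)
```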
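Given the 32 options per mode noted above, one plausible reading of the ADC-to-decimal converter is that each of five flex sensors is thresholded into a bit and the five bits are packed into a number from 0 to 31. The sensor count and the threshold value in this sketch are assumptions.

```python
# Hypothetical sketch: threshold each flex-sensor ADC reading into a bit and
# pack the bits into a single decimal number (0-31 for five sensors).
# The threshold of 600 and the five-sensor assumption are illustrative only.
BEND_THRESHOLD = 600

def adc_to_decimal(adc_values, threshold=BEND_THRESHOLD):
    """Convert a list of ADC readings into a decimal code, MSB = first sensor."""
    code = 0
    for value in adc_values:
        code = (code << 1) | (1 if value > threshold else 0)
    return code

# Example: sensors 1, 3, and 5 bent -> 0b10101 = 21
print(adc_to_decimal([750, 420, 810, 390, 705]))
```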
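Below is a minimal sketch of training and evaluating a classifier on the ADC dataset. The file name `dataset.csv`, the `label` column, and the choice of `RandomForestClassifier` are assumptions; the project's own scripts may use different file names and models.

```python
# Minimal sketch: train/test split and a single classifier on the ADC dataset.
# File name, column layout, and model choice are assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("dataset.csv")
X = df.drop(columns=["label"])  # five ADC columns assumed
y = df["label"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```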
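The Lazy Predict shortlisting step could look like the sketch below: `LazyClassifier` fits a wide range of scikit-learn models on the same split and returns a leaderboard of their scores. The dataset file name and column layout are the same assumptions as in the training sketch above.

```python
# Minimal sketch: shortlist candidate classifiers with the Lazy Predict library.
# Dataset file name and columns are assumptions.
import pandas as pd
from lazypredict.Supervised import LazyClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("dataset.csv")
X, y = df.drop(columns=["label"]), df["label"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

clf = LazyClassifier(verbose=0, ignore_warnings=True)
models, predictions = clf.fit(X_train, X_test, y_train, y_test)
print(models)  # one row per model: accuracy, balanced accuracy, F1, fit time
```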
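Finally, a host-side sketch of the read, convert, classify, and send flow in the main loop. The serial port names, the trained `model` argument, and the idea of forwarding the predicted text over a Bluetooth serial link (`/dev/rfcomm0`) to the app are assumptions; the project's actual loop may run on the Arduino itself.

```python
# Host-side sketch of the main loop: read ADC values, classify the sign, and
# forward the predicted text to the app. Port names and the Bluetooth serial
# link are assumptions.
import serial

def main_loop(model, arduino_port="/dev/ttyUSB0", app_port="/dev/rfcomm0", baud=9600):
    with serial.Serial(arduino_port, baud, timeout=1) as arduino, \
         serial.Serial(app_port, baud, timeout=1) as app_link:
        while True:
            line = arduino.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            try:
                adc_values = [int(v) for v in line.split(",")]
            except ValueError:
                continue
            label = model.predict([adc_values])[0]   # classify the sign
            app_link.write(f"{label}\n".encode())    # send text to the app
```

It would be called with the classifier trained in the sketch above, e.g. `main_loop(model)`.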