Autism Spectrum Disorder Companion App
Explore the docs »
View Demo
·
Report Bug
·
Request Feature
- About the Project
- Getting Started
- Artificial Intelligence Integration
- Features and Functionality
- Concept Process
- Development Process
- Final Outcome
- Roadmap
- Contributing
- License
- Contact
- Acknowledgements
According to an article by Mental Health, autism spectrum disorder is “an umbrella term that covers everyone with conditions within the spectrum of autism”. The umbrella is a metaphor for shielding autistic people from overstimulation. Explain Life is a tool/companion mobile application that people with ASD can use to communicate, learn, and express themselves.
The name “Explain Life” was chosen for the duality of its meaning. In one sense the app explains life, such as social context, to autistic people; in the other, it explains to non-autistic people what autistic people understand. It emphasises how autistic people are viewed and how this mobile application can help its users (people with ASD) learn and improve conversational skills, emotion expression, emotion detection, and more in their daily lives. Hence it acts as a sort of companion to the user.
Explain Life is an interactive, AI-driven iOS mobile application that serves as a companion for people with autism spectrum disorder.
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.
Requires iOS 13 and Xcode 11
- In Xcode, go to File -> Swift Packages -> Add Package Dependency and paste in the repo's URL: https://github.com/dylandasilva1999/explain-life-ios-app
- Open the project in Xcode 12 or later.
- Install CocoaPods, then from the project directory run:
cd your/directory
pod install
Speech-to-text
technology converts spoken words into digital text on a screen. Speech-to-text is used in Explain Life to record a sentence during a conversation and display the captured text in the app for analysis.
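Below is a rough sketch of how such a recording-to-text flow can be built with Apple's Speech framework (SFSpeechRecognizer); the class and property names are illustrative and not taken from the Explain Life codebase.

```swift
import Speech
import AVFoundation
import Combine

// Minimal speech-to-text sketch (assumes Apple's Speech framework;
// names here are illustrative, not the app's actual code).
final class SpeechRecorder: ObservableObject {
    @Published var transcript = ""

    private let recognizer = SFSpeechRecognizer()
    private let audioEngine = AVAudioEngine()
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    func startRecording() throws {
        // Assumes microphone and speech-recognition permission have already
        // been granted (SFSpeechRecognizer.requestAuthorization).
        let request = SFSpeechAudioBufferRecognitionRequest()
        self.request = request

        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }

        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer?.recognitionTask(with: request) { [weak self] result, _ in
            if let result = result {
                // Publish the best transcription so the UI can display it.
                self?.transcript = result.bestTranscription.formattedString
            }
        }
    }

    func stopRecording() {
        audioEngine.stop()
        audioEngine.inputNode.removeTap(onBus: 0)
        request?.endAudio()
        task?.cancel()
    }
}
```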
Text-to-speech
is a type of assistive technology that reads digital text aloud. Text-to-speech is used in Explain Life when a user types how they feel, as well as for emotion expression when tapping an emotion card.
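A minimal sketch of this kind of read-aloud behaviour, assuming AVFoundation's AVSpeechSynthesizer (the README does not name the synthesiser the app uses):

```swift
import AVFoundation

// Reads the given text aloud; the voice language and rate are
// illustrative defaults, not values from the project.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}
```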
The IBM Watson® Tone Analyzer
uses linguistic analysis to detect emotional and language tones in written text. Explain Life makes an API call to IBM with the text captured through speech-to-text, the service analyses that text, and the app then displays the emotion and tone conveyed in the sentence.
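A hedged sketch of that call using the Swift SDK from the IBMWatsonToneAnalyzerV3 pod is shown below; the API key and version date are placeholders rather than values from the project.

```swift
import ToneAnalyzer // module provided by the IBMWatsonToneAnalyzerV3 pod

// Hedged sketch of a Tone Analyzer request; credentials are placeholders.
final class ToneService {
    private let toneAnalyzer: ToneAnalyzer

    init(apiKey: String) {
        let authenticator = WatsonIAMAuthenticator(apiKey: apiKey)
        toneAnalyzer = ToneAnalyzer(version: "2017-09-21", authenticator: authenticator)
        // toneAnalyzer.serviceURL can be set here if the service instance
        // is not in the default region.
    }

    func analyse(_ sentence: String, completion: @escaping (String) -> Void) {
        toneAnalyzer.tone(toneContent: .text(sentence)) { response, error in
            guard let analysis = response?.result else {
                completion(error?.localizedDescription ?? "Tone analysis failed")
                return
            }
            // Report the highest-scoring document-level tone (e.g. "Joy").
            let top = analysis.documentTone.tones?.max { $0.score < $1.score }
            completion(top?.toneName ?? "No clear tone detected")
        }
    }
}
```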
Using the text-to-speech
AI, the user can type in the textbox what they want Explain Life to say out loud, expressing their emotions in difficult social interactions. The speak
feature assists the user with communication and minimises anxiety and stress in social interactions.
Using the speech-to-text
AI, the user can record a sentence while in a social conversation/interaction. Explain Life then makes an API call to the IBM Tone Analyzer
service to analyse the text that was captured through speech and report which emotion/tone was conveyed in the sentence. The record & IBM tone analysis
feature is used in Explain Life to help autism spectrum disorder users better understand the emotions that other people are conveying in their sentences.
Using the text-to-speech
AI, the user can tap any of the emotion cards, and Explain Life then reads out loud how the user is feeling.
Secure log in and registration with email, password, and full name, with forgot-password functionality.
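A hedged sketch of this register, sign-in, and forgot-password flow with Firebase Authentication and Firestore is below; the "users" collection, field names, and function names are assumptions, not the app's actual code. It assumes FirebaseApp.configure() has been called at launch.

```swift
import FirebaseAuth
import FirebaseFirestore

// Register a new account and store the profile (email and full name).
func register(email: String, password: String, fullName: String) {
    Auth.auth().createUser(withEmail: email, password: password) { result, error in
        guard let user = result?.user, error == nil else { return }
        Firestore.firestore().collection("users").document(user.uid).setData([
            "email": email,
            "fullName": fullName
        ])
    }
}

// Sign in with email and password.
func signIn(email: String, password: String) {
    Auth.auth().signIn(withEmail: email, password: password) { _, error in
        if let error = error { print("Sign in failed: \(error.localizedDescription)") }
    }
}

// Forgot-password flow backed by Firebase Authentication.
func sendPasswordReset(email: String) {
    Auth.auth().sendPasswordReset(withEmail: email) { error in
        if let error = error { print("Password reset failed: \(error.localizedDescription)") }
    }
}
```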
A full onboarding on first launch, which explains the core features of Explain Life.
Edit your username and the authentication email you use to sign in to Explain Life.
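A minimal sketch of what such a profile update could look like with Firebase, assuming the same hypothetical "users" collection and field names as the sign-in sketch above:

```swift
import FirebaseAuth
import FirebaseFirestore

// Update the sign-in email and the stored full name (names are assumptions).
func updateProfile(newEmail: String, newFullName: String) {
    guard let user = Auth.auth().currentUser else { return }
    user.updateEmail(to: newEmail) { error in
        if let error = error {
            print("Email update failed: \(error.localizedDescription)")
            return
        }
        Firestore.firestore().collection("users").document(user.uid).updateData([
            "email": newEmail,
            "fullName": newFullName
        ])
    }
}
```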
In the settings, there are links to donate to Autism South Africa, as well as a link to get in contact with Autism South Africa. The user can also reset settings, which makes the onboarding show again and signs the user out.
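A minimal sketch of such a reset, assuming the onboarding flag lives in UserDefaults and the session is a Firebase Auth session (the key name is hypothetical):

```swift
import Foundation
import FirebaseAuth

// Hypothetical "reset settings" action: show onboarding again and sign out.
func resetSettings() {
    UserDefaults.standard.set(false, forKey: "hasCompletedOnboarding") // key name assumed
    try? Auth.auth().signOut()
}
```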
- Firestore Database for storing the user info, which includes the email and full name.
- Firebase Authentication for secure email & password log in (with forgot-password functionality).
- CocoaPods for dependency management and adding additional frameworks and SDKs.
- ScrollView, HStack, VStack, and ZStack were used for creating layouts.
- IBMWatsonToneAnalyzerV3 was the pod used to make use of the IBM Tone Analyzer.
- SwiftyJSON was the pod/library that helps read and process JSON data from an API/server.
- Alamofire was the pod used as an elegant and composable way to interface with HTTP network requests.
- @EnvironmentObject, @State, @StateObject, and @ObservedObject for linking state across views and files (see the sketch below).
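To illustrate how these stacks and property wrappers typically fit together, here is a small hypothetical SwiftUI view; it is an example in the style described above, not actual Explain Life code.

```swift
import SwiftUI
import Combine

// Hypothetical observable model shared with the view via @StateObject.
final class EmotionStore: ObservableObject {
    @Published var selectedEmotion: String?
}

// Hypothetical view combining ScrollView, VStack, and property wrappers.
struct EmotionCardsView: View {
    @StateObject private var store = EmotionStore()

    private let emotions = ["Happy", "Sad", "Angry", "Calm"]

    var body: some View {
        ScrollView {
            VStack(spacing: 16) {
                ForEach(emotions, id: \.self) { emotion in
                    Button(emotion) {
                        store.selectedEmotion = emotion
                    }
                }
                if let selected = store.selectedEmotion {
                    Text("Feeling \(selected)")
                }
            }
            .padding()
        }
    }
}
```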
The Conceptual Process
is the set of actions, activities and research that was done when starting this project.
The Development Process
is the technical implementation and functionality done in the backend of the application.
MVC
design architecture was used for structuring Explain Life.
- The biggest highlight for Explain Life was all the research that went into every single aspect of the application 🤩.
- One major highlight was getting the Firebase database and authentication working.
- Adding the ability to forget and reset your password 😅.
- The UI/UX design that was based on a persona for the specific autism spectrum disorder audience.
- Custom Validation on the sign in and register views.
- Another major highlight was integrating all the AI functionality, and combining them as well 👏.
- One major challenge was understanding how to integrate three different types of AI.
- A bug where the user grants access to the microphone, but at random times it does not record.
- Working with the JSON data that came from the IBM Tone Analysis service (see the sketch below).
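As an illustration of that JSON handling, here is a hedged SwiftyJSON sketch based on IBM's documented response shape (document_tone.tones); the function name is hypothetical and the raw body would come from an HTTP call (e.g. via Alamofire).

```swift
import Foundation
import SwiftyJSON

// Extract (tone name, score) pairs from a raw Tone Analyzer response body.
func tones(from data: Data) -> [(name: String, score: Double)] {
    guard let json = try? JSON(data: data) else { return [] }
    return json["document_tone"]["tones"].arrayValue.map { tone in
        (name: tone["tone_name"].stringValue, score: tone["score"].doubleValue)
    }
}
```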
The Reviews & Testing
was done through a video demonstration and a Google Form with questions related to the application.
Peer Reviews
were conducted by my fellow students and lecturer. I found the following feedback useful:
- "Yes, the use of light colours and white space makes the application seem very clear and non-distracting".
- "I think the design is made in such a way that its very easy to understand and take in upon entering the app. The largeness of the shapes just works really well with the whole "easy" part of the app".
- "Text-to-speech already works well, perhaps have a small animation play as the text is being read back.".
- "Maybe when the user is recording their speech, the "click to record" could change to "recording.." or something just to indicate even more clearly it is happening".
- "Love the use of emojis, will be relatable to the users. Maybe just use colours as well to display mood".
- "The application could remember your name and previous moods, and ask if you are feeling different from previously. It could also prompt you to explain why you are feeling a certain way".
- One simple but important piece of future functionality would be to publish the application on the iOS App Store.
- Adding the ability to change the voice accent to a different voice in the settings.
- Creating and integrating a web dashboard for parents to view data tracked within the Explain Life mobile application.
To see a run-through of the application, click below:
To see the promotional video, click below:
See the open issues for a list of proposed features (and known issues).
Contributions are what makes the open-source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
- Fork the Project
- Create your Feature Branch (
git checkout -b feature/AmazingFeature
) - Commit your Changes (
git commit -m 'Add some AmazingFeature'
) - Push to the Branch (
git push origin feature/AmazingFeature
) - Open a Pull Request
- Dylan da Silva - DylandaSilva
Distributed under the MIT License. See LICENSE
for more information.
- Dylan da Silva - @dylandasilva.designs - dylandasilva.b@gmail.com
- Project Link - https://github.com/dylandasilva1999/explain-life-ios-app