Proper pronunciation is, without a doubt, one of the fundamental factors in good communication. Speaking clear, correct English is crucial if listeners are to understand us and respond appropriately. Nowadays there are many resources and techniques for improving pronunciation, and mobile applications are one of them; the biggest advantage of mobile phones is their accessibility. This iOS app was built to help people who want to improve their English speaking and pronunciation skills. The pronunciation analysis is powered by Microsoft Azure's Cognitive Services Speech service. While learning new words and sentences, users can also see where their pronunciation falls short of the reference output, which lets them get the most out of the app. The application is written in Swift.
- Swift Programming Language
- XIB-based user interface (no Storyboards)
- TableView, CollectionView, SegmentedControl, Alerts
- Custom TabBarController, NavigationController
- Delegation and Protocols
- Expandable TableViewCell
- Swift Package Manager (SPM) and CocoaPods
- Pronunciation Assessment Service (Microsoft Azure Cognitive Speech; see the usage sketch after this list)
- UIKit
- Firebase Authentication
- Firebase Firestore
- Firebase Storage
- Kingfisher
- MBCircularProgressBar
- DSWaveformImage
- MicrosoftCognitiveServicesSpeech-iOS
- MVC
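
The snippet below is a minimal sketch of how the Azure Pronunciation Assessment flow can be wired up with the MicrosoftCognitiveServicesSpeech-iOS SDK. The subscription key, region, and the `assessPronunciation(of:)` function name are placeholders for illustration; this is not the app's actual implementation.

```swift
import MicrosoftCognitiveServicesSpeech

/// Records one utterance from the default microphone and prints the
/// pronunciation-assessment scores for the given reference sentence.
/// Replace the placeholder key and region with real Azure credentials.
func assessPronunciation(of referenceText: String) throws {
    // Configure the Speech service (placeholder credentials).
    let speechConfig = try SPXSpeechConfiguration(subscription: "YOUR_SPEECH_KEY",
                                                  region: "YOUR_SERVICE_REGION")
    speechConfig.speechRecognitionLanguage = "en-US"

    // Capture audio from the device's default microphone.
    let audioConfig = SPXAudioConfiguration()
    let recognizer = try SPXSpeechRecognizer(speechConfiguration: speechConfig,
                                             audioConfiguration: audioConfig)

    // Grade the recording against the reference text down to phoneme level,
    // on a 0-100 scale, flagging omitted or inserted words (miscues).
    let assessmentConfig = try SPXPronunciationAssessmentConfiguration(
        referenceText,
        gradingSystem: .hundredMark,
        granularity: .phoneme,
        enableMiscue: true)
    try assessmentConfig.apply(to: recognizer)

    // Recognize a single utterance and read back the assessment result.
    recognizer.recognizeOnceAsync { result in
        guard let assessment = SPXPronunciationAssessmentResult(result) else { return }
        print("Accuracy: \(assessment.accuracyScore)")
        print("Fluency: \(assessment.fluencyScore)")
        print("Completeness: \(assessment.completenessScore)")
        print("Overall pronunciation score: \(assessment.pronunciationScore)")
    }
}
```

In the app, scores like these can be shown alongside the reference sentence so that users can see exactly where their pronunciation diverges from the expected output.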