Abstract:
This thesis explores the feasibility of a portable, non-invasive, and robust sign language recognition (SLR) system on a mobile device, capable of improving the quality of life of sign language users. A gesture recognition app was implemented for iOS in Objective-C and C++, using the OpenCV computer vision library. The app was compared against the Thalmic Labs Myo armband, which has previously been used successfully as an SLR platform. Under laboratory conditions, the mobile phone app achieved a 96.83% overall classification rate, while the Myo achieved 94.44%. Despite the app's high accuracy, severe limitations make the mobile platform unviable in its current state. However, its promising best-case results suggest that if those limitations can be resolved, the mobile platform could be as effective as the Myo, and therefore an effective SLR tool.