Gesture Recognition using Electromyographic, Spatial and Temporal Input Data

dc.contributor.advisor: Dougherty, John P.
dc.contributor.author: Gabriel, Dorvil
dc.date.accessioned: 2016-07-19T18:11:28Z
dc.date.available: 2016-07-19T18:11:28Z
dc.date.issued: 2016
dc.description.abstract: With the advancement of technology, electromyographic (EMG) and spatial data recognition is having a growing impact on accessible computing. I designed a customizable script that uses a combination of EMG, spatial, and temporal data, such that each user of the script can select a custom profile of gestures they want to use to type. The gestures chosen for each custom profile are determined by each user's level of ability/disability. Based on the custom profiles each user selects and the speed at which each user types, we determine whether using EMG, spatial, and temporal data can serve as a viable form of text input. While this research placed a strong emphasis on text input, it also supports the ideal of universal design in other contexts. The Myo armband was used to read and interact with the EMG, spatial, and temporal data. The interface of this script also revealed multiple techniques for scripting with the Myo that allow people with disabilities to use spatial data to type. (An illustrative sketch of such a gesture-to-character profile follows this record.)
dc.description.sponsorship: Haverford College. Department of Computer Science
dc.identifier.uri: http://hdl.handle.net/10066/18676
dc.language.iso: eng
dc.rights.access: Open Access
dc.rights.uri: http://creativecommons.org/licenses/by-nc/4.0/
dc.title: Gesture Recognition using Electromyographic, Spatial and Temporal Input Data
dc.type: Thesis
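
The abstract above describes per-user gesture profiles that map Myo poses to characters for text entry, with typing speed used to judge whether the method is viable. Below is a minimal, hypothetical Python sketch of that idea under stated assumptions: the names (Pose, GestureProfile, on_pose, chars_per_minute) are illustrative and come neither from the thesis script nor from the official Myo SDK.

# Hypothetical sketch: a per-user gesture profile mapping Myo-style poses to
# characters for text entry. All names here are illustrative assumptions,
# not the thesis script and not the official Myo SDK API.

from dataclasses import dataclass, field
from enum import Enum, auto
from time import monotonic
from typing import Dict, List


class Pose(Enum):
    """Poses comparable to those the Myo armband derives from EMG data."""
    FIST = auto()
    WAVE_IN = auto()
    WAVE_OUT = auto()
    FINGERS_SPREAD = auto()
    DOUBLE_TAP = auto()


@dataclass
class GestureProfile:
    """A user-selected mapping from poses to characters, chosen to match
    the user's level of ability/disability."""
    pose_to_char: Dict[Pose, str]
    typed: List[str] = field(default_factory=list)
    start_time: float = field(default_factory=monotonic)

    def on_pose(self, pose: Pose) -> None:
        """Append the character bound to this pose, if the profile maps it."""
        char = self.pose_to_char.get(pose)
        if char is not None:
            self.typed.append(char)

    def chars_per_minute(self) -> float:
        """Rough text-entry rate, the kind of speed measure used to judge
        whether gesture-based input is viable."""
        elapsed_min = (monotonic() - self.start_time) / 60.0
        return len(self.typed) / elapsed_min if elapsed_min > 0 else 0.0


if __name__ == "__main__":
    # Example profile for a user who can reliably make only three poses.
    profile = GestureProfile({Pose.FIST: "a", Pose.WAVE_IN: "b", Pose.WAVE_OUT: " "})
    for pose in [Pose.FIST, Pose.WAVE_IN, Pose.WAVE_OUT, Pose.FIST]:
        profile.on_pose(pose)
    print("".join(profile.typed), profile.chars_per_minute())

In an actual deployment the profile would be driven by the armband's pose/EMG events rather than the hard-coded pose sequence shown in the example.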
Files

Original bundle (2 of 2):
- Name: 2016GabrielD.pdf; Size: 2.35 MB; Format: Adobe Portable Document Format; Description: Thesis
- Name: 2016GabrielD_release.pdf; Size: 314.06 KB; Format: Adobe Portable Document Format; Description: Archive Staff only

License bundle (1 of 1):
- Name: license.txt; Size: 1.71 KB; Format: Item-specific license agreed upon to submission