Publication: Sign Translation with Myo Armbands
Issued Date
2018-08-21
Other identifier(s)
2-s2.0-85053474025
Rights
Mahidol University
Rights Holder(s)
SCOPUS
Bibliographic Citation
ICSEC 2017 - 21st International Computer Science and Engineering Conference 2017, Proceeding. (2018), 148-152
Suggested Citation
Malika Vachirapipop, Safra Soymat, Wasurat Tiraronnakul, and Narit Hnoohom. Sign Translation with Myo Armbands. ICSEC 2017 - 21st International Computer Science and Engineering Conference 2017, Proceeding. (2018), 148-152. doi:10.1109/ICSEC.2017.8443836. Retrieved from: https://repository.li.mahidol.ac.th/handle/20.500.14594/45590
Abstract
© 2017 IEEE. Sign language is a common non-verbal communication method for people with impaired hearing. Despite the existence of sign language, communication barriers remain; implementing Myo armbands to predict gestures could reduce this gap. The Myo armband collects three-axis accelerometer, three-axis gyroscope, and three-axis magnetometer signals, and these data are used to construct the prediction model. Translation is performed by machine learning algorithms implemented in the prediction model. The algorithms tested for the best accuracy rate are Decision Tree, Sequential Minimal Optimization (SMO), and Multilayer Perceptron (MLP). In addition, Mean and Standard Deviation (SD) are evaluated for optimal feature selection. After testing, the prediction models that gave the best results used the MLP and SMO algorithms with Mean and SD as the optimal features; thus, either of them could be implemented.
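The pipeline the abstract describes — per-channel Mean and SD features from nine-axis armband signals fed to an MLP classifier — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic data, window size, class count, and network size are all assumptions, and scikit-learn's `MLPClassifier` stands in for whatever MLP toolkit the paper used.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical synthetic data standing in for Myo armband recordings:
# 200 gesture samples, each a window of 50 readings across 9 channels
# (3-axis accelerometer, 3-axis gyroscope, 3-axis magnetometer).
n_samples, window, channels = 200, 50, 9
signals = rng.normal(size=(n_samples, window, channels))
labels = rng.integers(0, 5, size=n_samples)  # 5 example sign classes

# Feature extraction as in the abstract: per-channel mean and standard
# deviation over the window, giving an 18-dimensional vector per sample.
features = np.concatenate(
    [signals.mean(axis=1), signals.std(axis=1)], axis=1
)
print(features.shape)  # (200, 18)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0
)

# Multilayer Perceptron, one of the classifiers the paper compares.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
predictions = clf.predict(X_test)
```

Swapping in the paper's other candidates is a one-line change: scikit-learn's `DecisionTreeClassifier` for Decision Tree, or `SVC` for an SMO-trained support vector machine.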