Title: Artificial neural networks for gesture classification with inertial motion sensing armbands
Authors: Ananta Srisuphab; Piyanuch Silapachote
Affiliation: Mahidol University
Source: IEEE Region 10 Annual International Conference, Proceedings/TENCON (2017), pp. 1-5
ISSN: 2159-3450; 2159-3442
Scopus ID: 2-s2.0-85015358528
DOI: 10.1109/TENCON.2016.7847946
Handle: https://repository.li.mahidol.ac.th/handle/20.500.14594/42408
Record dates: 2018-12-21; 2019-03-14; 2017-02-08
Subjects: Computer Science; Engineering
Type: Conference Paper
Indexed in: SCOPUS
Copyright: © 2016 IEEE.

Abstract: Applications of gesture classification and recognition are ubiquitous, from automatic interpretation of sign languages for hearing-impaired individuals to real-time communication with, command of, and control of machines in human-computer interaction. The desire for a maximally natural user experience and an interactive user interface in these systems is generally met by computationally expensive image processing techniques or time-based multi-stage action models. Wearable electronics embedded with advanced sensors are an emerging alternative, but their predefined gestural data are quite limited and inaccurate. Improving upon both approaches, we adopt a casually comfortable armband, utilize its raw nine-axis inertial motion signals, and apply feedforward neural networks trained with backpropagation. Discriminatory features were effectively discovered in the frequency domain using Daubechies wavelet transforms. Evaluated on hand signals for construction workers, we achieved over 88% accuracy.
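The pipeline described in the abstract — raw inertial signals decomposed by a Daubechies wavelet transform into frequency-domain features for a feedforward network — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses the Haar wavelet (db1, the simplest member of the Daubechies family) rather than whatever db order the paper employs, and the band-energy feature choice is an assumption.

```python
import math


def haar_dwt(signal):
    """One level of the Haar (db1) discrete wavelet transform.

    Returns (approximation, detail) coefficient lists. Haar is the
    simplest Daubechies wavelet; the paper likely uses a higher-order
    db filter, so this serves only as an illustrative stand-in.
    """
    s = 1.0 / math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) * s
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) * s
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail


def wavelet_energy_features(signal, levels=3):
    """Relative energy of each detail band plus the final approximation.

    A hypothetical feature vector: one number per frequency band,
    normalized so the features sum to 1. In a full system, one such
    vector per inertial axis (nine in total) would be concatenated
    to form the input layer of the feedforward network.
    """
    energies = []
    approx = list(signal)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        energies.append(sum(d * d for d in detail))
    energies.append(sum(a * a for a in approx))
    total = sum(energies) or 1.0
    return [e / total for e in energies]
```

For example, `wavelet_energy_features(accel_x_window, levels=3)` would map a window of accelerometer samples to four band-energy values; a constant (motionless) signal concentrates all energy in the final approximation band, while rapid gestures shift energy into the detail bands.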