Title: Multi-resolution CNN for Lower Limb Movement Recognition Based on Wearable Sensors
Authors: Hnoohom N., Chotivatunyu P., Mekruksavanich S., Jitpattanakul A.
Affiliation: Mahidol University
Type: Conference Paper
Published in: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Vol. 13651 LNAI (2022), pp. 111-119
ISSN: 0302-9743
eISSN: 1611-3349
DOI: 10.1007/978-3-031-20992-5_10
Scopus ID: 2-s2.0-85142674497
Indexed in: SCOPUS
Handle: https://repository.li.mahidol.ac.th/handle/20.500.14594/87134
Date available: 2023-06-20
Date issued: 2022-01-01
Subject: Computer Science

Abstract:
Human activity recognition (HAR) remains a difficult challenge in human-computer interaction (HCI). The Internet of Healthcare Things (IoHT) and related technologies are expected to be used primarily in conjunction with HAR to support healthcare and elder care. Within HAR research, lower limb movement recognition is a challenging topic that can be applied to the daily care of the elderly, frail, and disabled. Recent advances in deep learning have made high-level automatic feature extraction feasible, which is used to increase HAR efficiency, and deep learning approaches have been applied to sensor-based HAR in various domains. This study presents a novel method that uses convolutional neural networks (CNNs) with different kernel dimensions, referred to as multi-resolution CNNs, to detect high-level features at various resolutions. Recognition performance was evaluated on HARTH, a publicly available benchmark dataset containing acceleration data of the lower limb movements of 22 participants. The experimental results show that the proposed approach improves recognition performance, achieving an F1 score of 94.76%.
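
The core idea described in the abstract — applying convolutions with several kernel sizes to the same sensor window so that both fine and coarse temporal patterns are captured — can be sketched as follows. This is an illustrative NumPy toy, not the authors' implementation: the kernel sizes (3, 7, 15), the window length, and the random filter weights are all assumptions standing in for learned CNN filters.

```python
import numpy as np

def multi_resolution_features(signal, kernels):
    """Convolve one acceleration window with several 1-D kernels of
    different sizes and stack the resulting feature maps.

    Small kernels respond to fine-grained motion detail; larger kernels
    respond to slower trends. In the paper's multi-resolution CNN the
    kernels are learned; here they are random stand-ins.
    """
    features = []
    for k in kernels:
        # 'same' padding keeps every branch at the input length, so the
        # branch outputs can be stacked along a channel axis.
        features.append(np.convolve(signal, k, mode="same"))
    return np.stack(features, axis=0)  # shape: (n_branches, len(signal))

rng = np.random.default_rng(0)
accel = rng.standard_normal(128)                          # mock 1-axis window
branches = [rng.standard_normal(s) for s in (3, 7, 15)]   # three resolutions
fmap = multi_resolution_features(accel, branches)
print(fmap.shape)  # (3, 128)
```

In a full model, each branch would be a learned convolutional layer and the stacked maps would feed subsequent layers and a classifier; the stacking step is what lets features from all resolutions be combined.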