Deep Pyramidal Residual Network for Indoor-Outdoor Activity Recognition Based on Wearable Sensor
Issued Date
2023-01-01
Resource Type
ISSN
1079-8587
eISSN
2326-005X
Scopus ID
2-s2.0-85172658897
Journal Title
Intelligent Automation and Soft Computing
Volume
37
Issue
3
Start Page
2669
End Page
2686
Rights Holder(s)
SCOPUS
Bibliographic Citation
Intelligent Automation and Soft Computing Vol.37 No.3 (2023), 2669-2686
Suggested Citation
Mekruksavanich S., Hnoohom N., Jitpattanakul A. Deep Pyramidal Residual Network for Indoor-Outdoor Activity Recognition Based on Wearable Sensor. Intelligent Automation and Soft Computing Vol.37 No.3 (2023), 2669-2686. doi:10.32604/iasc.2023.038549 Retrieved from: https://repository.li.mahidol.ac.th/handle/123456789/90342
Author(s)
Mekruksavanich S., Hnoohom N., Jitpattanakul A.
Author's Affiliation
Other Contributor(s)
Abstract
Recognition of human activity is one of the most exciting aspects of time-series classification, with substantial practical and theoretical implications. Recent evidence indicates that activity recognition from wearable sensors is an effective technique for tracking elderly adults and children in indoor and outdoor environments. Consequently, researchers have shown considerable interest in developing cutting-edge deep learning systems that can exploit unprocessed sensor data from wearable devices and generate practical decision assistance in many contexts. This study presents a deep learning-based approach for recognizing indoor and outdoor movement using an enhanced deep pyramidal residual model called Sen-PyramidNet and motion information from wearable sensors (accelerometer and gyroscope). The proposed technique builds a residual unit on a deep pyramidal residual network and introduces the concept of a pyramidal residual unit to increase detection capability. The model was assessed on the publicly available 19Nonsens dataset, which gathered motion signals from various indoor and outdoor activities, including exercises involving various body parts. The experimental findings demonstrate that the proposed approach can efficiently reuse features, achieving identification accuracies of 96.37% for indoor and 97.25% for outdoor activities. Moreover, comparison experiments demonstrate that Sen-PyramidNet surpasses other cutting-edge deep learning models in terms of accuracy and F1-score. Furthermore, this study explores the influence of several wearable sensors on indoor and outdoor activity recognition ability.
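The abstract describes Sen-PyramidNet only at a high level: pyramidal residual units whose channel width grows gradually, applied to windows of accelerometer and gyroscope signals. The snippet below is a minimal sketch of such a pyramidal residual unit for 1-D sensor windows, written in PyTorch; the framework, class name, channel widths, kernel size, and widening step are illustrative assumptions and do not reproduce the paper's reported configuration.

# Minimal sketch (PyTorch, assumed framework) of a 1-D pyramidal residual unit.
# Widths and kernel size are illustrative, not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PyramidalResidualUnit1D(nn.Module):
    """BN-Conv-BN-ReLU-Conv-BN unit whose output is wider than its input;
    the identity shortcut is zero-padded to match the widened channel count."""
    def __init__(self, in_channels: int, out_channels: int, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        self.bn1 = nn.BatchNorm1d(in_channels)
        self.conv1 = nn.Conv1d(in_channels, out_channels, kernel_size, padding=pad, bias=False)
        self.bn2 = nn.BatchNorm1d(out_channels)
        self.conv2 = nn.Conv1d(out_channels, out_channels, kernel_size, padding=pad, bias=False)
        self.bn3 = nn.BatchNorm1d(out_channels)
        self.extra = out_channels - in_channels  # channels added by this unit

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.conv1(self.bn1(x))
        out = self.conv2(F.relu(self.bn2(out)))
        out = self.bn3(out)
        # Zero-pad the shortcut along the channel axis so it can be added
        # to the wider residual branch without extra parameters.
        shortcut = F.pad(x, (0, 0, 0, self.extra))
        return out + shortcut

if __name__ == "__main__":
    # Example: a window of 128 samples from a 6-axis sensor (3-axis accelerometer
    # + 3-axis gyroscope), widened from 6 to 16 channels by one unit.
    window = torch.randn(8, 6, 128)        # (batch, channels, time)
    unit = PyramidalResidualUnit1D(6, 16)
    print(unit(window).shape)              # torch.Size([8, 16, 128])

In a pyramidal design, each unit adds a small, fixed number of channels rather than doubling the width only at stage boundaries; zero-padding the identity shortcut along the channel axis keeps the skip connection parameter-free despite the widening.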
