ResNet-based Network for Recognizing Daily and Transitional Activities based on Smartphone Sensors
Issued Date
2022-01-01
Resource Type
Scopus ID
2-s2.0-85141642401
Journal Title
2022 3rd International Conference on Big Data Analytics and Practices, IBDAP 2022
Start Page
27
End Page
30
Rights Holder(s)
SCOPUS
Bibliographic Citation
2022 3rd International Conference on Big Data Analytics and Practices, IBDAP 2022 (2022), 27-30
Suggested Citation
Mekruksavanich S., Jantawong P., Hnoohom N., Jitpattanakul A. ResNet-based Network for Recognizing Daily and Transitional Activities based on Smartphone Sensors. 2022 3rd International Conference on Big Data Analytics and Practices, IBDAP 2022 (2022), 27-30. doi:10.1109/IBDAP55587.2022.9907111 Retrieved from: https://repository.li.mahidol.ac.th/handle/20.500.14594/84346
Title
ResNet-based Network for Recognizing Daily and Transitional Activities based on Smartphone Sensors
Author's Affiliation
Other Contributor(s)
Abstract
In contemporary wearable computing contexts, sensor-based human activity recognition (HAR) has become a popular research topic. Investigators from the Health Applications Research Institute have presented promising findings that advance healthcare applications, including fall detection, athletic tracking and reporting, and monitoring schemes for elderly activity in intelligent homes. In these services, daily and transitional human actions are captured by smartphones' wearable sensors and analyzed as basic and complex motions. Deep learning techniques have demonstrated the usefulness and effectiveness of convolutional neural networks (CNNs) in extracting high-level features embedded in sensor data to build reliable recognition models. However, CNNs suffer from vanishing-gradient degradation as networks require deeper convolutional layers. To overcome this problem, we developed a ResNet-based deep residual network for recognizing daily and transitional activities. Using KU-HAR, a significant standard HAR benchmark dataset of smartphone sensor data covering various human actions, we performed experiments to identify the most appropriate ResNet-based model. Experimental findings indicate that ResNet-18 achieves the highest accuracy, at 93.54%, surpassing prior state-of-the-art models by 3.87% in accuracy.
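The key idea the abstract refers to, the residual (skip) connection that lets gradients bypass stacked convolutions and thus mitigates vanishing gradients in deep CNNs, can be illustrated with a minimal forward-pass sketch. This is not the authors' implementation; it is a generic toy residual block over a multi-channel sensor window, with hypothetical shapes (3 accelerometer channels, 8 time steps) chosen only for illustration.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def conv1d(x, w):
    # 'same'-padded 1-D convolution along the time axis.
    # x: (in_ch, time), w: (out_ch, in_ch, k) with odd kernel size k
    out_ch, in_ch, k = w.shape
    pad = k // 2
    xp = np.pad(x, ((0, 0), (pad, pad)))
    t = x.shape[1]
    y = np.zeros((out_ch, t))
    for o in range(out_ch):
        for i in range(in_ch):
            for j in range(t):
                y[o, j] += np.sum(xp[i, j:j + k] * w[o, i])
    return y

def residual_block(x, w1, w2):
    # y = ReLU(F(x) + x): the identity skip path carries the input
    # (and, in training, the gradient) around the convolutional stack F.
    out = relu(conv1d(x, w1))
    out = conv1d(out, w2)
    return relu(out + x)

# Toy sensor window: 3 channels (e.g. accelerometer x/y/z), 8 time steps.
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 8))
w1 = rng.standard_normal((3, 3, 3)) * 0.1
w2 = rng.standard_normal((3, 3, 3)) * 0.1
y = residual_block(x, w1, w2)
print(y.shape)  # the skip connection requires output shape == input shape
```

Note the design constraint the skip imposes: F(x) must preserve the input shape (hence the 'same' padding and matching channel counts); real ResNet variants such as ResNet-18 insert a projection on the skip path when shapes change between stages.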