Time Series Classification Using Deep Learning for HAR Based on Smart Wearable Sensors
Issued Date
2022-01-01
Resource Type
Scopus ID
2-s2.0-85149639761
Journal Title
ICSEC 2022 - International Computer Science and Engineering Conference 2022
Start Page
357
End Page
360
Rights Holder(s)
SCOPUS
Bibliographic Citation
ICSEC 2022 - International Computer Science and Engineering Conference 2022 (2022), 357-360
Suggested Citation
Jantawong P., Hnoohom N., Jitpattanakul A., Mekruksavanich S. Time Series Classification Using Deep Learning for HAR Based on Smart Wearable Sensors. ICSEC 2022 - International Computer Science and Engineering Conference 2022 (2022), 357-360. doi:10.1109/ICSEC56337.2022.10049357 Retrieved from: https://repository.li.mahidol.ac.th/handle/20.500.14594/84306
Title
Time Series Classification Using Deep Learning for HAR Based on Smart Wearable Sensors
Author's Affiliation
Other Contributor(s)
Abstract
In recent decades, time series classification (TSC) has emerged as one of the most challenging problems in data mining, and extensive studies have examined a variety of methods, including algorithm-based and learning-based techniques. Sensor-based human activity recognition (HAR) is a TSC problem that has become one of the most sought-after fields among industry and academic specialists owing to the proliferation of smartphone technology and wearable motion sensors. Conventional feature-extraction approaches pose a significant challenge in feature selection. Deep learning is an efficient strategy in the HAR field and has resolved the feature-selection issue. Nevertheless, several research obstacles remain, including classifier interpretation. This article integrates well-known deep learning methods, namely convolutional neural networks and RNN-based models. The new approach proved more effective than the existing state-of-the-art approach. We assessed our network on the multivariate time-series benchmark (UCI-HAR) and found that our model surpasses other models in terms of both training time and overall accuracy.
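The abstract describes combining a convolutional network with an RNN-based model for sensor-based HAR. A minimal sketch of such a hybrid is shown below in PyTorch; the layer sizes, kernel width, and class names are illustrative assumptions, not details taken from the paper, and the input shape mirrors the common UCI-HAR windowing (9 inertial channels, 128-sample windows, 6 activity classes).

```python
import torch
import torch.nn as nn

class ConvLSTMHAR(nn.Module):
    """Hypothetical CNN + RNN hybrid for HAR; hyperparameters are assumptions."""

    def __init__(self, n_channels=9, n_classes=6, hidden=64):
        super().__init__()
        # 1-D convolution extracts local temporal features from each window
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # LSTM models longer-range dependencies over the conv feature sequence
        self.lstm = nn.LSTM(input_size=32, hidden_size=hidden, batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):              # x: (batch, channels, time)
        f = self.conv(x)               # (batch, 32, time // 2)
        f = f.transpose(1, 2)          # (batch, time // 2, 32) for the LSTM
        _, (h, _) = self.lstm(f)       # h: (1, batch, hidden) final hidden state
        return self.fc(h[-1])          # (batch, n_classes) class logits

# Forward pass on random UCI-HAR-shaped data: 9 channels, 128-sample windows
model = ConvLSTMHAR()
logits = model(torch.randn(4, 9, 128))
print(tuple(logits.shape))  # (4, 6)
```

In this layout the convolution halves the temporal resolution before the LSTM, which keeps the recurrent sequence short and tends to speed up training, consistent with the abstract's emphasis on training time.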