Refined LSTM Network for Sensor-based Human Activity Recognition in Real World Scenario
Issued Date
2022-01-01
Resource Type
ISSN
23270586
eISSN
23270594
Scopus ID
2-s2.0-85141937048
Journal Title
Proceedings of the IEEE International Conference on Software Engineering and Service Sciences, ICSESS
Volume
2022-October
Start Page
256
End Page
259
Rights Holder(s)
SCOPUS
Bibliographic Citation
Proceedings of the IEEE International Conference on Software Engineering and Service Sciences, ICSESS Vol.2022-October (2022), 256-259
Suggested Citation
Mekruksavanich S., Jantawong P., Hnoohom N., Jitpattanakul A. Refined LSTM Network for Sensor-based Human Activity Recognition in Real World Scenario. Proceedings of the IEEE International Conference on Software Engineering and Service Sciences, ICSESS Vol.2022-October (2022), 256-259. doi:10.1109/ICSESS54813.2022.9930218 Retrieved from: https://repository.li.mahidol.ac.th/handle/20.500.14594/84335
Title
Refined LSTM Network for Sensor-based Human Activity Recognition in Real World Scenario
Author's Affiliation
Other Contributor(s)
Abstract
Sensor-based recognition of human activities is an essential field of study in ubiquitous computing. It aims to facilitate the assessment or understanding of current events and their context from sensor signals. Activity recognition is employed in surveillance systems, patient health monitoring, and many other systems involving interaction between humans and intelligent wearable devices, including smartphones and smartwatches. The primary objective of this study is to identify human behavior in real-world settings. We propose an improved long short-term memory network, called RLSTM, that uses a squeeze-and-excitation module to identify human actions efficiently and to enhance the interpretability of activity recognition systems. A publicly available real-world dataset, REALWORLD16, was used to train and validate the model five times to evaluate the proposed network. Across these experiments, the proposed RLSTM achieved a highest accuracy of 98.04% and an F1-score of 97.76%.
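The abstract does not detail the exact RLSTM architecture, but the squeeze-and-excitation idea it names is well defined: channel responses are "squeezed" by global pooling, passed through a small bottleneck network, and the resulting sigmoid gates rescale each channel. The sketch below shows that recalibration step applied to a stand-in for LSTM hidden states; all shapes, weight initializations, and variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def squeeze_excite(features, w1, w2):
    """Recalibrate per-channel features, SE-style.

    features: (time_steps, channels) array, e.g. LSTM hidden states.
    w1, w2:   bottleneck weights, (C, C//r) and (C//r, C).
    """
    # Squeeze: global average over the temporal axis -> (channels,)
    z = features.mean(axis=0)
    # Excitation: ReLU bottleneck followed by a sigmoid gate in (0, 1)
    s = sigmoid(np.maximum(z @ w1, 0.0) @ w2)
    # Recalibration: scale every channel by its learned importance
    return features * s

T, C, r = 128, 64, 4                      # time steps, channels, reduction ratio (assumed)
w1 = rng.normal(0.0, 0.1, (C, C // r))
w2 = rng.normal(0.0, 0.1, (C // r, C))
h = rng.normal(0.0, 1.0, (T, C))          # stand-in for LSTM outputs
out = squeeze_excite(h, w1, w2)
print(out.shape)                          # same shape as the input: (128, 64)
```

In a full model this block would sit between the LSTM layer and the classifier, letting the network emphasize the sensor channels most informative for each activity class.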